WorldWideScience

Sample records for modeling software package

  1. dMODELS: A software package for modeling volcanic deformation

    Science.gov (United States)

    Battaglia, Maurizio

    2017-04-01

    dMODELS is a software package that includes the most common source models used to interpret deformation measurements near active volcanic centers. The emphasis is on estimating the parameters of analytical models of deformation by inverting data from the Global Positioning System (GPS), Interferometric Synthetic Aperture Radar (InSAR), tiltmeters and strainmeters. Source models include: (a) pressurized spherical, ellipsoidal and sill-like magma chambers in an elastic, homogeneous, flat half-space; (b) pressurized spherical magma chambers with topography corrections; and (c) solutions for a dislocation (fracture) in an elastic, homogeneous, flat half-space. All of the equations have been extended to include deformation and strain within the Earth's crust (as opposed to only at the Earth's surface) and verified against finite element models. Although actual volcanic sources are not embedded cavities of simple shape, we assume that these models can reproduce the stress field created by the actual magma intrusion or hydrothermal fluid injection. The dMODELS software employs a nonlinear inversion algorithm to determine the best-fit parameters for the deformation source by searching for the minimum of the cost function χ²ν (chi-square per degree of freedom). The nonlinear inversion algorithm combines local optimization (interior-point method) with random search; this approach is more efficient for hyper-parameter optimization than trials on a grid. The software has been developed in MATLAB, but compiled versions that can be run using the free MATLAB Compiler Runtime (MCR) module are available for Windows 64-bit operating systems. The MATLAB scripts and compiled files are open source and intended for teaching and research. The software package includes both functions for forward modeling and scripts for data inversion. A software demonstration will be available during the meeting. You are welcome to contact the author at mbattaglia@usgs.gov for
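    As a hedged illustration of the inversion strategy described above (random search over source parameters minimizing a reduced chi-square objective), the following Python sketch fits a point pressure (Mogi-type) source to synthetic data. The source constant, data values, noise level and search range are all invented for the example and are not taken from dMODELS:

```python
import random

def mogi_uz(strength, depth, r):
    # Vertical surface displacement of a point pressure source (Mogi model):
    # u_z = strength * depth / (r^2 + depth^2)^(3/2)
    return strength * depth / (r * r + depth * depth) ** 1.5

# Synthetic observations (true strength 2.0e6, true depth 3000 m, 1 mm noise)
random.seed(1)
radii = [500.0 * i for i in range(1, 21)]
sigma = 0.001
obs = [mogi_uz(2.0e6, 3000.0, r) + random.gauss(0.0, sigma) for r in radii]

def chi2_nu(strength, depth):
    # Reduced chi-square: weighted misfit per degree of freedom (2 parameters)
    s = sum(((o - mogi_uz(strength, depth, r)) / sigma) ** 2
            for o, r in zip(obs, radii))
    return s / (len(radii) - 2)

def fit(n_trials=2000):
    # Random search over depth; for each trial depth the (linear) strength
    # parameter has a closed-form least-squares solution.
    best = (float("inf"), 0.0, 0.0)
    for _ in range(n_trials):
        depth = random.uniform(500.0, 10000.0)
        g = [depth / (r * r + depth * depth) ** 1.5 for r in radii]
        strength = sum(o * gi for o, gi in zip(obs, g)) / sum(gi * gi for gi in g)
        c2 = chi2_nu(strength, depth)
        if c2 < best[0]:
            best = (c2, strength, depth)
    return best

chi2_best, strength_best, depth_best = fit()
```

    A real inversion would refine the best random-search candidate with a local optimizer (dMODELS uses an interior-point method) rather than rely on random search alone.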

  2. Chinshan living PRA model using NUPRA software package

    International Nuclear Information System (INIS)

    Cheng, S.-K.; Lin, T.-J.

    2004-01-01

    A living probabilistic risk assessment (PRA) model has been established for the Chinshan Nuclear Power Station (BWR-4, MARK-I) using the NUPRA software package. The core damage frequency due to internal events, seismic events and typhoons is evaluated in this model. The methodology and results, considering the recent implementation of the 5th emergency diesel generator and the automatic boron injection function, are presented. The dominant sequences of this PRA model are discussed, and some possible applications of this living model are proposed. (author)

  3. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    In this paper, preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested on a given configuration of the aircraft and airbrake and then compared with the test data. In the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  4. Creating a simulation model of software testing using Simulink package

    Directory of Open Access Journals (Sweden)

    V. M. Dubovoi

    2016-12-01

    The determination of a model of the software testing process that allows prediction of both the whole process and its specific stages is a pressing problem for the IT industry. The article focuses on solving this problem. The aim of the article is to predict the duration and improve the quality of software testing. Analysis of the software testing process shows that it can be classified as a branched cyclic technological process, because it is cyclical with decision-making on control operations. The investigation builds on the authors' previous work and a software testing process method based on a Markov model. The proposed method enables prediction for each software module, which leads to better decision-making for each controlled suboperation of the process. A Simulink simulation model shows the implementation and verification of the proposed technique. The results of the research have been practically implemented in the IT industry.
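    As an illustration of the kind of branched cyclic (test, fix, retest) process described above, the following Python sketch computes the expected duration of testing from an absorbing Markov model. The transition structure, pass probabilities and timings are invented for the example and do not come from the article:

```python
def expected_duration(p_pass, t_test, t_fix):
    # States: TEST -> DONE with probability p_pass, otherwise TEST -> FIX -> TEST.
    # The expected time E satisfies E = t_test + (1 - p_pass) * (t_fix + E),
    # which solves to:
    return (t_test + (1.0 - p_pass) * t_fix) / p_pass

def predict_project(modules):
    # Per-module prediction, summed over all modules under test
    return sum(expected_duration(p, t_test, t_fix)
               for p, t_test, t_fix in modules)

# Two hypothetical modules: (pass probability, test time, fix time) in hours
total = predict_project([(0.5, 2.0, 3.0), (0.8, 1.0, 4.0)])
```

    Predicting per module and summing, as above, mirrors the article's point that module-level prediction supports decision-making for each controlled suboperation.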

  5. Nested Cohort - R software package

    Science.gov (United States)

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.
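    The product-limit (Kaplan-Meier) estimator at the core of such survival analyses can be sketched in a few lines; the optional weights argument stands in for the kind of sampling weights a nested design would use. This is a generic Python sketch, not NestedCohort's actual implementation:

```python
def kaplan_meier(times, events, weights=None):
    # Kaplan-Meier: S(t) = prod over event times t_i <= t of (1 - d_i / n_i),
    # where d_i is the (weighted) number of events at t_i and n_i the
    # (weighted) number still at risk just before t_i.
    if weights is None:
        weights = [1.0] * len(times)
    data = sorted(zip(times, events, weights))
    at_risk = sum(weights)
    curve, s = [], 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d = n = 0.0
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                d += data[i][2]  # weighted events at this time
            n += data[i][2]      # weighted subjects leaving the risk set
            i += 1
        if d > 0:
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        at_risk -= n
    return curve

# One censored observation (event = 0) at t = 3
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```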

  6. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem of “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions with iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object-oriented, modular design.
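    The Kaczmarz method mentioned above is simple to state: each iteration projects the current estimate onto the hyperplane defined by one row of the linear system. A minimal Python sketch for a small dense system Ax = b follows; tomographic codes such as Ettention work with sparse projection operators and block updates, which this sketch omits:

```python
def kaczmarz(A, b, n_iter=200):
    # Cyclically project x onto the hyperplane of each row: a_i . x = b_i
    #   x <- x + ((b_i - a_i . x) / ||a_i||^2) * a_i
    x = [0.0] * len(A[0])
    for k in range(n_iter):
        i = k % len(A)
        a = A[i]
        resid = b[i] - sum(aj * xj for aj, xj in zip(a, x))
        scale = resid / sum(aj * aj for aj in a)
        x = [xj + scale * aj for aj, xj in zip(a, x)]
    return x

x = kaczmarz([[1.0, 0.0], [1.0, 1.0]], [1.0, 3.0])  # converges toward [1, 2]
```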

  7. Software package for modeling spin-orbit motion in storage rings

    Science.gov (United States)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of the obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  8. Recent developments on PLASMAKIN - a software package to model the kinetics in gas discharges

    International Nuclear Information System (INIS)

    Pinhao, N R

    2009-01-01

    PLASMAKIN is a user-friendly software package for handling the physical and chemical data used in plasma physics modeling and for computing the production and destruction terms in fluid-model equations. These terms account for the particle or energy production and loss rates due to gas-phase and gas-surface reactions. The package has been restructured and expanded to (a) allow the simulation of atomic emission spectra taking into account line broadening processes and radiation trapping; (b) include a library to compute the electron kinetics; (c) include a database of species properties and reactions; and (d) include a Python interface to allow access from scripts and integration with other scientific software tools.

  9. An open-source software package for multivariate modeling and clustering: applications to air quality management.

    Science.gov (United States)

    Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong

    2015-09-01

    This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster, specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA. By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.
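    The cutting step of such a stepwise cluster analysis can be illustrated with a drastically simplified, univariate stand-in: pick the threshold on a predictor that makes the two resulting subclusters most homogeneous. rSCA's actual criterion is a MANOVA-based test on multivariate samples; the function and data below are invented for illustration only:

```python
def best_split(x, y):
    # Choose the threshold on predictor x that minimizes the pooled squared
    # error of the dependent variable y in the two resulting subclusters.
    def sse(vals):
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)
    pairs = sorted(zip(x, y))
    best_score, best_thresh = float("inf"), None
    for k in range(1, len(pairs)):
        score = sse([p[1] for p in pairs[:k]]) + sse([p[1] for p in pairs[k:]])
        if score < best_score:
            best_score = score
            best_thresh = (pairs[k - 1][0] + pairs[k][0]) / 2.0
    return best_thresh

# Two clearly separated groups; the cut should fall between x = 3 and x = 10
threshold = best_split([1, 2, 3, 10, 11, 12], [0, 0, 0, 5, 5, 5])
```

    Recursing this cut on each subcluster, and merging subclusters that a significance test cannot distinguish, yields the kind of cluster tree the abstract describes.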

  10. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Dinter, T.; Rozanov, A.V.; Wolanin, A.; Bracher, A.; Burrows, J.P.

    2017-01-01

    SCIATRAN is a comprehensive software package designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean–atmosphere radiative transfer solver presented by Rozanov et al. we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available for scientists and students who are undertaking their own research, typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen (http://www.iup.physik.uni-bremen.de). - Highlights: • A new version of the software package SCIATRAN is presented. • Inelastic scattering in water and atmosphere is implemented in SCIATRAN. • Raman scattering and fluorescence can be included in radiative transfer calculations. • Comparisons to other radiative transfer models show excellent agreement. • Comparisons to observations show consistent results.

  11. A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.

    Science.gov (United States)

    Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.

    The SCIATRAN 3.0 package is the result of further development of the SCIATRAN 2.x software family which, like previous versions, comprises a radiative transfer model and a retrieval block. A major improvement over previous software versions was achieved by adding a vector mode to the radiative transfer model. Thus, the well-established discrete-ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. As in the scalar version, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. As with the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new vector radiative transfer model, will be given, including remarks on its availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high-quality results.

  12. The EQ3/6 software package for geochemical modeling: Current status

    International Nuclear Information System (INIS)

    Wolery, T.J.; Jackson, K.J.; Bourcier, W.L.; Bruton, C.J.; Viani, B.E.; Knauss, K.G.; Delany, J.M.

    1988-07-01

    EQ3/6 is a software package for modeling chemical and mineralogic interactions in aqueous geochemical systems. The major components of the package are EQ3NR (a speciation-solubility code), EQ6 (a reaction path code), EQLIB (a supporting library), and a supporting thermodynamic data base. EQ3NR calculates aqueous speciation and saturation indices from analytical data. It can also be used to calculate compositions of buffer solutions for use in laboratory experiments. EQ6 computes reaction path models of both equilibrium step processes and kinetic reaction processes. These models can be computed for closed systems and relatively simple open systems. EQ3/6 is useful for making purely theoretical calculations; for designing, interpreting, and extrapolating laboratory experiments; and for testing and developing submodels and supporting data used in these codes. The thermodynamic data base supports calculations over the range 0-300 °C. 60 refs., 2 figs
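    The saturation indices computed by a speciation-solubility code such as EQ3NR follow a simple definition, SI = log10(IAP/K). The Python sketch below applies it to calcite with invented activities; the log K value is the commonly cited 25 °C figure, used here only for illustration and not taken from the EQ3/6 data base:

```python
import math

def saturation_index(activities, stoich, log_k):
    # SI = log10(IAP) - log10(K), where the ion activity product IAP is
    # prod(a_i ^ nu_i) over the dissolution products. SI > 0 means the
    # solution is supersaturated with the mineral; SI < 0, undersaturated.
    log_iap = sum(nu * math.log10(activities[sp]) for sp, nu in stoich.items())
    return log_iap - log_k

# Calcite dissolution: CaCO3 = Ca++ + CO3--, log K ~ -8.48 at 25 C
si = saturation_index({"Ca++": 1.0e-3, "CO3--": 1.0e-5},
                      {"Ca++": 1, "CO3--": 1}, -8.48)
```

    A full speciation code first solves for the free-ion activities from analytical totals; here they are simply supplied.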

  13. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Science.gov (United States)

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available for scientists and students who are undertaking their own research, typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  14. Packaging of control system software

    International Nuclear Information System (INIS)

    Zagar, K.; Kobal, M.; Saje, N.; Zagar, A.; Sabjan, R.; Di Maio, F.; Stepanov, D.

    2012-01-01

    Control system software consists of several parts - the core of the control system, drivers for integration of devices, configuration for user interfaces, the alarm system, etc. Once the software is developed and configured, it must be installed on the computers where it runs. Usually, it is installed on an operating system whose services it needs, and in some cases it also dynamically links with the libraries the operating system provides. The operating system can be quite complex itself - for example, a typical Linux distribution consists of several thousand packages. To manage this complexity, we have decided to rely on the Red Hat Package Manager (RPM) to package control system software, and also to ensure it is properly installed (i.e., that dependencies are also installed, and that scripts are run after installation if any additional actions need to be performed). As dozens of RPM packages need to be prepared, we reduce the amount of effort and improve consistency between packages through a Maven-based infrastructure that assists in packaging (e.g., automated generation of RPM SPEC files, including automated identification of dependencies). So far, we have used it to package EPICS, Control System Studio (CSS) and several device drivers. We perform extensive testing on Red Hat Enterprise Linux 5.5, but we have also verified that packaging works on CentOS and Scientific Linux. In this article, we describe in greater detail the packaging system we are using, and its particular application to the ITER CODAC Core System. (authors)
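    A minimal example of the mechanism described above: an RPM SPEC file declares the package's dependencies and can run a post-install scriptlet, so installing a control-system package pulls in what it needs and performs any follow-up actions. The package name, version, dependency and scriptlet below are invented for illustration and are not part of the ITER CODAC packages (a buildable SPEC would also need %prep/%install/%files sections):

```
Name:           css-demo-driver
Version:        1.0.0
Release:        1%{?dist}
Summary:        Hypothetical device driver packaged for a control system
License:        GPL-2.0-or-later

# Dependencies are resolved automatically at install time
Requires:       epics-base

%description
Illustrative driver package; not a real ITER CODAC component.

%post
# Example post-install action, run after the files are installed
/sbin/ldconfig
```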

  15. A PC-based software package for modeling DOE mixed-waste management options

    International Nuclear Information System (INIS)

    Abashian, M.S.; Carney, C.; Schum, K.

    1995-02-01

    The U.S. Department of Energy (DOE) Headquarters and associated contractors have developed an IBM PC-based software package that estimates costs, schedules, and public and occupational health risks for a range of mixed-waste management options. A key application of the software package is the comparison of the waste-treatment options documented in the draft Site Treatment Plans prepared in accordance with the requirements of the Federal Facility Compliance Act of 1992. This automated Systems Analysis Methodology consists of a user interface for configuring complex-wide or site-specific waste-management options; calculational algorithms for cost, schedule and risk; and user-selected graphical or tabular output of results. The mixed-waste management activities modeled in the automated Systems Analysis Methodology include waste storage, characterization, handling, transportation, treatment, and disposal. Analyses of treatment options identified in the draft Site Treatment Plans suggest potential cost and schedule savings from consolidation of proposed treatment facilities. This paper presents an overview of the automated Systems Analysis Methodology.

  16. An Ada Linear-Algebra Software Package Modeled After HAL/S

    Science.gov (United States)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPACK for solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
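    For readers unfamiliar with the quaternion functions mentioned above, the core operation is the Hamilton product, which composes rotations. A generic sketch follows (in Python rather than Ada or HAL/S, and not taken from the package itself):

```python
def quat_mul(q, r):
    # Hamilton product of quaternions given as (w, x, y, z) tuples
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

# Unit-basis identity i * j = k
k = quat_mul((0, 1, 0, 0), (0, 0, 1, 0))
```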

  17. The CASA Software Package

    Science.gov (United States)

    Petry, Dirk

    2018-03-01

    CASA is the standard science data analysis package for ALMA and VLA but it can also be used for the analysis of data from other observatories. In this talk, I will give an overview of the structure and features of CASA, who develops it, and the present status and plans, and then show typical analysis workflows for ALMA data with special emphasis on the handling of single dish data and its combination with interferometric data.

  18. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Directory of Open Access Journals (Sweden)

    Jeffrey K Noel

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high-performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.

  19. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Science.gov (United States)

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
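    The defining idea above, that part or all of the potential is set by a known structure, can be sketched generically: native contacts are read off the reference structure, and each contact's native distance becomes the minimum of an attractive pair term. The cutoff, chain-neighbor exclusion and toy coordinates below are invented; this is not SMOG 2's actual contact algorithm:

```python
import math

def native_contacts(coords, cutoff):
    # Pairs of beads closer than `cutoff` in the reference structure
    # (skipping i, i+1, i+2 chain neighbors) become native contacts;
    # the native distance d0 would set the minimum of the pair potential.
    contacts = []
    for i in range(len(coords)):
        for j in range(i + 3, len(coords)):
            d0 = math.dist(coords[i], coords[j])
            if d0 < cutoff:
                contacts.append((i, j, d0))
    return contacts

# Toy 5-bead "fold" in the z = 0 plane
beads = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0),
         (0.0, 1.0, 0.0), (0.0, 0.5, 0.0)]
pairs = native_contacts(beads, 1.2)
```

    In a real SBM the contact list, exclusions and functional forms all come from the template files; the point here is only how a structure turns into energy terms.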

  20. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    International Nuclear Information System (INIS)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker, Charles L. III; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-01-01

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model, as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project no standard model had been developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  1. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker III, Charles L.; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-02-23

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model, as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project no standard model had been developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  2. PIV Data Validation Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities: (1) removal of spurious vector data; (2) filtering, smoothing, and interpolating of PIV data; and (3) calculation of out-of-plane vorticity, ensemble statistics, and turbulence statistics. The software runs on an IBM PC/AT host computer under either the Microsoft Windows 3.1 or Windows 95 operating system.

  3. Development of the CCP-200 mathematical model for Syzran CHPP using the Thermolib software package

    Science.gov (United States)

    Usov, S. V.; Kudinov, A. A.

    2016-04-01

    A simplified cycle diagram of the CCP-200 power generating unit of the Syzran CHPP, containing two PG6111FA gas turbines with generators, two steam-recovery boilers KUP-110/15-8.0/0.7-540/200, and one Siemens SST-600 steam turbine (one-cylinder, with two variable heat-extraction units of 60/75 MW in heat-extraction and condensing modes, respectively) with S-GEN5-100 generators, is presented. Results of experimental guarantee tests of the CCP-200 steam-gas unit are given. A brief description of the Thermolib application for the MATLAB Simulink software package is given, along with the basic equations Thermolib uses for modeling thermotechnical processes. Mathematical models of the gas-turbine plant, heat-recovery steam generator, steam turbine, and the integrated plant for the CCP-200 power generating unit of the Syzran CHPP were developed with MATLAB Simulink and Thermolib. Simulation at different ambient temperatures was used to obtain the characteristics of the developed model. Graphical comparisons of selected characteristics of the CCP-200 simulation model (gas temperature behind the gas turbine, gas turbine and combined cycle plant capacity, high- and low-pressure steam consumption, and feed water consumption for the high- and low-pressure economizers) with the actual characteristics of the steam-gas unit obtained in experimental (field) guarantee tests at different ambient temperatures are shown. It is shown that the characteristics of the CCP-200 simulation model developed with Thermolib adequately correspond to the actual characteristics of the steam-gas unit obtained in the experimental (field) guarantee tests; this allows the developed mathematical model to be considered adequate and acceptable for further work.

  4. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most applied statistics involves regression analysis of data. In practice, it is important to specify a regression model with minimal assumptions that are not violated by the data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone, menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed with the MATLAB Compiler. Currently, this package gives the user a choice of 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected
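    The MCMC sampling step that the package automates can be illustrated with the simplest possible case: a random-walk Metropolis sampler for the posterior of a normal mean. The model, data summary and tuning values below are invented for the example and have nothing to do with the package's 83 models:

```python
import math
import random

def metropolis(log_post, x0, n_steps=5000, scale=0.5, seed=0):
    # Random-walk Metropolis: propose x' ~ Normal(x, scale) and accept
    # with probability min(1, exp(log_post(x') - log_post(x))).
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, scale)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Posterior of a normal mean (known sigma = 1, flat prior): for n = 10
# observations with sample mean 2.0, log posterior ~ -n * (m - 2)^2 / 2.
samples = metropolis(lambda m: -5.0 * (m - 2.0) ** 2, 0.0)
post_mean = sum(samples[1000:]) / len(samples[1000:])  # discard burn-in
```

    Summaries of the retained draws (means, quantiles, convergence diagnostics) are exactly the kind of output the package reports after sampling completes.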

  5. The last developments of the airGR R-package, an open source software for rainfall-runoff modelling

    Science.gov (United States)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Perrin, Charles; Andréassian, Vazken

    2017-04-01

    and usability of this tool. References Coron L., Thirel G., Perrin C., Delaigue O., Andréassian V., airGR: a suite of lumped hydrological models in an R-package, Environmental Modelling and Software, 2017, submitted. Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en. R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.

  6. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    Science.gov (United States)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the ground rules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Ground rules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and the risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The ground rules and other criteria were used to screen

  7. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    Science.gov (United States)

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve the user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  8. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  9. Software Package "Nesvetay-3D" for modeling three-dimensional flows of monatomic rarefied gas

    Directory of Open Access Journals (Sweden)

    V. A. Titarev

    2014-01-01

    Analysis of three-dimensional rarefied gas flows in microdevices (micropipes, micropumps, etc.) and over re-entry vehicles requires the development of methods of computational modelling. One such method is the direct numerical solution of the Boltzmann kinetic equation for the velocity distribution function with either an exact or an approximate (model) collision integral. At present, for flows of monatomic rarefied gas the Shakhov model kinetic equation, also called the S-model, has gained wide-spread use. The equation can be regarded as a model equation of the incomplete third-order approximation. Despite its relative simplicity, the S-model is still a complicated integro-differential equation of high dimension. The numerical solution of such an equation requires high-accuracy parallel methods. The present work is a review of recent results concerning the development and application of the three-dimensional computer package Nesvetay-3D intended for modelling of rarefied gas flows. The package solves the Boltzmann kinetic equation with the BGK (Krook) and Shakhov model collision integrals using the discrete velocity approach. Calculations are carried out in non-dimensional variables. A finite integration domain and a mesh are introduced in the molecular velocity space. Next, the kinetic equation is re-written as a system of kinetic equations for each of the discrete velocities. The system is solved using an implicit finite-volume method of Godunov type. The steady-state solution is computed by a time-marching method. High order of spatial accuracy is achieved by using a piece-wise linear representation of the distribution function in each spatial cell. In general, the coefficients of such an approximation are found using the least-squares method. Arbitrary unstructured meshes in the physical space can be used in calculations, which allows considering flows over objects of general geometrical shape. Conservative property of the method with respect to the model collision
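    The discrete-velocity relaxation at the core of such kinetic solvers can be sketched in a few lines: a space-homogeneous BGK equation df/dt = (feq - f)/tau, advanced with the implicit Euler update f <- (f + (dt/tau)*feq)/(1 + dt/tau). This is an illustrative sketch, not Nesvetay-3D code; the grid and parameter values are arbitrary.

```python
import math

dv = 0.25
V = [-8.0 + dv * i for i in range(65)]        # discrete velocity grid on [-8, 8]

def maxwellian(rho, u, T, v):
    return rho / math.sqrt(2 * math.pi * T) * math.exp(-(v - u) ** 2 / (2 * T))

# non-equilibrium start: two counter-streaming Maxwellian beams
f = [maxwellian(0.5, -1.0, 0.5, v) + maxwellian(0.5, 1.0, 0.5, v) for v in V]

tau, dt = 0.1, 0.01
for _ in range(500):
    # moments of f (density, velocity, temperature) define the local equilibrium
    rho = sum(f) * dv
    u = sum(fi * v for fi, v in zip(f, V)) * dv / rho
    T = sum(fi * (v - u) ** 2 for fi, v in zip(f, V)) * dv / rho
    feq = [maxwellian(rho, u, T, v) for v in V]
    # implicit Euler step of df/dt = (feq - f) / tau
    f = [(fi + (dt / tau) * fe) / (1 + dt / tau) for fi, fe in zip(f, feq)]

residual = max(abs(fi - fe) for fi, fe in zip(f, feq))
```

    After many relaxation times the distribution collapses onto the local Maxwellian while the density is conserved by the quadrature, which is the conservative property the abstract refers to.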

  10. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed, and presents the results obtained. Only direct results are given, without any recommendation for a particular software package or method for gamma ray spectra analysis

  11. Intercomparison of PIXE spectrometry software packages

    International Nuclear Information System (INIS)

    2003-02-01

    During the year 2000, an exercise was organized to make an intercomparison of widely available software packages for the analysis of particle induced X ray emission (PIXE) spectra. This TECDOC describes the method used in this intercomparison exercise and presents the results obtained. It also gives a general overview of the participating software packages. This includes basic information on their user interface, graphical presentation capabilities, physical phenomena taken into account, way of presenting results, etc. No recommendation for a particular software package or method for spectrum analysis is given. It is intended that the readers reach their own conclusions and make their own choices, according to their specific needs. This TECDOC will be useful to anyone involved in PIXE spectrum analysis. This TECDOC includes a companion CD with the complete set of test spectra used for the intercomparison. The test spectra on this CD can be used to test any PIXE spectral analysis software package

  12. Kwant: a software package for quantum transport

    International Nuclear Information System (INIS)

    Groth, Christoph W; Waintal, Xavier; Wimmer, Michael; Akhmerov, Anton R

    2014-01-01

    Kwant is a Python package for numerical quantum transport calculations. It aims to be a user-friendly, universal, and high-performance toolbox for the simulation of physical systems of any dimensionality and geometry that can be described by a tight-binding model. Kwant has been designed such that the natural concepts of the theory of quantum transport (lattices, symmetries, electrodes, orbital/spin/electron-hole degrees of freedom) are exposed in a simple and transparent way. Defining a new simulation setup is very similar to describing the corresponding mathematical model. Kwant offers direct support for calculations of transport properties (conductance, noise, scattering matrix), dispersion relations, modes, wave functions, various Green's functions, and out-of-equilibrium local quantities. Other computations involving tight-binding Hamiltonians can be implemented easily thanks to its extensible and modular nature. Kwant is free software available at http://kwant-project.org/. (paper)
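    The tight-binding models that Kwant operates on are easy to write down directly. The following is a stdlib sketch (deliberately not Kwant's API) that builds a nearest-neighbour chain Hamiltonian and verifies the well-known open-chain eigenpair E = 2t*cos(k) with k = m*pi/(N+1):

```python
import math

# nearest-neighbour tight-binding chain: onsite energy 0, hopping t
N, t = 10, -1.0
H = [[0.0] * N for _ in range(N)]
for i in range(N - 1):
    H[i][i + 1] = H[i + 1][i] = t

# analytic eigenpair of the open chain: v_n = sin(k*(n+1)), E = 2*t*cos(k)
m = 3
k = m * math.pi / (N + 1)
v = [math.sin(k * (n + 1)) for n in range(N)]
E = 2 * t * math.cos(k)

# check H v = E v by direct matrix-vector multiplication
Hv = [sum(H[i][j] * v[j] for j in range(N)) for i in range(N)]
err = max(abs(hv - E * vi) for hv, vi in zip(Hv, v))
```

    Kwant's contribution is to let such Hamiltonians be assembled from lattices and symmetries for arbitrary dimensionality and geometry, and to attach leads for transport calculations, rather than by hand as above.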

  13. Software package as an information center product

    International Nuclear Information System (INIS)

    Butler, M.K.

    1977-01-01

    The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, to document the Center's efforts to meet these needs and the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables

  14. RPC Stereo Processor (rsp) - a Software Package for Digital Surface Model and Orthophoto Generation from Satellite Stereo Imagery

    Science.gov (United States)

    Qin, R.

    2016-06-01

    Large-scale Digital Surface Models (DSMs) are very useful for many geoscience and urban applications. Recently developed dense image matching methods have popularized the use of image-based very high resolution DSMs. Many commercial/public tools that implement matching methods are available for perspective images, but handy tools for satellite stereo images are rare. In this paper, a software package, the RPC (rational polynomial coefficient) stereo processor (RSP), is introduced for this purpose. RSP implements a full pipeline of DSM and orthophoto generation based on RPC-modelled satellite imagery (level 1+), including level 2 rectification, geo-referencing, point cloud generation, pan-sharpening, DSM resampling and ortho-rectification. A modified hierarchical semi-global matching method is used as the current matching strategy. Due to its high memory efficiency and optimized implementation, RSP can be used on a normal PC to produce large-format DSMs and orthophotos. This tool was developed for internal use, and may be acquired by researchers for academic and non-commercial purposes to promote 3D remote sensing applications.
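    The core of any dense matcher is a per-pixel correspondence search between the two images. A toy one-dimensional sum-of-squared-differences search illustrates the idea (illustrative only; RSP's modified hierarchical semi-global matching additionally regularizes disparities along multiple scan paths):

```python
# toy 1-D scanlines: the "right" line sees the scene shifted by 3 pixels
left = [0, 1, 4, 9, 5, 2, 7, 8, 3, 6, 1, 0, 2, 5, 9]
shift = 3
right = left[shift:] + [0] * shift

def ssd(a, b):
    """Sum of squared differences between two equal-length windows."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# brute-force disparity search over candidate shifts 0..5
best_d = min(range(6), key=lambda d: ssd(left[d:], right[:len(left) - d]))
```

    The recovered disparity equals the true shift; converting such per-pixel disparities to heights via the sensor model (here, the RPCs) is what turns a stereo pair into a DSM.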

  15. SPADE - software package to aid diffraction experiments

    International Nuclear Information System (INIS)

    Farren, J.; Giltrap, J.W.

    1978-10-01

    A software package is described which enables the DEC PDP-11/03 microcomputer to execute several different X-ray diffraction experiments and other similar experiments where stepper motors are driven and data is gathered and processed in real time. (author)

  16. Consys Linear Control System Design Software Package

    International Nuclear Information System (INIS)

    Diamantidis, Z.

    1987-01-01

    This package was created to help engineers, researchers, students and all who work on linear control systems. The software includes all time and frequency domain analyses, spectral analyses, and design aids for networks, active filters and regulators. The programmes are written in Basic 4.0 on a Hewlett-Packard computer

  17. Tracked Vehicle Dynamics Modeling and Simulation Methodology, with Control, using RecurDyn Software Package

    Science.gov (United States)

    2011-09-01

    the suspension, and finally the chassis. This allows for the lower-level elements to be simulated as they are complete, which greatly eases model...

  18. Software package r3t. Model for transport and retention in porous media. Final report

    International Nuclear Information System (INIS)

    Fein, E.

    2004-01-01

    In long-term safety analyses for final repositories for hazardous wastes in deep geological formations, the impact on the biosphere due to the potential release of hazardous materials is assessed for relevant scenarios. The model for the migration of wastes from repositories to humans is divided into three almost independent parts: the near field, the geosphere, and the biosphere. With the development of r3t, the feasibility of modelling pollutant transport through the geosphere for porous or equivalent porous media in large, three-dimensional, and complex regions is established. Furthermore, it is now possible to consider all relevant retention and interaction effects that are important for long-term safety analyses. These are equilibrium sorption, kinetically controlled sorption, diffusion into immobile pore waters, and precipitation. The processes of complexing, colloidal transport and matrix diffusion may be considered at least approximately by a skilful choice of parameters. Speciation is not part of the very recently developed computer code r3t. With r3t it is possible to assess the potential dilution and the barrier impact of the overburden close to reality
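    The equilibrium-sorption retention mentioned above has a compact standard form: a linear isotherm slows advection by the retardation factor R = 1 + rho_b*Kd/theta, so a sorbing species travels at v/R. A minimal upwind sketch (illustrative only, not r3t code; all parameter values are hypothetical):

```python
rho_b, Kd, theta = 1.6, 0.5, 0.4      # hypothetical bulk density, Kd, porosity
R = 1 + rho_b * Kd / theta            # retardation factor = 3.0
v, dx, dt, n = 1.0, 0.1, 0.02, 200
lam = (v / R) * dt / dx               # retarded Courant number

c = [0.0] * n
c[10] = 1.0                           # initial concentration pulse
for _ in range(1500):                 # advect to t = 30 with first-order upwind
    c = [c[0] * (1 - lam)] + [c[i] - lam * (c[i] - c[i - 1]) for i in range(1, n)]

centroid = sum(i * ci for i, ci in enumerate(c)) / sum(c)
```

    The pulse centroid travels at the retarded velocity, ending near cell 10 + v*t/(R*dx) = 110; an unretarded tracer would already have left this domain.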

  19. pyLIMA: An Open-source Package for Microlensing Modeling. I. Presentation of the Software and Analysis of Single-lens Models

    Science.gov (United States)

    Bachelet, E.; Norbury, M.; Bozza, V.; Street, R.

    2017-11-01

    Microlensing is a unique tool, capable of detecting the “cold” planets between ˜1 and 10 au from their host stars and even unbound “free-floating” planets. This regime has been poorly sampled to date owing to the limitations of alternative planet-finding methods, but a watershed in discoveries is anticipated in the near future thanks to the planned microlensing surveys of WFIRST-AFTA and Euclid's Extended Mission. Of the many challenges inherent in these missions, the modeling of microlensing events will be of primary importance, yet it is often time-consuming, complex, and perceived as a daunting barrier to participation in the field. The large scale of future survey data products will require thorough but efficient modeling software, but, unlike other areas of exoplanet research, microlensing currently lacks a publicly available, well-documented package to conduct this type of analysis. We present version 1.0 of the python Lightcurve Identification and Microlensing Analysis (pyLIMA). This software is written in Python and uses existing packages as much as possible to make it widely accessible. In this paper, we describe the overall architecture of the software and the core modules for modeling single-lens events. To verify the performance of this software, we use it to model both real data sets from events published in the literature and generated test data produced using pyLIMA's simulation module. The results demonstrate that pyLIMA is an efficient tool for microlensing modeling. We will expand pyLIMA to consider more complex phenomena in the following papers.
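    The single-lens events the paper analyses follow the standard Paczynski (1986) form: magnification A(u) = (u^2 + 2)/(u*sqrt(u^2 + 4)) with impact parameter u(t) = sqrt(u0^2 + ((t - t0)/tE)^2). A stdlib sketch of this light-curve model (not pyLIMA's API; parameter values are arbitrary):

```python
import math

def magnification(u):
    """Point-source point-lens magnification, A(u) = (u^2 + 2)/(u*sqrt(u^2 + 4))."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

def single_lens(t, t0, u0, tE):
    """Magnification at time t for closest approach u0 and Einstein time tE."""
    u = math.sqrt(u0 ** 2 + ((t - t0) / tE) ** 2)
    return magnification(u)

t0, u0, tE = 50.0, 0.1, 20.0
peak = single_lens(t0, t0, u0, tE)               # maximum at closest approach
baseline = single_lens(t0 + 10 * tE, t0, u0, tE)  # far wings return to 1
```

    For small u0 the peak magnification approaches 1/u0 (about 10 here), and the curve is symmetric about t0; fitting (t0, u0, tE) to photometry is the single-lens modeling task pyLIMA automates.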

  20. Software Package for Optics Measurement and Correction in the LHC

    CERN Document Server

    Aiba, M; Tomas, R; Vanbavinckhove, G

    2010-01-01

    A software package has been developed for the LHC on-line optics measurement and correction. This package includes several different algorithms to measure phase advance, beta functions, dispersion, coupling parameters and even some non-linear terms. A Graphical User Interface provides visualization tools to compare measurements to model predictions, fit analytical formula, localize error sources and compute and send corrections to the hardware.

  1. Human-machine interface software package

    International Nuclear Information System (INIS)

    Liu, D.K.; Zhang, C.Z.

    1992-01-01

    The Man-Machine Interface Software Package (MMISP) is designed to configure the console software of the PLS 60 MeV LINAC control system. The control system of the PLS 60 MeV LINAC is a distributed control system which includes the main computer (Intel 310), four local stations, and two sets of industrial-level console computers. The MMISP provides the operator with a display page editor, various I/O configurations such as digital signal in/out, analog signal in/out, and waveform TV graphic display, and interacts with the operator through graphic picture display, voice explanation, and a touch panel. This paper describes its function and application. (author)

  2. Using a general purpose spreadsheet software package to estimate ...

    African Journals Online (AJOL)

    The objective of this analysis was to evaluate the accuracy of a standard spreadsheet software package to estimate best-fit parameters for an exponential-plus-constant model (y = a + b·e^(cx)) applied to blood lactate concentration versus work rate data. During an incremental cycle test, blood lactate concentrations were ...
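    For a model of the form y = a + b·e^(cx), the parameters (a, b) enter linearly once c is fixed, so a simple fit can pair linear least squares with a one-dimensional search over c. A sketch of this general approach on noise-free synthetic data (hypothetical values; this is not the spreadsheet's solver):

```python
import math

def fit_exp_plus_const(xs, ys, c_grid):
    """Fit y = a + b*exp(c*x): solve (a, b) by least squares for each candidate c."""
    def solve(c):
        n = len(xs)
        e = [math.exp(c * x) for x in xs]
        se, sy = sum(e), sum(ys)
        see = sum(v * v for v in e)
        sey = sum(v * y for v, y in zip(e, ys))
        det = n * see - se * se
        b = (n * sey - se * sy) / det          # normal equations for a, b
        a = (sy - b * se) / n
        sse = sum((y - a - b * ev) ** 2 for y, ev in zip(ys, e))
        return sse, a, b, c
    return min(solve(c) for c in c_grid)        # smallest residual wins

# synthetic "lactate vs. work rate" data from known parameters a=1, b=0.1, c=0.02
xs = list(range(0, 301, 25))
ys = [1.0 + 0.1 * math.exp(0.02 * x) for x in xs]
sse, a, b, c = fit_exp_plus_const(xs, ys, [i / 10000 for i in range(1, 501)])
```

    On clean data the grid search recovers the generating parameters; with real, noisy lactate data the quality of the recovered c is exactly what the paper's accuracy evaluation probes.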

  3. Current status and future direction of the MONK software package

    International Nuclear Information System (INIS)

    Smith, Nigel; Armishaw, Malcolm; Cooper, Andrew

    2003-01-01

    The current status of the MONK criticality software package is summarized in terms of recent and current developments and envisaged directions for the future. The areas of discussion are physics modeling, geometry modeling, source modeling, nuclear data, validation, supporting tools and customer services. In the future development plan, MONK continues to focus on meeting the short- and long-term needs of the code user community. (J.P.N.)

  4. Information technologies and software packages for education of specialists in materials science [In Russian

    NARCIS (Netherlands)

    Krzhizhanovskaya, V.; Ryaboshuk, S.

    2009-01-01

    This paper presents methodological materials, interactive textbooks and software packages developed and extensively used for the education of specialists in materials science. These virtual laboratories for education and research are equipped with tutorials and a software environment for modeling complex

  5. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  6. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  7. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Energy Technology Data Exchange (ETDEWEB)

    Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Kelsey, Michael [SLAC; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Wright, Dennis H. [SLAC; Yarba, Julia [Fermilab

    2017-02-17

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  8. An Assessment of the Library Application Software Packages in ...

    African Journals Online (AJOL)

    An Assessment of the Library Application Software Packages in Selected Academic Libraries in Nigeria. ... The study found that most application packages available in the Nigerian automation marketplace are effective, since they are usually packages that have been fully tested, thoroughly debugged and well supported by ...

  9. Adoption of open source digital library software packages: a survey

    OpenAIRE

    Jose, Sanjo

    2007-01-01

    Open source digital library packages are gaining popularity nowadays. To build a digital library under economical conditions, open source software is preferable. This paper tries to identify the extent of adoption of open source digital library software packages in various organizations through an online survey. It presents the findings from the survey.

  10. A user's guide to the GoldSim/BLT-MS integrated software package: a low-level radioactive waste disposal performance assessment model.

    Energy Technology Data Exchange (ETDEWEB)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-03-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years experience in the assessment of radioactive waste disposal and at the time of this publication is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software, which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference users guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low

  11. A user's guide to the GoldSim/BLT-MS integrated software package: a low-level radioactive waste disposal performance assessment model

    International Nuclear Information System (INIS)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-01-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years experience in the assessment of radioactive waste disposal and at the time of this publication is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software, which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference users guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low

  12. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    Science.gov (United States)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  13. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.

    2011-01-01

    Enhancing, modifying or adapting software to new requirements increases its internal complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort. However, software refactoring becomes a quite challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach provides computer-aided support for identifying ill-structured packages and offers suggestions to help the software designer balance intra-package cohesion against inter-package coupling. A comparative study is conducted applying three different clustering techniques to different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared the A-KNN technique with the other clustering techniques (viz. the single linkage algorithm, the complete linkage algorithm and the weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.
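    The idea can be illustrated with a toy single-linkage clustering of classes by dependency similarity (the class names and dependency sets here are hypothetical; the paper's A-KNN algorithm and its cohesion/coupling metrics are more elaborate):

```python
# toy class-dependency profiles: classes that depend on similar things
# plausibly belong in the same package
deps = {
    "Parser":   {"Token", "Lexer", "Ast"},
    "Lexer":    {"Token", "Ast"},
    "Renderer": {"Canvas", "Color"},
    "Painter":  {"Canvas", "Color", "Brush"},
}

def jaccard_dist(a, b):
    """Dissimilarity of two dependency sets: 1 - |intersection| / |union|."""
    return 1 - len(a & b) / len(a | b)

# single-linkage agglomerative clustering down to two candidate packages
clusters = [{name} for name in deps]
while len(clusters) > 2:
    i, j = min(
        ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
        key=lambda ij: min(
            jaccard_dist(deps[x], deps[y])
            for x in clusters[ij[0]]
            for y in clusters[ij[1]]
        ),
    )
    clusters[i] |= clusters.pop(j)          # merge the closest pair

groups = sorted(sorted(c) for c in clusters)
```

    The parsing classes and the drawing classes end up in separate clusters, which is the kind of regrouping suggestion a package-level refactoring assistant would surface.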

  14. International Inventory of Software Packages in the Information Field.

    Science.gov (United States)

    Keren, Carl, Ed.; Sered, Irina, Ed.

    Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…

  15. Software package for analysis of completely randomized block design

    African Journals Online (AJOL)

This study designs and develops a statistical software package, OYSP1.0, which conveniently accommodates and analyzes the large volumes of data emanating from experimental designs, in particular the Completely Randomized Block design. Visual Basic programming is used in the design. The statistical package OYSP 1.0 ...

  16. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Directory of Open Access Journals (Sweden)

    Marković Nemanja

    2014-01-01

Full Text Available Reinforced concrete (RC) is characterized by considerable inhomogeneity, resulting from the material characteristics of the concrete, and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is given of methods such as Concrete Damage Plasticity (CDP), Smeared Concrete Cracking (CSC), Cap Plasticity (CP) and the Drucker-Prager model (DPM). We performed a nonlinear analysis of a two-storey reinforced concrete frame, applying the CDP method to model the material nonlinearity of the concrete, and analyzed damage zones, crack propagation and the load-deflection ratio.

  17. Interactive Visualization of Assessment Data: The Software Package Mondrian

    Science.gov (United States)

    Unlu, Ali; Sargin, Anatol

    2009-01-01

    Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…

  18. The experiential modification of a computer software package for ...

    African Journals Online (AJOL)

    Erna Kinsey

    The experiential modification of a computer software package for graphing algebraic functions. J.G. Maree, S. Scholtz,* H.J. Botha and S. Van Putten. School of Teacher Training, Faculty of Education, University of Pretoria, Pretoria, 0002 South Africa. * To whom correspondence should be addressed. Graphing software and ...

  19. Library Automation Software Packages used in Academic Libraries of Nepal

    OpenAIRE

    Sharma (Baral), Sabitri

    2007-01-01

    This thesis presents a comparative assessment of the library automation software packages used in Nepalese academic libraries. It focuses on the evaluation of software on the basis of certain important checkpoints. It also highlights the importance of library automation, library activities and services.

  20. Radiative transfer through terrestrial atmosphere and ocean: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Rozanov, A.V.; Kokhanovsky, A.A.; Burrows, J.P.

    2014-01-01

    SCIATRAN is a comprehensive software package for the modeling of radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40μm) including multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The software is capable of modeling spectral and angular distributions of the intensity or the Stokes vector of the transmitted, scattered, reflected, and emitted radiation assuming either a plane-parallel or a spherical atmosphere. Simulations are done either in the scalar or in the vector mode (i.e. accounting for the polarization) for observations by space-, air-, ship- and balloon-borne, ground-based, and underwater instruments in various viewing geometries (nadir, off-nadir, limb, occultation, zenith-sky, off-axis). All significant radiative transfer processes are accounted for. These are, e.g. the Rayleigh scattering, scattering by aerosol and cloud particles, absorption by gaseous components, and bidirectional reflection by an underlying surface including Fresnel reflection from a flat or roughened ocean surface. The software package contains several radiative transfer solvers including finite difference and discrete-ordinate techniques, an extensive database, and a specific module for solving inverse problems. In contrast to many other radiative transfer codes, SCIATRAN incorporates an efficient approach to calculate the so-called Jacobians, i.e. derivatives of the intensity with respect to various atmospheric and surface parameters. In this paper we discuss numerical methods used in SCIATRAN to solve the scalar and vector radiative transfer equation, describe databases of atmospheric, oceanic, and surface parameters incorporated in SCIATRAN, and demonstrate how to solve some selected radiative transfer problems using the SCIATRAN package. During the last decades, a lot of studies have been published demonstrating that SCIATRAN is a valuable

  1. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems: (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron i860 array processor subsystem. The software runs on an IBM PC/AT host computer under either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second and is completely automated, with the exception of user input to a configuration file, prior to analysis execution, to update various system parameters.
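The core computation the system automates, the 2-D spatial autocorrelation of a film subregion, can be sketched with an FFT via the Wiener-Khinchin theorem: the particle displacement in a double-exposed subregion appears as the tallest off-center correlation peak. The synthetic 32x32 subregion below is an illustrative assumption, not data from the system.

```python
# Sketch of 2-D spatial autocorrelation for PIV displacement recovery.
# The tiny synthetic "double exposure" image is an illustrative assumption.
import numpy as np

def autocorrelation_2d(sub):
    """2-D autocorrelation of an image subregion, zero lag at the center."""
    sub = sub - sub.mean()                    # remove the DC background
    spec = np.fft.fft2(sub)
    ac = np.fft.ifft2(spec * np.conj(spec)).real
    return np.fft.fftshift(ac)                # put zero lag in the middle

# Double-exposed particle pair displaced by (dy, dx) = (3, 5).
img = np.zeros((32, 32))
img[10, 10] = img[13, 15] = 1.0

ac = autocorrelation_2d(img)
center = np.array(ac.shape) // 2
ac[tuple(center)] = 0                         # suppress the zero-lag peak
peak = np.unravel_index(np.argmax(ac), ac.shape)
displacement = (peak[0] - center[0], peak[1] - center[1])   # (+/-3, +/-5)
```

The autocorrelation is symmetric, so the displacement is recovered up to sign; resolving the sign ambiguity is one reason practical PIV processing often uses cross-correlation of separate exposures instead.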

  2. GPS Software Packages Deliver Positioning Solutions

    Science.gov (United States)

    2010-01-01

To determine a spacecraft's position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (Global Positioning System)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, GIPSY and RTG have since been licensed by JPL hundreds of times, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology.

  3. Software package r{sup 3}t. Model for transport and retention in porous media. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Fein, E. (ed.)

    2004-07-01

In long-term safety analyses for final repositories for hazardous wastes in deep geological formations, the impact on the biosphere of a potential release of hazardous materials is assessed for relevant scenarios. The model for the migration of wastes from repositories to humans is divided into three almost independent parts: the near field, the geosphere, and the biosphere. With the development of r{sup 3}t, it is now feasible to model pollutant transport through the geosphere for porous or equivalent porous media in large, three-dimensional, complex regions. Furthermore, all relevant retention and interaction effects important for long-term safety analyses can now be considered: equilibrium sorption, kinetically controlled sorption, diffusion into immobile pore waters, and precipitation. The processes of complexation, colloidal transport and matrix diffusion may be considered at least approximately by a skilful choice of parameters. Speciation is not part of the newly developed computer code r{sup 3}t. With r{sup 3}t it is possible to assess, close to reality, the potential dilution and the barrier effect of the overburden.
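One of the retention effects listed above, linear equilibrium sorption, reduces to a simple retardation factor in the transport equation: the solute moves slower than the pore water by the factor R = 1 + (rho_b / theta) * Kd. A minimal sketch, with illustrative parameter values that are assumptions, not values from the r{sup 3}t report:

```python
# Sketch of linear equilibrium sorption as a retardation factor,
# one retention effect a geosphere transport code must represent.
# All parameter values are illustrative assumptions.

def retardation_factor(rho_b, theta, kd):
    """R for linear equilibrium sorption.

    rho_b: bulk density of the porous medium, kg/L
    theta: effective porosity (dimensionless)
    kd:    distribution coefficient, L/kg
    """
    return 1.0 + (rho_b / theta) * kd

# Hypothetical sandy aquifer and a weakly sorbing contaminant.
R = retardation_factor(rho_b=1.6, theta=0.25, kd=0.5)   # R = 4.2
v_water = 10.0                                           # pore velocity, m/a
v_solute = v_water / R                                   # retarded velocity
```

Kinetically controlled sorption and diffusion into immobile pore water cannot be folded into a single factor like this; they require additional rate equations, which is why a dedicated code is needed for realistic analyses.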

  4. Software Packages to Support Electrical Engineering Virtual Lab

    Directory of Open Access Journals (Sweden)

    Manuel Travassos Valdez

    2012-03-01

Full Text Available The use of Virtual Reality Systems (VRS) as a learning aid encourages the creation of tools that allow users/students to simulate educational environments on a computer. This article presents a way of building a VRS with software packages to support Electrical Engineering virtual laboratories, to be used in the near future in teaching the curriculum unit of Circuit Theory. The steps required for the construction of such a project are presented in this paper. The simulation is still under construction and will use a three-dimensional virtual electrical measurement laboratory environment, which will allow users/students to experiment with and test the modeled equipment. Therefore, there are still no links available for further examination. The result may demonstrate the future potential of Virtual Reality Systems as efficient and cost-effective learning systems.

  5. An interactive software package for validating satellite data

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.; Pankajakshan, T.

Muraleedharan, P.M. & Pankajakshan, T. An interactive software package for validating satellite data. Gayana (Concepción) 68(2): 411-419, 2004. Universidad de Concepción, Facultad de Ciencias Naturales y Oceanográficas.

  6. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. Bridging this gap requires two things: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package meeting these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open and object-oriented, and it has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the limited memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been in use for more than a year and a half by users with different applications. It has proved to be an excellent tool for helping people adapt to the system, and for standardizing and exchanging software while preserving the flexibility to allow users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  7. UES: an optimization software package for power and energy

    International Nuclear Information System (INIS)

    Vohryzek, J.; Havlena, V.; Findejs, J.; Jech, J.

    2004-01-01

Unified Energy Solutions components are designed to meet the specific requirements of electric utilities, industrial power units, and district heating (combined heat and power) plants. The optimization objective is to operate the plant with maximum process efficiency and operational profit under the constraints imposed by the technology and by environmental impacts. Software applications for advanced control and real-time optimization may provide a low-cost, high-return alternative to expensive boiler retrofits for improving operational profit as well as reducing emissions. The Unified Energy Solutions (UES) software package is a portfolio of advanced control and optimization components running on top of the standard process regulatory and control system. The objective of the UES is to operate the plant with the maximum achievable profit (maximum efficiency) under the constraints imposed by the technology (lifetime consumption, asset health) and by environmental impacts (CO and NO x emissions). Fast responsiveness to varying economic conditions and the integration of real-time optimization with (off-line) operator decision support are critical for operation in a real-time economy. Optimization features target the combustion process, the allocation of heat and power load to parallel resources, electric power delivery, and ancillary services. Optimization criteria include increased boiler thermal efficiency, maintenance of emission limits, and economic load allocation among the heat and generation sources. State-of-the-art advanced control algorithms use model-based predictive control principles and provide superior response in transient states. The individual software modules support open control platforms and communication protocols, so UES can be implemented on a wide range of distributed control systems.
Typical achievable benefits include heat and power production cost savings, an increased effective boiler operation range, optimized flue gas emissions, optimized production capacity utilization, optimized

  8. Comparison of four software packages applied to a scattering problem

    DEFF Research Database (Denmark)

    Albertsen, Niels Christian; Chesneaux, Jean-Marie; Christiansen, Søren

    1999-01-01

We investigate characteristic features of four different software packages by applying them to the numerical solution of a non-trivial physical problem in computer simulation, viz., scattering of waves from a sinusoidal boundary. The numerical method used is based on boundary collocation. Two of the packages (based on stochastic and interval arithmetic, respectively) provide an effortless validation of the numerical results obtained in computer simulation, whereby we get a reliable account of the limitations of the numerical methods. (C) 1999 IMACS/Elsevier Science B.V. All rights reserved.

  9. LS-VISM: A software package for analysis of biomolecular solvation.

    Science.gov (United States)

    Zhou, Shenggao; Cheng, Li-Tien; Sun, Hui; Che, Jianwei; Dzubiella, Joachim; Li, Bo; McCammon, J Andrew

    2015-05-30

We introduce a software package for the analysis of biomolecular solvation. The package collects computer codes that implement numerical methods for a variational implicit-solvent model (VISM). The input of the package includes the atomic data of the biomolecules under consideration and macroscopic parameters such as the solute-solvent surface tension, bulk solvent density, ionic concentrations, and dielectric coefficients. The output includes estimated solvation free energies and optimal macroscopic solute-solvent interfaces, obtained by minimizing the VISM solvation free-energy functional among all possible solute-solvent interfaces enclosing the solute atoms. We review the VISM with various descriptions of electrostatics. We also review our numerical methods, which consist mainly of the level-set method for relaxing the VISM free-energy functional and a compact coupling interface method for the dielectric Poisson-Boltzmann equation. These numerical methods and algorithms constitute the central modules of the software package. We detail the structure of the package, the format of input and output files, the workflow of the codes, and the postprocessing of output data. Our demo application to a host-guest system illustrates how to use the package to perform solvation analysis for biomolecules, including ligand-receptor binding systems. The package is simple and flexible, with a minimum of adjustable parameters and a wide range of applications. Future extensions of the package can include the efficient identification of ligand binding pockets on protein surfaces. © 2015 Wiley Periodicals, Inc.

  10. TI Workbench, an integrated software package for electrophysiology and imaging.

    Science.gov (United States)

    Inoue, Takafumi

    2018-03-15

TI Workbench is a software package that serves as a control and analysis center for cellular imaging and electrophysiological experiments. It is unique among general-purpose software packages in that it integrates the control of cellular imaging and electrophysiological devices with sophisticated data analyses, which provides superior usability in imaging experiments combined with electrophysiology. During its development over the last 20 years, the range of supported image acquisition devices has expanded from cooled charge-coupled device (CCD) cameras to multi-photon microscope systems. In this review, I outline the concept of TI Workbench together with its unique functions and features, derived from ideas emerging during daily experiments in my own lab and in those of my collaborators over the last 20 years. TI Workbench includes the standard functions required for time-lapse multicolor fluorescence imaging and electrophysiological experiments, in addition to specialized functions such as random-scan and conventional raster-scan two-photon microscopy packages and fluorescence lifetime imaging (FLIM) utilities. Data analysis modules, e.g. digital data filters for temporal waveforms of time-lapse image data and electrophysiology, filters for 2-D image data, and fluorescence correlation spectroscopy (FCS) analysis functions, are well integrated with the data acquisition functions. A notebook function holds formatted text, graphs, images and movie data together, linked to the actual data files. TI Workbench uses Igor Pro software as a back-end output for publishing. In addition, TI Workbench imports several different formats of image and electrophysiology data, serving as a general-purpose data analysis software package.

  11. AutoPAVER: A software package for automated pavement evaluation

    Science.gov (United States)

    Ginsberg, Mark D.; Shahin, M. Y.; Walther, Jeanette A.

    1990-07-01

    This research developed a method that improves data collection and reduces data entry times for Pavement Condition Index (PCI) surveys for use with PAVER, a pavement maintenance management system. The method, AutoPAVER, is a microcomputer software package used to analyze pictures of pavement surfaces and to forward the resulting analysis to PAVER. The user works interactively with the system to identify and classify pavement distresses. Distress measurement and data entry are done on the computer.

  12. Maximize Your Investment 10 Key Strategies for Effective Packaged Software Implementations

    CERN Document Server

    Beaubouef, Grady Brett

    2009-01-01

    This is a handbook covering ten principles for packaged software implementations that project managers, business owners, and IT developers should pay attention to. The book also has practical real-world coverage including a sample agenda for conducting business solution modeling, customer case studies, and a road map to implement guiding principles. This book is aimed at enterprise architects, development leads, project managers, business systems analysts, business systems owners, and anyone who wants to implement packaged software effectively. If you are a customer looking to implement COTS s

  13. STAR-GENERIS - a software package for information processing

    International Nuclear Information System (INIS)

    Felkel, L.

    1985-01-01

Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants include, e.g., alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering effort. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates the specification and implementation of the engineering know-how necessary for sophisticated operator aids. (orig./HP)

  14. Quantitative evaluation of software packages for single-molecule localization microscopy.

    Science.gov (United States)

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
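Two of the evaluation criteria named above, detection rate and localization accuracy, can be sketched by greedily matching detected localizations to ground-truth positions within a tolerance radius. The coordinates and the 50 nm tolerance below are illustrative assumptions; the paper's actual matching procedure may differ.

```python
# Sketch of SMLM evaluation metrics: detection rate (recall) and RMS
# localization error of matched pairs. All data here is hypothetical.
import math

def evaluate(truth, detected, tol=50.0):
    """Return (detection_rate, rms_error_of_matched_pairs), same units as input."""
    remaining = list(detected)
    matched = []
    for t in truth:
        if not remaining:
            break
        d = min(remaining, key=lambda p: math.dist(t, p))  # nearest detection
        if math.dist(t, d) <= tol:
            matched.append(math.dist(t, d))
            remaining.remove(d)                            # each detection used once
    rate = len(matched) / len(truth)
    rms = math.sqrt(sum(e * e for e in matched) / len(matched)) if matched else 0.0
    return rate, rms

# Hypothetical ground-truth molecules (nm) and detected localizations.
truth    = [(100.0, 100.0), (300.0, 250.0), (500.0, 400.0)]
detected = [(110.0, 100.0), (302.0, 248.0)]        # third molecule missed
rate, rms = evaluate(truth, detected)               # rate = 2/3
```

Unmatched detections (false positives) would feed a precision metric in the same way; the whole-package evaluation in the paper additionally scores image reconstruction quality, resolution, and computational cost.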

  15. PINT, A Modern Software Package for Pulsar Timing

    Science.gov (United States)

    Luo, Jing; Ransom, Scott M.; Demorest, Paul; Ray, Paul S.; Stovall, Kevin; Jenet, Fredrick; Ellis, Justin; van Haasteren, Rutger; Bachetti, Matteo; NANOGrav PINT developer team

    2018-01-01

Pulsar timing, first developed decades ago, has provided an extremely wide range of knowledge about our universe. It has been responsible for many important discoveries, such as the discovery of the first exoplanet and the orbital period decay of double neutron star systems. Currently, pulsar timing is the leading technique for detecting low-frequency (about 10^-9 Hz) gravitational waves (GWs), using an array of pulsars as the detectors. To achieve this goal, high-precision pulsar timing data, at about the nanosecond level, are required. Most high-precision pulsar timing data are analyzed using the widely adopted software TEMPO/TEMPO2. But for a robust and believable GW detection, it is important to have independent software that can cross-check the results. In this poster we present the new-generation pulsar timing software PINT. This package provides a robust system to cross-check high-precision timing results, completely independent of TEMPO and TEMPO2. In addition, PINT is designed to be a package that is easy to extend and modify, through the use of a flexible code architecture and a modern programming language, Python, with modern technology and libraries.

  16. Eval: A software package for analysis of genome annotations

    Directory of Open Access Journals (Sweden)

    Brent Michael R

    2003-10-01

Full Text Available Eval is a flexible tool for analyzing the performance of gene annotation systems. It provides summaries and graphical distributions for many descriptive statistics about any set of annotations, regardless of their source. It also compares sets of predictions to standard annotations and to one another. Input is in the standard Gene Transfer Format (GTF). Eval can be run interactively or via the command line, in which case output options include easily parsable tab-delimited files. Availability: To obtain the module package with documentation, go to http://genes.cse.wustl.edu/ and follow the links for Resources, then Software. Please contact brent@cse.wustl.edu

  17. FRAMES Software System: Linking to the Statistical Package R

    Energy Technology Data Exchange (ETDEWEB)

    Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.

    2006-12-11

    This document provides requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black box testing) actually functions as designed, and QA/QC confirms that the software meets the client’s needs.

  18. Evaluating software architecture using fuzzy formal models

    Directory of Open Access Journals (Sweden)

    Payman Behbahaninejad

    2012-04-01

Full Text Available The Unified Modeling Language (UML) has been recognized as one of the most popular techniques for describing the static and dynamic aspects of software systems. One of the primary issues in designing software packages is the uncertainty associated with such models. Fuzzy-UML describes software architecture from both the static and the dynamic perspective simultaneously. Evaluating a software architecture in the design phase always helps uncover additional requirements, which helps reduce design cost. In this paper, we use a fuzzy data model to describe the static aspects of a software architecture and a fuzzy sequence diagram to illustrate its dynamic aspects. We also transform these diagrams into Petri nets and evaluate the reliability of the architecture. A web-based hotel reservation system is studied as a worked example.

  19. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Science.gov (United States)

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect the business processes and functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR software achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
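The TOPSIS step of the ranking can be sketched as follows: normalize the decision matrix, apply criteria weights, and score each alternative by its relative closeness to the ideal solution. The decision matrix, criteria, and AHP-style weights below are illustrative assumptions, not the study's actual scores.

```python
# Sketch of TOPSIS ranking for candidate software packages.
# Scores, criteria, and weights are hypothetical examples.
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.

    matrix:  alternatives x criteria score matrix
    weights: criteria weights summing to 1 (e.g. derived via AHP)
    benefit: True where larger is better, False for cost criteria
    """
    m = matrix / np.linalg.norm(matrix, axis=0)      # vector-normalize columns
    v = m * weights                                  # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)              # higher = better

# Hypothetical scores for three EMR packages on three criteria:
# usability (benefit), features (benefit), setup effort (cost).
scores = np.array([[8.0, 7.0, 3.0],
                   [6.0, 9.0, 5.0],
                   [5.0, 5.0, 2.0]])
weights = np.array([0.5, 0.3, 0.2])
closeness = topsis(scores, weights, benefit=np.array([True, True, False]))
best = int(np.argmax(closeness))
```

In the integrated AHP-TOPSIS approach, AHP supplies the `weights` vector from pairwise comparisons of criteria; TOPSIS then converts the weighted scores into a single closeness coefficient per package.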

  20. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    Science.gov (United States)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDRs' ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz-bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.

  1. Strategy and Software Application of Fresh Produce Package Design to Attain Optimal Modified Atmosphere

    Directory of Open Access Journals (Sweden)

    Dong Sun Lee

    2014-01-01

Full Text Available Modified atmosphere packaging of fresh produce relies on attaining the desired gas concentration inside the package, which results from product respiration and the package's gas transfer. A systematic package design method to achieve the target modified atmosphere was developed and implemented as software, in terms of selecting the most appropriate film, microperforations, and/or CO2 scavenger. It incorporates modeling and/or database construction for produce respiration, gas transfer across the plastic film and microperforations, and CO2 absorption by the scavenger. The optimization algorithm first selects the packaging film and/or microperforations to attain the target O2 concentration in response to respiration, and then tunes the CO2 concentration with the CO2 absorber when it rises above its tolerance limit. The optimization method, tested on green pepper, strawberry, and king oyster mushroom packages, was shown to be effective for package design, and the results obtained were consistent with literature work and experimental atmospheres.
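The film-selection step rests on a steady-state gas balance: at equilibrium, the O2 flowing in through the film equals the O2 consumed by respiration. A minimal sketch of that balance, with all parameter values as illustrative assumptions rather than values from the paper:

```python
# Sketch of the steady-state O2 balance behind modified atmosphere
# package design. All parameter values are illustrative assumptions.

def steady_state_o2(r_o2, weight, permeance, area, o2_out=20.9):
    """In-package O2 (%) when film influx balances product respiration.

    r_o2:      respiration rate, mL O2 / (kg h)
    weight:    produce weight, kg
    permeance: film O2 permeance, mL / (m^2 h %O2)
    area:      film area, m^2
    """
    # Balance: permeance * area * (o2_out - o2_in) = r_o2 * weight
    o2_in = o2_out - (r_o2 * weight) / (permeance * area)
    return max(o2_in, 0.0)

# Hypothetical package: 0.25 kg of produce respiring at 15 mL/(kg h),
# film permeance 60 mL/(m^2 h %O2), film area 0.05 m^2.
o2 = steady_state_o2(r_o2=15.0, weight=0.25, permeance=60.0, area=0.05)
# 20.9 - 3.75/3.0 = 19.65 %
```

In the paper's design procedure this balance is solved the other way around: the target O2 level is fixed, and the film permeance and microperforation count are chosen to satisfy it, with the CO2 scavenger then tuned separately.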

  2. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

    Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations. They do not capture 3-D data, they are time-consuming and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software method featuring an easy to use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability we deployed four independent researchers to analyse tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%), in contrast to using Osteolytica, which demonstrated minimal variability (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from aged and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses over the entirety of the bone volume (as opposed to selected 2-D images), it
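The ±19.6% versus ±0.5% figures quoted above express how much four users' measurements of the same bones spread around their mean. One plausible reading of that statistic is a coefficient of variation; the paper's exact definition may differ, so this is only a hedged sketch:

```python
import statistics

def inter_user_variability_pct(measurements):
    """Spread of repeated measurements of the same object by several users,
    expressed as +/-% of the mean (sample coefficient of variation).

    Illustrative interpretation of the abstract's variability figures.
    """
    mean = statistics.fmean(measurements)
    return 100.0 * statistics.stdev(measurements) / mean
```

For identical measurements the variability is 0%; the wider the disagreement between users, the larger the percentage.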

  3. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  4. Can I Trust This Software Package? An Exercise in Validation of Computational Results

    Science.gov (United States)

    Shacham, Mordechai; Brauner, Neima; Ashurst, W. Robert; Cutlip, Michael B.

    2008-01-01

    Mathematical software packages such as Polymath, MATLAB, and Mathcad are currently widely used for engineering problem solving. Applications of several of these packages to typical chemical engineering problems have been demonstrated by Cutlip, et al. The main characteristic of these packages is that they provide a "problem-solving environment…

  5. Strategic Business-IT alignment of application software packages: Bridging the Information Technology gap

    Directory of Open Access Journals (Sweden)

    Wandi Kruger

    2012-09-01

    Full Text Available An application software package implementation is a complex endeavour, and as such it requires the proper understanding, evaluation and redefining of the current business processes to ensure that the implementation delivers on the objectives set at the start of the project. Numerous factors exist that may contribute to the unsuccessful implementation of application software packages. However, the most significant contributor to the failure of an application software package implementation lies in the misalignment of the organisation’s business processes with the functionality of the application software package. Misalignment is attributed to a gap that exists between the business processes of an organisation and what functionality the application software package has to offer to translate the business processes of an organisation into digital form when implementing and configuring an application software package. This gap is commonly referred to as the information technology (IT gap. This study proposes to define and discuss the IT gap. Furthermore this study will make recommendations for aligning the business processes with the functionality of the application software package (addressing the IT gap. The end result of adopting these recommendations will be more successful application software package implementations.

  6. IDES: Interactive Data Entry System: a generalized data acquisition software package

    International Nuclear Information System (INIS)

    Gasser, S.B.

    1980-04-01

    The Interactive Data Entry System (IDES) is a software package which greatly assists in designing and storing forms to be used for the directed acquisition of data. The objective of this package is to provide a viable man/machine interface to any comprehensive data base. This report provides a technical description of the software and can be used as a user's manual.

  7. CDIAC catalog of numeric data packages and computer model packages

    International Nuclear Information System (INIS)

    Boden, T.A.; Stoss, F.W.

    1993-05-01

    The Carbon Dioxide Information Analysis Center acquires, quality-assures, and distributes to the scientific community numeric data packages (NDPs) and computer model packages (CMPs) dealing with topics related to atmospheric trace-gas concentrations and global climate change. These packages include data on historic and present atmospheric CO2 and CH4 concentrations, historic and present oceanic CO2 concentrations, historic weather and climate around the world, sea-level rise, storm occurrences, volcanic dust in the atmosphere, sources of atmospheric CO2, plants' response to elevated CO2 levels, sunspot occurrences, and many other indicators of, contributors to, or components of climate change. This catalog describes the packages presently offered by CDIAC, reviews the processes used by CDIAC to assure the quality of the data contained in these packages, notes the media on which each package is available, describes the documentation that accompanies each package, and provides ordering information. Numeric data are available in the printed NDPs and CMPs, in CD-ROM format, and from an anonymous FTP area via the Internet. All CDIAC information products are available at no cost.

  8. US Army Radiological Bioassay and Dosimetry: The RBD software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.; Ward, R.C.; Maddox, L.B.

    1993-01-01

    The RBD (Radiological Bioassay and Dosimetry) software package was developed for the U. S. Army Material Command, Arlington, Virginia, to demonstrate compliance with the radiation protection guidance 10 CFR Part 20 (ref. 1). Designed to be run interactively on an IBM-compatible personal computer, RBD consists of a data base module to manage bioassay data and a computational module that incorporates algorithms for estimating radionuclide intake from either acute or chronic exposures based on measurement of the worker's rate of excretion of the radionuclide or the retained activity in the body. In estimating the intake, RBD uses a separate file for each radionuclide containing parametric representations of the retention and excretion functions. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent. For a given nuclide, if measurements exist for more than one type of assay, an auxiliary module, REPORT, estimates the intake by applying weights assigned in the nuclide file for each assay. Bioassay data and computed results (estimates of intake and committed dose equivalent) are stored in separate data bases, and the bioassay measurements used to compute a given result can be identified. The REPORT module creates a file containing committed effective dose equivalent for each individual that can be combined with the individual's external exposure.
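The intake-estimation scheme described above reduces to two steps: divide each bioassay measurement by the model-predicted retained (or excreted) fraction at the measurement time, then, when several assay types exist, combine the per-assay estimates with the weights from the nuclide file. A schematic sketch (invented names; not RBD's actual code):

```python
def estimate_intake(measured_activity, model_fraction):
    """Acute-intake estimate: measured activity divided by the retention
    or excretion function evaluated at the measurement time.

    Schematic of the approach in the abstract, not RBD's implementation.
    """
    return measured_activity / model_fraction

def weighted_intake(estimates_and_weights):
    """Combine per-assay intake estimates using assigned weights,
    as the REPORT module is described as doing."""
    total_w = sum(w for _, w in estimates_and_weights)
    return sum(e * w for e, w in estimates_and_weights) / total_w
```

For example, 50 Bq measured when the model predicts 25% retention implies a 200 Bq intake; a second assay's estimate would then be folded in via its weight.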

  9. US Army Radiological Bioassay and Dosimetry: The RBD software package

    Energy Technology Data Exchange (ETDEWEB)

    Eckerman, K. F.; Ward, R. C.; Maddox, L. B.

    1993-01-01

    The RBD (Radiological Bioassay and Dosimetry) software package was developed for the U. S. Army Material Command, Arlington, Virginia, to demonstrate compliance with the radiation protection guidance 10 CFR Part 20 (ref. 1). Designed to be run interactively on an IBM-compatible personal computer, RBD consists of a data base module to manage bioassay data and a computational module that incorporates algorithms for estimating radionuclide intake from either acute or chronic exposures based on measurement of the worker's rate of excretion of the radionuclide or the retained activity in the body. In estimating the intake, RBD uses a separate file for each radionuclide containing parametric representations of the retention and excretion functions. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent. For a given nuclide, if measurements exist for more than one type of assay, an auxiliary module, REPORT, estimates the intake by applying weights assigned in the nuclide file for each assay. Bioassay data and computed results (estimates of intake and committed dose equivalent) are stored in separate data bases, and the bioassay measurements used to compute a given result can be identified. The REPORT module creates a file containing committed effective dose equivalent for each individual that can be combined with the individual's external exposure.

  10. Software package for integrated data processing for internal dose assessment in nuclear medicine (SPRIND).

    Science.gov (United States)

    Visser, Eric; Postema, Ernst; Boerman, Otto; Visschers, Jeroen; Oyen, Wim; Corstens, Frans

    2007-03-01

    Internal radiation dose calculations are normally carried out using the Medical Internal Radiation Dose (MIRD) schema. This requires residence times of radiopharmaceutical activity and S-values for all organs of interest. Residence times can be obtained by quantitative nuclear imaging modalities. For dealing with S-values, the freeware packages MIRDOSE and, more recently, OLINDA/EXM are available. However, these software packages do not calculate residence times from image data. For this purpose, we developed an IDL-based software package for integrated data processing for internal dose assessment in nuclear medicine (SPRIND). SPRIND allows reading and viewing of planar whole-body scintigrams. Organ and background regions of interest (ROIs) can be drawn and are automatically mirrored from the anterior to the posterior view. ROI statistics are used to obtain anterior-posterior averaged counts for each organ, corrected for background activity and attenuation. Residence times for each organ are calculated based on effective decay. The total body biological half-time is calculated for use in the voiding bladder model. Red bone marrow absorbed dose can be calculated using bone regions in the scintigrams or by a blood-derived method. Finally, the results are written to a file in MIRDOSE-OLINDA/EXM format. Using scintigrams in DICOM, the complete analysis is gamma camera vendor independent, and can be performed on any computer using an IDL virtual machine. SPRIND is an easy-to-use software package for radiation dose assessment studies. It has made these studies less time consuming and less error prone.
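The "anterior-posterior averaged counts ... corrected for background activity and attenuation" step described above is the standard conjugate-view geometric-mean method. A minimal sketch of that correction, with invented parameter names (SPRIND's exact scheme may differ in detail):

```python
import math

def conjugate_view_counts(anterior, posterior, bg_anterior, bg_posterior,
                          transmission_factor):
    """Background-subtracted geometric mean of anterior and posterior organ
    counts, divided by sqrt of the measured body transmission to correct
    for attenuation (classic conjugate-view quantification).
    """
    a = max(anterior - bg_anterior, 0.0)
    p = max(posterior - bg_posterior, 0.0)
    return math.sqrt(a * p) / math.sqrt(transmission_factor)

corrected = conjugate_view_counts(1200.0, 700.0, 200.0, 300.0, 0.25)
```

From such corrected counts at several time points, the residence time follows by fitting an effective-decay curve and integrating it, as the abstract outlines.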

  11. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software. They were developed mainly on the basis of Bayesian or frequentist theory. Most types of software are easy to operate and master, calculate accurately, or produce excellent graphics. However, no single software performed accurate calculations with superior graphing; this could only be achieved by combining two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial means; the combination of BUGS and R (or Stata) can then be considered for performing the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  12. User’s Manual for the Simulation of Energy Consumption and Emissions from Rail Traffic Software Package

    DEFF Research Database (Denmark)

    Cordiero, Tiago M.; Lindgreen, Erik Bjørn Grønning; Sorenson, Spencer C

    2005-01-01

    The ARTEMIS rail emissions model was implemented in a Microsoft Excel software package that includes data from the GISCO database on railway traffic. This report is the user’s manual for the aforementioned software; it includes information on how to run the program and an overview of how the program is structured in terms of Excel Macros (Visual Basic) and database sheets included in one Excel file.

  13. Installing python software packages : the good, the bad and the ugly.

    Energy Technology Data Exchange (ETDEWEB)

    Hart, William Eugene

    2010-11-01

    These slides describe different strategies for installing Python software. Although I am a big fan of Python software development, robust strategies for software installation remain a challenge. This talk describes several different installation scenarios. The Good: the user has administrative privileges - Installing on Windows with an installer executable, Installing with a Linux application utility, Installing a Python package from the PyPI repository, and Installing a Python package from source. The Bad: the user does not have administrative privileges - Using a virtual environment to isolate package installations, and Using an installer executable on Windows with a virtual environment. The Ugly: the user needs to install an extension package from source - Installing a Python extension package from source, and PyCoinInstall - Managing builds for Python extension packages. The last item, PyCoinInstall, is a utility being developed for the COIN-OR software, which is used within the operations research community. COIN-OR includes a variety of Python and C++ software packages, and this script uses a simple plug-in system to support the management of package builds and installation.

  14. ATK-ForceField: a new generation molecular dynamics software package

    Science.gov (United States)

    Schneider, Julian; Hamaekers, Jan; Chill, Samuel T.; Smidstrup, Søren; Bulin, Johannes; Thesen, Ralph; Blom, Anders; Stokbro, Kurt

    2017-12-01

    ATK-ForceField is a software package for atomistic simulations using classical interatomic potentials. It is implemented as a part of the Atomistix ToolKit (ATK), which is a Python programming environment that makes it easy to create and analyze both standard and highly customized simulations. This paper focuses on the atomic interaction potentials, molecular dynamics, and geometry optimization features of the software; however, many more advanced modeling features are available. The implementation details of these algorithms and their computational performance are shown. We present three illustrative examples of the types of calculations that are possible with ATK-ForceField: modeling thermal transport properties in a silicon germanium crystal, vapor deposition of selenium molecules on a selenium surface, and a simulation of creep in a copper polycrystal.
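At the heart of any classical molecular dynamics engine such as the one described above is a time integrator; velocity Verlet is the standard choice. A generic one-particle, one-dimensional sketch of the update (purely illustrative; ATK-ForceField's production integrators are vectorized and far more general):

```python
def velocity_verlet_step(x, v, force, mass, dt):
    """One velocity-Verlet step: update position with current force,
    then update velocity with the average of old and new forces.

    `force` is a callable mapping position -> force (1-D toy case).
    """
    f0 = force(x)
    x_new = x + v * dt + 0.5 * (f0 / mass) * dt * dt
    f1 = force(x_new)
    v_new = v + 0.5 * (f0 + f1) / mass * dt
    return x_new, v_new

# one step of a unit-mass harmonic oscillator (F = -x), dt = 0.1
x1, v1 = velocity_verlet_step(1.0, 0.0, lambda x: -x, 1.0, 0.1)
```

The two-stage velocity update is what gives the scheme its good energy conservation over long trajectories, which matters for the thermal-transport and creep simulations the abstract mentions.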

  15. XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.

    Science.gov (United States)

    Ching, Daniel J; Gürsoy, Doğa

    2017-03-01

    The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  16. WinTRAX: A raytracing software package for the design of multipole focusing systems

    Science.gov (United States)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.

  17. Accuracy of noninvasive coronary stenosis quantification of different commercially available dedicated software packages.

    Science.gov (United States)

    Dikkers, Riksta; Willems, Tineke P; de Jonge, Gonda J; Marquering, Henk A; Greuter, Marcel J W; van Ooijen, Peter M A; van der Weide, Marijke C Jansen; Oudkerk, Matthijs

    2009-01-01

    The purpose of this study was to investigate the noninvasive quantification of coronary artery stenosis using cardiac software packages and vessel phantoms with known stenosis severity. Four different sizes of vessel phantoms were filled with contrast agent and scanned on a 64-slice multidetector computed tomography scanner. Diameter and area stenosis were evaluated by 2 observers, blinded to the true measures, using 5 different software packages. Measurements were compared with the true measures of the vessel phantoms. The absolute difference in stenosis measurements and the intraobserver and interobserver variabilities were assessed. All software packages show a trend toward larger differences for the smaller vessel phantoms. The absolute difference of the automatic measurements was significantly higher compared with that of the manual measurements in all 5 evaluated software packages for all vessel phantoms (P < 0.05). Manual stenosis measurements are significantly more accurate than automatic measurements, and therefore, manual adjustments remain essential for noninvasive assessment of coronary artery stenosis.
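The diameter and area stenosis figures being compared above follow the conventional definitions: percent reduction of the lumen diameter, and percent reduction of the (circular) lumen cross-sectional area. A small sketch of those formulas (the evaluated software packages compute them from segmented lumens, not from two diameters as here):

```python
def diameter_stenosis_pct(reference_diameter, stenotic_diameter):
    """Percent diameter stenosis: relative narrowing of the lumen diameter."""
    return 100.0 * (1.0 - stenotic_diameter / reference_diameter)

def area_stenosis_pct(reference_diameter, stenotic_diameter):
    """Percent area stenosis for circular lumens: relative area reduction."""
    return 100.0 * (1.0 - (stenotic_diameter / reference_diameter) ** 2)
```

Note that a 50% diameter stenosis corresponds to a 75% area stenosis, which is why the two metrics must not be mixed when comparing packages against phantom ground truth.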

  18. Development of a new control software package for Pakistan Research Reactor-2

    International Nuclear Information System (INIS)

    Qazi, M.K.

    1993-05-01

    The development of a new control software package for Pakistan Research Reactor-2 is presented. The software operates in different modes, comprising surveillance, pre-operational self-tests, operator, supervisor and robotic control. The control logic critically damps the system, minimizing power overshoots. The software handles multiple abnormal conditions, provides elaborate access control and maintains startup/shutdown records. The report describes the functional details and covers the operational aspects of the new control software. (author)

  19. Software Frameworks for Model Composition

    Directory of Open Access Journals (Sweden)

    Mikel D. Petty

    2014-01-01

    Full Text Available A software framework is an architecture or infrastructure intended to enable the integration and interoperation of software components. Specialized types of software frameworks are those specifically intended to support the composition of models or other components within a simulation system. Such frameworks are intended to simplify the process of assembling a complex model or simulation system from simpler component models as well as to promote the reuse of the component models. Several different types of software frameworks for model composition have been designed and implemented; those types include common library, product line architecture, interoperability protocol, object model, formal, and integrative environment. The various framework types have different components, processes for composing models, and intended applications. In this survey the fundamental terms and concepts of software frameworks for model composition are presented, the different types of such frameworks are explained and compared, and important examples of each type are described.

  20. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    Science.gov (United States)

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, amongst other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
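Stripped to its core, the "statistically representative evaluation" above is Monte-Carlo estimation: draw many random sensor/target/atmosphere scenarios and record the fraction in which a DRI threshold is met. A generic sketch of that loop (illustrative only; MORTICIA's scenario rendering and detection criteria are far richer):

```python
import random

def detection_probability(trials, detect_fn, seed=0):
    """Monte-Carlo estimate of the probability that a randomly drawn
    scenario results in detection.

    `detect_fn` takes an RNG, samples one scenario, and returns True
    on detection; the seed makes the estimate reproducible.
    """
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if detect_fn(rng))
    return hits / trials

# toy scenario model: detection succeeds in 30% of atmospheric draws
p = detection_probability(20000, lambda rng: rng.random() < 0.3, seed=1)
```

With enough trials the estimate converges on the true detection probability, and the same loop yields Recognise and Identify probabilities by swapping the criterion inside `detect_fn`.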

  1. Western aeronautical test range real-time graphics software package MAGIC

    Science.gov (United States)

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers-scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  2. User's guide to EAGLES Version 1.1: An electric- and gasoline-vehicle fuel-efficiency software package

    Science.gov (United States)

    Marr, W. W.

    1995-01-01

    EAGLES is an interactive microcomputer software package for the analysis of fuel efficiency in electric-vehicle (EV) applications or the estimation of fuel economy for a gasoline vehicle. The principal objective of the EV analysis is to enable the prediction of EV performance on the basis of laboratory test data for batteries. The EV model included in the software package provides a second-by-second simulation of battery voltage and current for any specified vehicle velocity/time or power/time profile. The capability of the battery is modeled by an algorithm that relates the battery voltage to the withdrawn (or charged) current, taking into account the effect of battery depth-of-discharge. Alternatively, the software package can be used to determine the size of the battery needed to satisfy given vehicle mission requirements. For gasoline vehicles, a generic fuel-economy model based on data from EPA Test Car List 1991 is included in the software package. For both types of vehicles, effects of heating/cooling loads on vehicle performance, including range penalty for EVs, can be studied. Also available is an option to estimate the time needed by a specified vehicle to reach a certain speed with the application of a constant power and an option to compute the fraction of time and/or distance in a driving cycle at speeds exceeding a specified value. Certain parameters can be changed interactively prior to a run.
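The abstract describes an algorithm that "relates the battery voltage to the withdrawn (or charged) current, taking into account the effect of battery depth-of-discharge". A toy sketch of that idea, with an invented depth-of-discharge penalty (the coefficients and functional form here are assumptions, not the EAGLES model):

```python
def battery_voltage(open_circuit_v, internal_resistance_ohm, current_a,
                    depth_of_discharge):
    """Terminal voltage under load: open-circuit voltage minus an ohmic sag
    that grows as the battery discharges.

    Toy model for illustration; depth_of_discharge is a fraction in [0, 1]
    and the (1 + DoD) resistance scaling is invented.
    """
    sag = internal_resistance_ohm * (1.0 + depth_of_discharge) * current_a
    return open_circuit_v - sag

# a half-discharged 120 V pack delivering 100 A through 0.05 ohm
v = battery_voltage(120.0, 0.05, 100.0, 0.5)
```

A second-by-second EV simulation of the kind described would evaluate this relation at each time step of the velocity/time or power/time profile, updating the depth of discharge as charge is withdrawn.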

  3. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
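The "Gambler's ruin" investment-risk tool named above has a classical closed form: the probability of exhausting a bankroll of i units before reaching a target of N units, winning each one-unit bet with probability p. A sketch of that textbook result (the DOE software's actual risk model may be more elaborate):

```python
def ruin_probability(p_win, start_units, target_units):
    """Classical gambler's-ruin probability: chance of losing `start_units`
    before reaching `target_units`, with per-bet win probability p_win.

    Uses r = q/p; the fair-game case (p = 0.5) is the linear limit.
    """
    if p_win == 0.5:
        return 1.0 - start_units / target_units
    r = (1.0 - p_win) / p_win
    return (r ** start_units - r ** target_units) / (1.0 - r ** target_units)
```

For a fair game starting with 2 of a 10-unit target, ruin occurs with probability 0.8; an unfavourable per-bet probability pushes ruin toward certainty, which is the intuition the screening tool conveys to producers weighing drilling prospects.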

  4. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    Science.gov (United States)

    Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei

    2014-01-01

    Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
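WormLifespan's trick of counting moving worms "by using two time-lapse images" boils down to differencing the frames and counting the resulting regions of change. A deliberately simplified 1-D sketch of that idea (QuantWorm works on real 2-D images with proper segmentation; this only shows the differencing-and-counting principle):

```python
def count_moving_blobs(frame_a, frame_b, threshold):
    """Count contiguous runs of pixels that changed between two frames.

    Each run of changed pixels stands in for one moving worm in this
    toy 1-D version; frames are equal-length sequences of intensities.
    """
    changed = [abs(a - b) > threshold for a, b in zip(frame_a, frame_b)]
    blobs = 0
    prev = False
    for c in changed:
        if c and not prev:  # a new run of changed pixels begins
            blobs += 1
        prev = c
    return blobs
```

Pixels unchanged between the two exposures (dead or paralyzed worms, background) contribute nothing, so the blob count tracks only the animals that moved.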

  5. Documentation package for the RFID temperature monitoring system (Model 9977 packages at NTS)

    International Nuclear Information System (INIS)

    Chen, K.; Tsai, H.

    2009-01-01

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval to a maximum of five years is based on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory to develop a Radio-Frequency Identification (RFID) temperature monitoring system for use by the facility personnel at DAF/NTS. The RFID temperature monitoring system consists of the Mk-1 RFID tags, a reader, and a control computer mounted on a mobile platform that can operate as a stand-alone system, or it can be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and be certified for operation of the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS. The documentation package will be used for training and certification. The contents are: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The

  6. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in version 2.0 (Endrizzi et al., 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes version 2.0, scientifically tested and published, as its starting point. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity carried out on the package. To keep track of every single change, the package is published in its own GitHub repository, geotopmodel.github.io/geotop/, under the GPL v3.0 license. A Continuous Integration mechanism has been enabled by means of Travis-CI on the repository's master and main development branches. The usage of CMake configuration tool

  7. Bioinactivation: Software for modelling dynamic microbial inactivation.

    Science.gov (United States)

    Garre, Alberto; Fernández, Pablo S; Lindqvist, Roland; Egea, Jose A

    2017-03-01

    This contribution presents the bioinactivation software, which implements functions for the modelling of isothermal and non-isothermal microbial inactivation. The software offers features such as user-friendliness, modelling of dynamic conditions, the possibility to choose the fitting algorithm, and the generation of prediction intervals. It is offered in two different formats: Bioinactivation core and Bioinactivation SE. Bioinactivation core is a package for the R programming language, which includes features for the generation of predictions and for the fitting of models to inactivation experiments using non-linear regression or a Markov Chain Monte Carlo (MCMC) algorithm. The calculations are based on inactivation models common in academia and industry (Bigelow, Peleg, Mafart and Geeraerd). Bioinactivation SE supplies a user-friendly interface to selected functions of Bioinactivation core, namely the model fitting of non-isothermal experiments and the generation of prediction intervals. The capabilities of bioinactivation are presented in this paper through a case study modelling the non-isothermal inactivation of Bacillus sporothermodurans. This study provides a full characterization of the response of the bacteria to dynamic temperature conditions, including confidence intervals for the model parameters and a prediction interval of the survivor curve. We conclude that the MCMC algorithm produces a better characterization of the biological uncertainty and variability than non-linear regression. The bioinactivation software can be relevant to the food and pharmaceutical industries, as well as to regulatory agencies, as part of a (quantitative) microbial risk assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
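    The primary and secondary model structure implemented by such packages can be sketched in a few lines. The example below is an illustrative Euler integration of the Bigelow model (one of the four models named above) under a user-supplied temperature profile; it is not code from the bioinactivation package, and all parameter values are invented:

```python
def bigelow_dynamic(logN0, D_ref, z, T_ref, temp_profile, dt):
    """Euler integration of d(log10 N)/dt = -1/D(T(t)), where the secondary
    Bigelow model gives D(T) = D_ref * 10**((T_ref - T) / z)."""
    logN = logN0
    for T in temp_profile:
        D = D_ref * 10 ** ((T_ref - T) / z)  # D-value at temperature T
        logN -= dt / D                       # log-linear survivor decline
    return logN

# Isothermal sanity check: 2 min held at T_ref with D_ref = 2 min
# should give exactly one log reduction (6.0 -> 5.0).
log_final = bigelow_dynamic(logN0=6.0, D_ref=2.0, z=10.0, T_ref=120.0,
                            temp_profile=[120.0] * 200, dt=0.01)
```

    Feeding a non-isothermal profile into the same loop yields a dynamic survivor curve of the kind the package fits by non-linear regression or MCMC.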

  8. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  9. Effective organizational solutions for implementation of DBMS software packages

    Science.gov (United States)

    Jones, D.

    1984-01-01

    The Space Telescope management information system development effort serves as a guideline for discussing effective organizational solutions used in implementing DBMS software. The focus is on the importance of strategic planning. The value of constructing an information system architecture that conforms to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring of production software control, production data control, and software enhancements are also discussed.

  10. A software package for the full GBTX lifecycle

    CERN Document Server

    Feger, S; Marin, M Barros; Leitao, P; Moreira, P; Porret, D; Wyllie, K

    2015-01-01

    This work presents the software environment surrounding the GBTX, a high-speed bidirectional ASIC implementing radiation-hard optical links for high-energy physics experiments. With more than 300 8-bit configuration registers, it poses challenges that are addressed by a wide variety of software components. This paper focuses on the software used for characterization as well as radiation and production testing of the GBTX. It also highlights tools made available to designers and users, enabling them to create customized configurations. Finally, the paper shows how data storage over the full GBTX lifecycle is planned, to ensure good quality tracking of the devices.
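    To illustrate the kind of configuration tooling this implies, the sketch below packs named bit-fields into an array of 8-bit registers. The layout dictionary, field names and register count are hypothetical and do not reproduce the real GBTX register map:

```python
def pack_registers(fields, layout, n_regs=512):
    """Pack named bit-fields into an array of 8-bit registers.

    `layout` maps a field name to (register_index, lsb, width); the register
    count and all names here are illustrative, not the GBTX's actual map.
    """
    regs = [0] * n_regs
    for name, value in fields.items():
        reg, lsb, width = layout[name]
        if value >= (1 << width):
            raise ValueError(f"{name} does not fit in {width} bits")
        regs[reg] |= (value & ((1 << width) - 1)) << lsb
    return regs

# Hypothetical example: a 4-bit 'phase' field and a 1-bit 'enable' flag
# sharing register 3.
regs = pack_registers({"phase": 5, "enable": 1},
                      {"phase": (3, 0, 4), "enable": (3, 7, 1)})
```
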

  11. Calculation of the relative metastabilities of proteins using the CHNOSZ software package

    Directory of Open Access Journals (Sweden)

    Dick Jeffrey M

    2008-10-01

    Full Text Available Abstract Background Proteins of various compositions are required by organisms inhabiting different environments. The energetic demands for protein formation are a function of the compositions of proteins as well as geochemical variables including temperature, pressure, oxygen fugacity and pH. The purpose of this study was to explore the dependence of metastable equilibrium states of protein systems on changes in the geochemical variables. Results A software package called CHNOSZ, implementing the revised Helgeson-Kirkham-Flowers (HKF) equations of state and group additivity for ionized unfolded aqueous proteins, was developed. The program can be used to calculate standard molal Gibbs energies and other thermodynamic properties of reactions and to make chemical speciation and predominance diagrams that represent the metastable equilibrium distributions of proteins. The approach takes account of the chemical affinities of reactions in open systems characterized by the chemical potentials of basis species. The thermodynamic database included with the package permits application of the software to mineral and other inorganic systems as well as systems of proteins or other biomolecules. Conclusion Metastable equilibrium activity diagrams were generated for model cell-surface proteins from archaea and bacteria adapted to growth in environments that differ in temperature and chemical conditions. The predicted metastable equilibrium distributions of the proteins can be compared with the optimal growth temperatures of the organisms and with geochemical variables. The results suggest that a thermodynamic assessment of protein metastability may be useful for integrating bio- and geochemical observations.
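    The chemical affinity underlying such predominance calculations follows from standard thermodynamics, A = 2.303 RT (log K − log Q), vanishing at equilibrium. The helper below is a generic illustration of that relation, not the CHNOSZ API:

```python
import math

R = 8.314462618  # gas constant, J K^-1 mol^-1

def affinity_joules(logK, logQ, T_kelvin):
    """Chemical affinity A = 2.303*R*T*(logK - logQ) per mole of reaction;
    positive A means the reaction is thermodynamically favored as written."""
    return math.log(10) * R * T_kelvin * (logK - logQ)

# At equilibrium (logQ == logK) the affinity is zero.
A_eq = affinity_joules(logK=-3.0, logQ=-3.0, T_kelvin=298.15)
```
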

  12. Analysis of software for modeling atmospheric dispersion

    International Nuclear Information System (INIS)

    Grandamas, O.; Hubert, Ph.; Pages, P.

    1989-09-01

    In recent years, a number of software packages for microcomputers have appeared that aim to simulate the dispersion of atmospheric pollutants. These codes, which simplify the models used for the safety analyses of industrial plants, are becoming more useful and are even applied to post-accident conditions. The report presents, for the first time and in a critical manner, the principal models available to date. The problem lies in adapting the models to the demands of post-accident intervention. In parallel with this work, a performance analysis was carried out, i.e. identifying the need to forecast the most appropriate actions to take, given the short time available and the lack of information. Because of these difficulties, it is possible to simplify the software so that it does not include all the options but can deal with a specific situation. This would minimise the data to be collected on site. [fr]
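    Simplified dispersion codes of this kind are typically built around the standard Gaussian plume solution; a minimal sketch of that textbook formula, including the ground-reflection term, is:

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration for a point source.

    Q: emission rate (g/s); u: wind speed (m/s); y, z: crosswind and
    vertical receptor coordinates (m); H: effective stack height (m);
    sigma_y, sigma_z: dispersion coefficients (m) at the receptor's
    downwind distance (normally taken from stability-class curves).
    """
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2)) +
                math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))  # ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```
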

  13. QuickDirect - Payload Control Software Template Package Project

    Data.gov (United States)

    National Aeronautics and Space Administration — To address the need to quickly, cost-effectively and reliably develop software to control science instruments deployed on spacecraft, QuickFlex proposes to create a...

  14. Laboratory Connections. Commercial Interfacing Packages: Part II: Software and Applications.

    Science.gov (United States)

    Powers, Michael H.

    1989-01-01

    Describes the titration of a weak base with a strong acid and subsequent treatment of experimental data using commercially available software. Provides a BASIC program for determining first and second derivatives of data input. Lists 6 references. (YP)
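    A present-day equivalent of such a BASIC routine takes only a few lines; the sketch below computes central-difference first and second derivatives of pH versus titrant volume, with the equivalence point lying near the maximum of dpH/dV:

```python
def derivatives(volumes, pH):
    """Central-difference first and second derivatives of pH with respect
    to titrant volume at each interior data point (the second-derivative
    formula assumes near-uniform volume increments)."""
    d1, d2 = [], []
    for i in range(1, len(volumes) - 1):
        d1.append((pH[i + 1] - pH[i - 1]) / (volumes[i + 1] - volumes[i - 1]))
        d2.append((pH[i + 1] - 2 * pH[i] + pH[i - 1]) /
                  ((volumes[i + 1] - volumes[i]) * (volumes[i] - volumes[i - 1])))
    return d1, d2
```

    The equivalence point can then be read off where the first derivative peaks, or where the second derivative changes sign.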

  15. BEANS - a software package for distributed Big Data analysis

    Science.gov (United States)

    Hypki, Arkadiusz

    2018-03-01

    BEANS is a new web-based tool, easy to install and maintain, for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software answers a growing need in the astronomical community for a versatile tool to store and analyse complex astrophysical numerical simulations and compare them with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software is built in a general form and is ready for use in any other research field. It can also serve as a building block for other open source software.

  16. QuickDirect - Payload Control Software Template Package, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — To address the need to quickly, cost-effectively and reliably develop software to control science instruments deployed on spacecraft, QuickFlex proposes to create a...

  17. Investigating the effects of different factors on development of open source enterprise resources planning software packages

    Directory of Open Access Journals (Sweden)

    Mehdi Ghorbaninia

    2014-08-01

    Full Text Available This paper investigates the effects of different factors on the development of open source enterprise resource planning software packages. The study designs a questionnaire on a Likert scale and distributes it among 210 experts in the field of open source software package development. Cronbach's alpha has been calculated as 0.93, which is well above the minimum acceptable level. Using Pearson correlation as well as stepwise regression analysis, the study identifies the three most important groups of factors: fundamental issues, and issues during and after the implementation of open source software. The study also finds a positive and strong relationship between the fundamental factors and the after-implementation factors (r = 0.9006, Sig. = 0.000).
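    The reliability figure quoted above is Cronbach's alpha, α = k/(k−1)·(1 − Σ σᵢ² / σ²_total). A minimal illustration of the computation, using toy data unrelated to the study's questionnaire, is:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire: `items` is a list of columns,
    one list of respondent scores per item (population variances used)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score per respondent across all items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Two identical items measure the same thing perfectly, so alpha = 1.
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
```
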

  18. Modeling software systems by domains

    Science.gov (United States)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  19. The quality and testing PH-SFT infrastructure for the external LHC software packages deployment

    CERN Multimedia

    CERN. Geneva; MENDEZ LORENZO, Patricia; MATO VILA, Pere

    2015-01-01

    The PH-SFT group is responsible for the build, test, and deployment of the set of external software packages used by the LHC experiments. This set includes ca. 170 packages, including Grid packages and Monte Carlo generators, provided in different versions. A complete build structure has been established to guarantee the quality of the packages provided by the group. This structure includes an experimental build and three daily nightly builds, each dedicated to a specific ROOT version: v6.02, v6.04, and the master. While the former build is dedicated to the testing of new packages, versions and dependencies (basically for SFT internal use), the latter three are responsible for the deployment to AFS of the set of stable and well-tested packages requested by the LHC experiments, so they can apply their own builds on top. In all cases, a c...

  20. Dynamic modelling of packaging material flow systems.

    Science.gov (United States)

    Tsiliyannis, Christos A

    2005-04-01

    A dynamic model has been developed for reused and recycled packaging material flows. It allows a rigorous description of the flows and stocks during the transition to new targets imposed by legislation, by product demand variations or even by variations in consumer discard behaviour. Given the annual reuse and recycle frequency and the packaging lifetime, the model determines all packaging flows (e.g., consumption and reuse) and the variables through which environmental policy is formulated, such as recycling, waste and reuse rates, and it identifies the minimum number of variables to be surveyed for complete packaging flow monitoring. Simulation of the transition to the new flow conditions is given for flows of packaging materials in Greece, based on 1995-1998 field inventory and statistical data.
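    As a heavily simplified illustration of the kind of year-by-year balance such a model tracks (this toy recursion is not the published model, and all rates are invented):

```python
def simulate_flows(demand, reuse_rate, recycle_rate):
    """Toy annual balance for a reusable/recyclable packaging stream:
    each year's filled packages either return for reuse the next year or
    are discarded, and discards split into recycled and landfilled waste."""
    reused_pool = 0.0
    history = []
    for d in demand:
        history.append({
            "new": max(d - reused_pool, 0.0),   # newly produced packages
            "reused": min(reused_pool, d),      # refilled returns
            "recycled": d * (1 - reuse_rate) * recycle_rate,
            "waste": d * (1 - reuse_rate) * (1 - recycle_rate),
        })
        reused_pool = d * reuse_rate            # returns available next year
    return history

# Constant demand of 100 packages/year, 50% reused, 40% of discards recycled.
flows = simulate_flows([100.0, 100.0], reuse_rate=0.5, recycle_rate=0.4)
```

    Changing the reuse or recycle rates mid-series mimics the transition to new legislated targets that the published model analyses rigorously.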

  1. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  2. Energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-01-01

    Full Text Available The construction industry has turned to energy modelling in order to assist them in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...

  3. A process control software package for the SRS

    International Nuclear Information System (INIS)

    Atkins, V.R.; Poole, D.E.; Rawlinson, W.R.

    1980-03-01

    The development of software to give high-level access from application programs for monitoring and control of the Daresbury Synchrotron Radiation Source on a network-wide basis is described. The design and implementation of the control system database, a special supervisor call, and 'executive'-type task handling of all process input/output services for the 7/32 (which runs under OS/32-MT), as well as process control 'device driver' software for the 7/16 (run under OS/16-MT), are included. (UK)

  4. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposing needs is to use a software quality model to approximate the risk before releasing a program, so as to deliver only software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
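    The release-gating idea can be illustrated with a toy logistic risk score; the metrics and coefficients below are invented for illustration and are not taken from the models evaluated in the article:

```python
import math

def release_gate(churn_lines, past_defects, threshold=0.5):
    """Map simple history metrics to a [0, 1] risk estimate via a logistic
    function, and gate the release on an agreed risk threshold.
    The coefficients are arbitrary placeholders, not fitted values."""
    score = 0.002 * churn_lines + 0.3 * past_defects - 2.0
    risk = 1.0 / (1.0 + math.exp(-score))
    return risk, risk < threshold  # (estimated risk, OK to release?)
```
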

  5. The art of software modeling

    CERN Document Server

    Lieberman, Benjamin A

    2007-01-01

    Modeling complex systems is a difficult challenge and all too often one in which modelers are left to their own devices. Using a multidisciplinary approach, The Art of Software Modeling covers theory, practice, and presentation in detail. It focuses on the importance of model creation and demonstrates how to create meaningful models. Presenting three self-contained sections, the text examines the background of modeling and frameworks for organizing information. It identifies techniques for researching and capturing client and system information and addresses the challenges of presenting models to specific audiences. Using concepts from art theory and aesthetics, this broad-based approach encompasses software practices, cognitive science, and information presentation. The book also looks at perception and cognition of diagrams, view composition, color theory, and presentation techniques. Providing practical methods for investigating and organizing complex information, The Art of Software Modeling demonstrate...

  6. Evaluation of open source data mining software packages

    Science.gov (United States)

    Bonnie Ruefenacht; Greg Liknes; Andrew J. Lister; Haans Fisk; Dan Wendt

    2009-01-01

    Since 2001, the USDA Forest Service (USFS) has used classification and regression-tree technology to map USFS Forest Inventory and Analysis (FIA) biomass, forest type, forest type groups, and National Forest vegetation. This prior work used Cubist/See5 software for the analyses. The objective of this project, sponsored by the Remote Sensing Steering Committee (RSSC),...

  7. Open Source Scanning Probe Microscopy Control Software Package Gxsm

    Energy Technology Data Exchange (ETDEWEB)

    Zahl P.; Wagner, T.; Moller, R.; Klust, A.

    2009-08-10

    Gxsm is a full-featured and modern scanning probe microscopy (SPM) software package. It can be used for powerful multidimensional image/data processing, analysis, and visualization. Connected to an instrument, it operates many different flavors of SPM, e.g., scanning tunneling microscopy (STM) and atomic force microscopy (AFM), or, in general, two-dimensional multi-channel data acquisition instruments. The Gxsm core can handle different data types, e.g., integer and floating point numbers. An easily extendable plug-in architecture provides many image analysis and manipulation functions. A digital signal processor (DSP) subsystem runs the feedback loop, generates the scanning signals and acquires the data during SPM measurements. The programmable Gxsm vector probe engine performs virtually any conceivable spectroscopy and manipulation task, such as scanning tunneling spectroscopy (STS) or tip formation. The Gxsm software is released under the GNU General Public License (GPL) and can be obtained via the Internet.

  8. A Relative Comparison of Leading Supply Chain Management Software Packages

    OpenAIRE

    Zhongxian Wang; Ruiliang Yan; Kimberly Hollister; Ruben Xing

    2009-01-01

    Supply Chain Management (SCM) has proven to be an effective tool that aids companies in the development of competitive advantages. SCM Systems are relied on to manage warehouses, transportation, trade logistics and various other issues concerning the coordinated movement of products and services from suppliers to customers. Although in today’s fast paced business environment, numerous supply chain solution tools are readily available to companies, choosing the right SCM software is not an e...

  9. The Caviar software package for the astrometric reduction of Cassini ISS images: description and examples

    Science.gov (United States)

    Cooper, N. J.; Lainey, V.; Meunier, L.-E.; Murray, C. D.; Zhang, Q.-F.; Baillie, K.; Evans, M. W.; Thuillot, W.; Vienne, A.

    2018-02-01

    Aims: Caviar is a software package designed for the astrometric measurement of natural satellite positions in images taken using the Imaging Science Subsystem (ISS) of the Cassini spacecraft. Aspects of the structure, functionality, and use of the software are described, and examples are provided. The integrity of the software is demonstrated by generating new measurements of the positions of selected major satellites of Saturn, 2013-2016, along with their observed minus computed (O-C) residuals relative to published ephemerides. Methods: Satellite positions were estimated by fitting a model to the imaged limbs of the target satellites. Corrections to the nominal spacecraft pointing were computed using background star positions based on the UCAC5 and Tycho2 star catalogues. UCAC5 is currently used in preference to Gaia-DR1 because of the availability of proper motion information in UCAC5. Results: The Caviar package is available for free download. A total of 256 new astrometric observations of the Saturnian moons Mimas (44), Tethys (58), Dione (55), Rhea (33), Iapetus (63), and Hyperion (3) have been made, in addition to opportunistic detections of Pandora (20), Enceladus (4), Janus (2), and Helene (5), giving an overall total of 287 new detections. Mean observed-minus-computed residuals for the main moons relative to the JPL SAT375 ephemeris were - 0.66 ± 1.30 pixels in the line direction and 0.05 ± 1.47 pixels in the sample direction. Mean residuals relative to the IMCCE NOE-6-2015-MAIN-coorb2 ephemeris were -0.34 ± 0.91 pixels in the line direction and 0.15 ± 1.65 pixels in the sample direction. The reduced astrometric data are provided in the form of satellite positions for each image. The reference star positions are included in order to allow reprocessing at some later date using improved star catalogues, such as later releases of Gaia, without the need to re-estimate the imaged star positions. 
The Caviar software is available for free download from: ftp://ftp.imcce.fr/pub/softwares
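    The observed-minus-computed statistics quoted above are the mean and sample standard deviation of the pixel residuals, e.g.:

```python
import math

def residual_stats(observed, computed):
    """Mean and sample standard deviation of O-C (observed minus computed)
    residuals, as quoted per direction (line/sample) in pixels."""
    oc = [o - c for o, c in zip(observed, computed)]
    mean = sum(oc) / len(oc)
    sd = math.sqrt(sum((r - mean) ** 2 for r in oc) / (len(oc) - 1))
    return mean, sd
```
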

  10. StreamThermal: A software package for calculating thermal metrics from stream temperature data

    Science.gov (United States)

    Tsang, Yin-Phan; Infante, Dana M.; Stewart, Jana S.; Wang, Lizhu; Tingly, Ralph; Thornbrugh, Darren; Cooper, Arthur; Wesley, Daniel

    2016-01-01

    Improved quality and availability of continuous stream temperature data allow natural resource managers, particularly in fisheries, to understand associations between different characteristics of stream thermal regimes and stream fishes. However, there has been no convenient tool for efficiently characterizing multiple metrics reflecting stream thermal regimes from the increasing amount of data. This article describes a software program packaged as a library in R to facilitate this process. With this freely available package, users can quickly summarize metrics that describe five categories of stream thermal regimes: magnitude, variability, frequency, timing, and rate of change. The installation and usage instructions for this package, the definitions of the calculated thermal metrics, and the output format of the package are described, along with an application showing its utility for multiple metrics. We believe this package can be widely utilized by interested stakeholders and will greatly assist further studies in fisheries.
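    As an illustration of the five categories (the actual package computes many more metrics than these), one simple summary per category from a series of daily mean temperatures might be:

```python
def thermal_metrics(daily_temps):
    """One illustrative metric per category: magnitude, variability,
    frequency, timing, and rate of change (the 20 C threshold is arbitrary,
    not a value from the StreamThermal package)."""
    n = len(daily_temps)
    magnitude = sum(daily_temps) / n                      # annual mean (C)
    variability = max(daily_temps) - min(daily_temps)     # annual range (C)
    frequency = sum(t > 20.0 for t in daily_temps)        # days above 20 C
    timing = daily_temps.index(max(daily_temps))          # day index of maximum
    rate = max(abs(daily_temps[i + 1] - daily_temps[i])   # max day-to-day change
               for i in range(n - 1))
    return magnitude, variability, frequency, timing, rate
```
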

  11. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  12. The design of graphical interface software package in safeguards NDA (non-destructive nuclear analysis) system

    International Nuclear Information System (INIS)

    Tan Yajun

    1993-01-01

    The general method of graphical interface design is analysed, and design techniques for several subroutines (text characters, multi-layer pull-down menus, multiple windows, etc.) are put forward. Using an actual package employed in a safeguards NDA (non-destructive nuclear analysis) system, it is shown that the software system surpasses current international packages in interface friendliness, dynamic spectrum display and plotting. The design algorithms are suitable not only for general nuclear spectrum acquisition and analysis systems, but also for general microcomputer graphical interface design.

  13. Vertical bone measurements from cone beam computed tomography images using different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

    This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistical significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  14. PKgraph: an R package for graphically diagnosing population pharmacokinetic models.

    Science.gov (United States)

    Sun, Xiaoyong; Wu, Kai; Cook, Dianne

    2011-12-01

    Population pharmacokinetic (PopPK) modeling has become increasingly important in drug development because it handles unbalanced designs, sparse data and the study of individual variation. However, the increased complexity of the models makes it more of a challenge to diagnose their fit. Graphics can play an important and unique role in PopPK model diagnostics. The software described in this paper, PKgraph, provides a graphical user interface for PopPK model diagnosis. It also provides an integrated and comprehensive platform for the analysis of pharmacokinetic data, including exploratory data analysis, goodness of model fit, model validation and model comparison. Results from a variety of model fitting software, including NONMEM, Monolix, SAS and R, can be used. PKgraph is programmed in R, and uses the R packages lattice and ggplot2 for static graphics, and rggobi for interactive graphics. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  15. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis

    Directory of Open Access Journals (Sweden)

    Léo Botton-Divet

    2015-11-01

    Full Text Available The challenging complexity of biological structures has led to the development of several methods for the quantitative analysis of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages (‘Edgewarp’ and ‘Morpho’) for the same sliding task, and investigate potential differences in the results and biological interpretation. ‘Morpho’ is much faster than ‘Edgewarp’, notably as a result of the greater computational power of the ‘Morpho’ software routines and the complexity of the ‘Edgewarp’ workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and the time needed to perform the analyses.

  16. In-field inspection support software: A status report on the Common Inspection On-site Software Package (CIOSP) project

    International Nuclear Information System (INIS)

    Novatchev, Dimitre; Titov, Pavel; Siradjov, Bakhtiiar; Vlad, Ioan; Xiao Jing

    2001-01-01

    Full text: IAEA has invested much thought and effort into developing software that can assist inspectors during their inspection work. Experience with such applications has been steadily growing, and the IAEA has recently commissioned a next-generation software package. Such software must accommodate inspection tasks that vary substantially depending on the type of installation being inspected, while ensuring that the resulting package remains widely usable and precludes excessive development of plant-specific applications. The Common Inspection On-site Software Package is being developed in the Department of Safeguards to address the limitations of the existing software and to expand its coverage of the inspection process. CIOSP is 'common' in that it is aimed at providing support for as many facilities as possible with minimum re-configuration, while catering to the varying needs of individual facilities and the different instrumentation and verification methods used. A component-based approach was taken to successfully tackle the challenges that the development of this software presented. CIOSP consists of the following major components: A framework into which individual plug-ins supporting various inspection activities can integrate at run-time; A central data store containing all facility configuration data and all data collected during inspections; A local data store, which resides on the inspector's computer, where the current inspection's data is stored; A set of services used by all plug-ins (i.e. data transformation, authentication, replication services etc.). This architecture allows for incremental development and extension of the software with plug-ins that support individual inspection activities. The core set of components along with the framework, the Inventory Verification, Book Examination and Records and Reports Comparison plug-ins have been developed. The development of the Short Notice Random

  17. PALSfit3: A software package for analysing positron lifetime spectra

    DEFF Research Database (Denmark)

    Kirkegaard, Peter; Olsen, Jens V.; Eldrup, Morten Mostgaard

    The present report describes a Windows based computer program called PALSfit3. The purpose of the program is to carry out analyses of spectra that have been measured by positron annihilation lifetime spectroscopy (PALS). PALSfit3 is based on the well tested PATFIT and PALSfit programs, which have ... in a text window. PALSfit3 is verified on Windows XP and Windows 7, 8 and 10. The PALSfit3 software can be acquired from the Technical University of Denmark (http://PALSfit.dk).

  18. The SETI Interpreter Program (SIP). a Software Package for the SETI Field Tests

    Science.gov (United States)

    Olsen, E. T.; Lokshin, A.

    1983-01-01

    The SETI (Search for Extraterrestrial Intelligence) Interpreter Program (SIP) is an interactive software package designed to allow flexible off-line processing of the SETI field test data on a PDP 11/44 computer. The user can write and immediately execute complex analysis programs using the compact SIP command language. The software utilized by the SETI Interpreter Program consists of FORTRAN-coded modules that are sequentially installed and executed.

  19. Implementation of a combined association-linkage model for quantitative traits in linear mixed model procedures of statistical packages

    NARCIS (Netherlands)

    Beem, A. Leo; Boomsma, Dorret I.

    2006-01-01

    A transmission disequilibrium test for quantitative traits which combines association and linkage analyses is currently available in several dedicated software packages. We describe how to implement such models in linear mixed model procedures that are available in widely used statistical packages.

  20. A Quantitative Software Risk Assessment Model

    Science.gov (United States)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  1. Comparison of four software packages for CT lung volumetry in healthy individuals

    Energy Technology Data Exchange (ETDEWEB)

    Nemec, Stefan F. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Molinari, Francesco [Centre Hospitalier Regional Universitaire de Lille, Department of Radiology, Lille (France); Dufresne, Valerie [CHU de Charleroi - Hopital Vesale, Pneumologie, Montigny-le-Tilleul (Belgium); Gosset, Natacha [CHU Tivoli, Service d' Imagerie Medicale, La Louviere (Belgium); Silva, Mario; Bankier, Alexander A. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States)

    2015-06-01

    To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. (orig.)
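
    The study's benchmarking logic, comparing inter-package volume differences against an acceptability threshold, can be sketched as follows; the volumes and the threshold below are illustrative assumptions, not values from the paper.

```python
# Sketch: pairwise lung-volume differences between software packages,
# benchmarked against an assumed acceptability threshold (not the
# study's spirometry criterion).
from itertools import combinations

def pairwise_differences(volumes_ml):
    """Absolute (ml) and percentage differences between packages."""
    diffs = {}
    for (name_a, va), (name_b, vb) in combinations(volumes_ml.items(), 2):
        abs_diff = abs(va - vb)
        pct_diff = 100.0 * abs_diff / ((va + vb) / 2.0)
        diffs[(name_a, name_b)] = (abs_diff, pct_diff)
    return diffs

# Hypothetical TLC measurements (ml) from four packages for one subject.
volumes = {"pkgA": 6100.0, "pkgB": 6112.0, "pkgC": 6290.0, "pkgD": 6150.0}
THRESHOLD_ML = 150.0  # assumed threshold for acceptable variability
flagged = {pair: d for pair, d in pairwise_differences(volumes).items()
           if d[0] > THRESHOLD_ML}
```

    As in the study, statistical significance and absolute difference are judged separately: a pair can differ significantly yet fall under the threshold.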

  2. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high resolution AFM imaging. A high yield of DNA dolphins...

  3. PyPedal, an open source software package for pedigree analysis

    Science.gov (United States)

    The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...

  4. A SaTScan™ macro accessory for cartography (SMAC) package implemented with SAS® software

    Directory of Open Access Journals (Sweden)

    Kleinman Ken P

    2007-03-01

    Full Text Available Abstract Background SaTScan is a software program written to implement the scan statistic; it can be used to find clusters in space and/or time. It must often be run multiple times per day when doing disease surveillance. Running SaTScan frequently via its graphical user interface can be cumbersome, and the output can be difficult to visualize. Results The SaTScan Macro Accessory for Cartography (SMAC) package consists of four SAS macros and was designed as an easier way to run SaTScan multiple times and add graphical output. The package contains individual macros which allow the user to make the necessary input files for SaTScan, run SaTScan, and create graphical output all from within SAS software. The macros can also be combined to do this all in one step. Conclusion The SMAC package can make SaTScan easier to use and can make the output more informative.

  5. Software verification and validation for commercial statistical packages utilized by the statistical consulting section of SRTC

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.B.

    2000-03-22

    The purpose of this report is to provide software verification and validation for the statistical packages used by the Statistical Consulting Section (SCS) of the Savannah River Technology Center. The need for this verification and validation stems from the requirements of the Quality Assurance programs that are frequently applicable to the work conducted by SCS. The IBM Personal Computer 300PL and 300XL are both Pentium II based desktops; therefore, the software verification and validation in this report is valid interchangeably between both platforms. As new computing platforms, statistical packages, or revisions to existing packages are introduced, this report is to be revised to address their verification and validation.

  6. A SaTScan macro accessory for cartography (SMAC) package implemented with SAS software.

    Science.gov (United States)

    Abrams, Allyson M; Kleinman, Ken P

    2007-03-06

    SaTScan is a software program written to implement the scan statistic; it can be used to find clusters in space and/or time. It must often be run multiple times per day when doing disease surveillance. Running SaTScan frequently via its graphical user interface can be cumbersome, and the output can be difficult to visualize. The SaTScan Macro Accessory for Cartography (SMAC) package consists of four SAS macros and was designed as an easier way to run SaTScan multiple times and add graphical output. The package contains individual macros which allow the user to make the necessary input files for SaTScan, run SaTScan, and create graphical output all from within SAS software. The macros can also be combined to do this all in one step. The SMAC package can make SaTScan easier to use and can make the output more informative.
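
    The macro workflow described above (build the SaTScan input files, run SaTScan, collect the output) can be sketched outside SAS as well; the whitespace-delimited case-file layout and the batch executable name below are assumptions for illustration, not taken from the paper.

```python
# Sketch: build a SaTScan-style case file and run the CLI per analysis,
# mirroring SMAC's make-inputs / run / collect-output steps. The
# "<loc id> <cases> <date>" layout and the "SaTScanBatch" executable
# name are assumptions for illustration.
import subprocess

def case_file_lines(records):
    """records: iterable of (location_id, case_count, date) tuples."""
    return [f"{loc} {cases} {date}" for loc, cases, date in records]

def run_satscan(prm_path, satscan_exe="SaTScanBatch"):
    """Run one SaTScan analysis from a parameter file (hypothetical)."""
    return subprocess.run([satscan_exe, prm_path], capture_output=True)

lines = case_file_lines([("2130", 3, "2007/03/01"),
                         ("2131", 1, "2007/03/01")])
```

    Looping `run_satscan` over a list of parameter files reproduces the "run SaTScan multiple times per day" pattern the macros automate.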

  7. INSPECT: A graphical user interface software package for IDARC-2D

    Science.gov (United States)

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern day Performance-Based Earthquake Engineering (PBEE) pivots about nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in the DOS/Unix systems and requires elaborate text-based input files creation by the user. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  8. INSPECT: A graphical user interface software package for IDARC-2D

    Directory of Open Access Journals (Sweden)

    Mohammad AlHamaydeh

    2016-01-01

    Full Text Available Modern day Performance-Based Earthquake Engineering (PBEE) pivots about nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in the DOS/Unix systems and requires elaborate text-based input files creation by the user. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  9. MEASURE/ANOMTEST. Anomaly detection software package for the Dodewaard power plant facility. Supplement 1. Extension of measurement analysis part, addition of plot package

    International Nuclear Information System (INIS)

    Schoonewelle, H.

    1995-01-01

    The anomaly detection software package installed at the Dodewaard nuclear power plant has been revised with respect to the measurement analysis part, and a plot package has been added. Signals in which an anomaly has been detected are automatically plotted, including the uncertainty margins of the signals. This report describes the revised measurement analysis part and the plot package. Each new routine of the plot package is described briefly, and the new input and output files are given. (orig.)
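
    The plotting of detected anomalies together with uncertainty margins rests on a simple band check, sketched here with illustrative values; the package's actual detection algorithm is not described in this abstract.

```python
# Sketch: flag samples lying outside an uncertainty band around a
# signal's expected value; the signal and the 3-sigma margin are
# illustrative, not the plant's real detection rule.
def outside_margins(signal, mean, sigma, n_sigma=3.0):
    """Return indices of samples outside mean +/- n_sigma * sigma."""
    lo, hi = mean - n_sigma * sigma, mean + n_sigma * sigma
    return [i for i, x in enumerate(signal) if not (lo <= x <= hi)]

signal = [0.1, -0.2, 0.0, 2.5, 0.1]
anomalous_indices = outside_margins(signal, mean=0.0, sigma=0.2)
```

    A plotting routine would then draw the signal, the `lo`/`hi` band, and markers at the flagged indices.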

  10. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and in contrast to model-driven approaches there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  11. DFI Computer Modeling Software (CMS)

    Energy Technology Data Exchange (ETDEWEB)

    Cazalet, E.G.; Deziel, L.B. Jr.; Haas, S.M.; Martin, T.W.; Nesbitt, D.M.; Phillips, R.L.

    1979-10-01

    The data base management system used to create, edit and store model data and solutions for the LEAP system is described. The software is written entirely in FORTRAN-G for the IBM 370 series of computers and provides an interface to a commercial data base system, SYSTEM-2000.

  12. Motorola Secure Software Development Model

    Directory of Open Access Journals (Sweden)

    Francis Mahendran

    2008-08-01

    Full Text Available In today's world, the key to meeting the demand for improved security is to implement repeatable processes that reliably deliver measurably improved security. While many organizations have announced efforts to institutionalize a secure software development process, there is little or no industry acceptance for a common process improvement framework for secure software development. Motorola has taken the initiative to develop such a framework, and plans to share this with the Software Engineering Institute for possible inclusion into its Capability Maturity Model Integration (CMMI®). This paper will go into the details of how Motorola is addressing this issue. The model that is being developed is designed as an extension of the existing CMMI structure. The assumption is that the audience will have a basic understanding of the SEI CMM® / CMMI® process framework. The paper will not describe implementation details of a security process model or improvement framework, but will address WHAT security practices are required for a company with many organizations operating at different maturity levels. It is left to the implementing organization to answer the HOW, WHEN, WHO and WHERE aspects. The paper will discuss how the model is being implemented in the Motorola Software Group.

  13. Calculation of chemical equilibrium between aqueous solution and minerals: the EQ3/6 software package

    International Nuclear Information System (INIS)

    Wolery, T.J.

    1979-01-01

    The newly developed EQ3/6 software package computes equilibrium models of aqueous geochemical systems. The package contains two principal programs: EQ3 performs distribution-of-species calculations for natural water compositions; EQ6 uses the results of EQ3 to predict the consequences of heating and cooling aqueous solutions and of irreversible reaction in rock-water systems. The programs are valuable for studying such phenomena as the formation of ore bodies, scaling and plugging in geothermal development, and the long-term disposal of nuclear waste. EQ3 and EQ6 are compared with such well-known geochemical codes as SOLMNEQ, WATEQ, REDEQL, MINEQL, and PATHI. The data base allows calculations in the temperature interval 0 to 350°C, at either 1 atm-steam saturation pressures or a constant 500 bars. The activity coefficient approximations for aqueous solutes limit modeling to solutions of ionic strength less than about one molal. The mathematical derivations and numerical techniques used in EQ6 are presented in detail. The program uses the Newton-Raphson method to solve the governing equations of chemical equilibrium for a system of specified elemental composition at fixed temperature and pressure. Convergence is aided by optimizing starting estimates and by under-relaxation techniques. The minerals present in the stable phase assemblage are found by several empirical methods. Reaction path models may be generated by using this approach in conjunction with finite differences. This method is analogous to applying high-order predictor-corrector methods to integrate a corresponding set of ordinary differential equations, but avoids propagation of error (drift). 8 figures, 9 tables
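
    The Newton-Raphson approach used by EQ6 can be illustrated on a single equilibrium equation; the weak-acid example below, with illustrative Ka and concentration, is a minimal stand-in for the full multi-component speciation system.

```python
# Sketch: Newton-Raphson solution of one equilibrium equation, echoing
# EQ6's use of the method for its governing system; Ka and C are
# illustrative (roughly acetic acid at 0.1 molal), not EQ3/6 data.
def newton(f, dfdx, x0, tol=1e-12, max_iter=50):
    """Classic Newton iteration with a step-size convergence test."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# HA <=> H+ + A-   with   Ka = x*x / (C - x),   x = [H+]
Ka, C = 1.8e-5, 0.1
f = lambda x: x * x - Ka * (C - x)
dfdx = lambda x: 2.0 * x + Ka
h = newton(f, dfdx, x0=1e-3)  # a good starting estimate aids convergence
```

    As the abstract notes, EQ6 improves robustness over this bare iteration by optimizing the starting estimates and applying under-relaxation.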

  14. Accuracy of Giovanni and Marksim Software Packages for ...

    African Journals Online (AJOL)

    Agricultural adaptation to climate change requires accurate, unbiased, and reliable climate data. Availability of observed climatic data is limited because of inadequate weather stations. Rainfall simulation models are important tools for generating rainfall data in areas with limited or no observed data. Various weather ...

  15. Economic tour package model using heuristic

    Science.gov (United States)

    Rahman, Syariza Abdul; Benjamin, Aida Mauziah; Bakar, Engku Muhammad Nazri Engku Abu

    2014-07-01

    A tour-package is a prearranged tour that includes products and services such as food, activities, accommodation, and transportation, which are sold at a single price. Since competitiveness within the tourism industry is very high, many tour agents try to provide attractive tour-packages to meet tourist satisfaction as much as possible. Among the criteria considered by tourists are the number of places to be visited and the cost of the tour-package. Previous studies indicate that tourists tend to choose economical tour-packages and aim to visit as many places as they can cover. Thus, this study proposes a tour-package model using a heuristic approach. The aim is to find economical tour-packages and at the same time to propose as many places as possible to be visited by tourists in a given geographical area, particularly in Langkawi Island. The proposed model considers only one starting point, where the tour starts and ends at an identified hotel. This study covers 31 of the most attractive places in Langkawi Island from various categories of tourist attractions. Besides, the allocation of periods for lunch and dinner is included in the proposed itineraries, covering 11 popular restaurants around Langkawi Island. In developing the itinerary, the proposed heuristic approach considers a time window for each site (hotel/restaurant/place) so that it represents real-world implementation. We present three itineraries with different time constraints (1-day, 2-day and 3-day tour-packages). The aim of the economic model is to minimize the tour-package cost as much as possible by considering the entrance fee of each visited place. We compare the proposed model with the uneconomic model from our previous study. The uneconomic model has no limitation on cost, with the aim of maximizing the number of places to be visited. Comparison between the uneconomic and economic itineraries has shown that the proposed model has successfully achieved the objective that
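
    A minimal sketch of a greedy cost-minimizing heuristic in the spirit of the model described above; the places, fees, durations and time budget are hypothetical, and the actual heuristic with per-site time windows and meal slots is more elaborate.

```python
# Sketch: greedily add the cheapest attractions that still fit a daily
# time budget, one toy step toward an "economical" itinerary. All
# place names, fees and durations are hypothetical.
def build_tour(places, time_budget_h):
    """places: (name, entrance_fee, visit_hours); returns (tour, cost)."""
    tour, cost, used = [], 0.0, 0.0
    for name, fee, visit_h in sorted(places, key=lambda p: p[1]):
        if used + visit_h <= time_budget_h:
            tour.append(name)
            cost += fee
            used += visit_h
    return tour, cost

places = [  # (name, entrance fee, visit duration in hours)
    ("Beach", 0.0, 2.0),
    ("Cable Car", 15.0, 3.0),
    ("Wildlife Park", 10.0, 2.5),
    ("Craft Complex", 0.0, 1.5),
]
tour, cost = build_tour(places, time_budget_h=8.0)
```

    The greedy fee ordering favors free sites first, mirroring the economic objective; the uneconomic variant would instead sort by visit duration to pack in as many places as possible.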

  16. DFBIdb: a software package for neuroimaging data management.

    Science.gov (United States)

    Adamson, Christopher L; Wood, Amanda G

    2010-12-01

    We present DFBIdb: a suite of tools for efficient management of neuroimaging project data. Specifically, DFBIdb was designed to allow users to quickly perform routine management tasks of sorting, archiving, exploring, exporting and organising raw data. DFBIdb was implemented as a collection of Python scripts that maintain a project-based, centralised database that is based on the XCEDE 2 data model. Project data is imported from a filesystem hierarchy of raw files, which is an often-used convention of imaging devices, using a single script that catalogues meta-data into a modified XCEDE 2 data model. During the import process data are reversibly anonymised, archived and compressed. The import script was designed to support multiple file formats and features an extensible framework that can be adapted to novel file formats. An ACL-based security model, with accompanying graphical management tools, was implemented to provide a straightforward method to restrict access to raw and meta-data. Graphical user interfaces are provided for data exploration. DFBIdb includes facilities to export, convert and organise customisable subsets of project data according to user-specified criteria. The command-line interface was implemented to allow users to incorporate database commands into more complex scripts that may be utilised to automate data management tasks. By using DFBIdb, neuroimaging laboratories will be able to perform routine data management tasks in an efficient manner.

  17. Analysing the Zenith Tropospheric Delay Estimates in On-line Precise Point Positioning (PPP) Services and PPP Software Packages.

    Science.gov (United States)

    Mendez Astudillo, Jorge; Lau, Lawrence; Tang, Yu-Ting; Moore, Terry

    2018-02-14

    As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. Estimated ZTD can be very useful for meteorological applications; an example is the estimation of water vapor content in the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also provides GNSS users and researchers insight into the processing-algorithm dependence and its impact on PPP ZTD estimation. Observation data of eight whole days from a total of nine International GNSS Service (IGS) tracking stations spread over the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product of the same days. The estimates of two of the three online PPP services show good agreement with the IGS product, and the online PPP services perform better than the selected PPP software packages at all stations.
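
    The comparison of PPP ZTD estimates against the IGS tropospheric product amounts to computing residual statistics per station; a sketch with illustrative numbers, not values from the paper:

```python
# Sketch: bias and RMS of one service's ZTD estimates against the IGS
# tropospheric product; the millimetre values are illustrative only.
import math

def bias_and_rms(estimates, reference):
    """Mean and root-mean-square of the residuals estimate - reference."""
    residuals = [e - r for e, r in zip(estimates, reference)]
    bias = sum(residuals) / len(residuals)
    rms = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    return bias, rms

igs_ztd = [2412.0, 2415.0, 2409.0, 2420.0]   # reference ZTD (mm)
ppp_ztd = [2414.0, 2413.0, 2411.0, 2423.0]   # a PPP service's estimates
bias, rms = bias_and_rms(ppp_ztd, igs_ztd)
```

    Repeating this per station and per service yields the kind of accuracy comparison the study reports.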

  18. Groundwater movement simulation by the software package PM5 for the Sviyaga river adjoining territory in the Republic of Tatarstan

    Science.gov (United States)

    Kosterina, E. A.; Isagadzhieva, Z. Sh

    2018-01-01

    Data from ecological-hydrogeological fieldwork in the Predvolzhye region of the Republic of Tatarstan were analyzed. A geofiltration model of the Buinsk region area near the village of Stary Studenets in the territory of the Republic of Tatarstan was constructed with the PM5 software package. The model can be developed into a basis for estimating the groundwater reserves of the territory, modeling the operation of water intake wells, designing the location of water intake wells, evaluating their operational capabilities, and constructing sanitary protection zones.

  19. Dynamic modelling and PID loop control of an oil-injected screw compressor package

    Science.gov (United States)

    Poli, G. W.; Milligan, W. J.; McKenna, P.

    2017-08-01

    A significant amount of time is spent tuning the PID (Proportional, Integral and Derivative) control loops of a screw compressor package due to the unique characteristics of the system. Common mistakes incurred during the tuning of a PID control loop include improper PID algorithm selection and unsuitable tuning parameters of the system resulting in erratic and inefficient operation. This paper details the design and development of software that aims to dynamically model the operation of a single stage oil injected screw compressor package deployed in upstream oil and gas applications. The developed software will be used to assess and accurately tune PID control loops present on the screw compressor package employed in controlling the oil pressures, temperatures and gas pressures, in a bid to improve control of the operation of the screw compressor package. Other applications of the modelling software will include its use as an evaluation tool that can estimate compressor package performance during start up, shutdown and emergency shutdown processes. The paper first details the study into the fundamental operational characteristics of each of the components present on the API 619 screw compressor package and then discusses the creation of a dynamic screw compressor model within the MATLAB/Simulink software suite. The paper concludes by verifying and assessing the accuracy of the created compressor model using data collected from physical screw compressor packages.
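
    The behaviour of a PID loop on a simple process model can be sketched as below; the first-order plant and the gains are illustrative stand-ins for the compressor package's oil-pressure, temperature and gas-pressure loops, not values from the paper.

```python
# Sketch: a discrete PID loop driving a first-order plant, a toy
# stand-in for tuning one of the package's control loops; the plant
# time constant and the gains are illustrative.
def simulate_pid(kp, ki, kd, setpoint, steps=400, dt=0.05, tau=1.0):
    """Simulate PID control of dpv/dt = (u - pv)/tau; return final pv."""
    pv, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - pv
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # PID output
        pv += dt * (u - pv) / tau                   # explicit Euler step
        prev_err = err
    return pv

final = simulate_pid(kp=2.0, ki=1.0, kd=0.1, setpoint=5.0)
```

    Sweeping the gains over such a model before touching the real package is exactly the kind of offline tuning aid the paper's dynamic model enables.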

  20. Establishing the Common Community Physics Package by Transitioning the GFS Physics to a Collaborative Software Framework

    Science.gov (United States)

    Xue, L.; Firl, G.; Zhang, M.; Jimenez, P. A.; Gill, D.; Carson, L.; Bernardet, L.; Brown, T.; Dudhia, J.; Nance, L. B.; Stark, D. R.

    2017-12-01

    The Global Model Test Bed (GMTB) has been established to support the evolution of atmospheric physical parameterizations in NCEP global modeling applications. To accelerate the transition to the Next Generation Global Prediction System (NGGPS), a collaborative model development framework known as the Common Community Physics Package (CCPP) is created within the GMTB to facilitate engagement from the broad community on physics experimentation and development. A key component of this Research to Operations (R2O) software framework is the Interoperable Physics Driver (IPD), which connects the physics parameterizations at one end with the dynamical cores at the other with minimum implementation effort. To initiate the CCPP, scientists and engineers from the GMTB separated and refactored the GFS physics. This exercise demonstrated the process of creating IPD-compliant code and can serve as an example for other physics schemes to do the same and be considered for inclusion into the CCPP. Further benefits of this process include run-time physics suite configuration and considerably reduced effort for testing modifications to physics suites through GMTB's physics test harness. The implementation will be described and the preliminary results will be presented at the conference.

  1. Generic domain models in software engineering

    Science.gov (United States)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  2. Opensource Software for MLR-Modelling of Solar Collectors

    DEFF Research Database (Denmark)

    Bacher, Peder; Perers, Bengt

    2011-01-01

    A first research version is now in operation of a software package for multiple linear regression (MLR) modeling and analysis of solar collectors according to ideas originating all the way from Walletun et al. (1986) and Perers (1987 and 1993). The tool has been implemented in the free and open source program R (http://www.r-project.org/). Applications of the software package include: visual validation, resampling and conversion of data, collector performance testing analysis according to the European Standard EN 12975 (Fischer et al., 2004), and statistical validation of results ... area flat plate collector with selective absorber and teflon anti-convection layer. The package is intended to enable fast and reliable validation of data, and provide a unified implementation for MLR testing of solar collectors. This will furthermore make it simple to replicate the calculations ...
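
    The MLR step itself can be sketched with a simplified EN 12975-style collector model fitted by least squares; the regressors, coefficients and synthetic data below are illustrative assumptions, not the package's full model.

```python
# Sketch: least-squares fit of a simplified collector model
#   P = eta0*G - a1*dT - a2*dT^2
# to synthetic data; true coefficients and noise level are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
g = rng.uniform(200.0, 1000.0, n)     # solar irradiance [W/m2]
dT = rng.uniform(0.0, 60.0, n)        # mean fluid minus ambient temp [K]
power = 0.78 * g - 3.5 * dT - 0.015 * dT**2 + rng.normal(0.0, 5.0, n)

# Column signs chosen so the fitted parameters come out positive.
X = np.column_stack([g, -dT, -dT**2])
coef, *_ = np.linalg.lstsq(X, power, rcond=None)
eta0, a1, a2 = coef                    # recovered model parameters
```

    With clean data the fit recovers the zero-loss efficiency and the two heat-loss coefficients; the package adds the validation and resampling steps around this core regression.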

  3. ACEMAN (II): a PDP-11 software package for acoustic emission analysis

    International Nuclear Information System (INIS)

    Tobias, A.

    1976-01-01

    A powerful but easy-to-use software package (ACEMAN) for acoustic emission analysis has been developed at Berkeley Nuclear Laboratories. The system is based on a PDP-11 minicomputer with 24K of memory, an RK05 disk drive and a Tektronix 4010 graphics terminal. The operation of the system is described in detail in terms of the functions performed in response to the various command mnemonics. The ACEMAN software package offers many useful facilities not found on other acoustic emission monitoring systems. Its main features, many of which are unique, are summarised. The ACEMAN system automatically handles arrays of up to 12 sensors in real-time operation during which data are acquired, analysed, stored on the computer disk for future analysis and displayed on the terminal if required. (author)

  4. The GeoSteiner software package for computing Steiner trees in the plane

    DEFF Research Database (Denmark)

    Juhl, Daniel; Warme, David M.; Winter, Pawel

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner approach. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances from the 2000-study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base, and the commercial GeoSteiner 4.0 code base.

  5. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present WannierTools, an open-source software package for the investigation of novel topological materials. The code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can compute the surface-state spectrum that is probed in angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
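    The Berry phase around a closed loop can be evaluated numerically as the phase of a product of link overlaps between neighbouring eigenstates, a discretization that is gauge invariant once the loop is closed. A minimal two-band sketch (not WannierTools code): a spin-1/2 Hamiltonian H = B·sigma with B tracing the equator of the unit sphere encloses a solid angle of 2*pi, so the lower band picks up a Berry phase of magnitude pi.

```python
import numpy as np

# Discrete Berry phase of the lower band of H(phi) = B(phi)·sigma,
# with B sweeping the equator of the unit sphere (solid angle 2*pi).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def lower_state(phi, theta=np.pi / 2):
    B = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])
    H = B[0] * sx + B[1] * sy + B[2] * sz
    vals, vecs = np.linalg.eigh(H)
    return vecs[:, 0]                       # lowest-eigenvalue eigenvector

phis = np.linspace(0.0, 2 * np.pi, 201)[:-1]   # loop points, endpoint dropped
states = [lower_state(p) for p in phis]
states.append(states[0])                        # close the loop explicitly

# Gauge-invariant Wilson-loop product of link overlaps <u_j|u_{j+1}>
prod = 1.0 + 0.0j
for a, b in zip(states[:-1], states[1:]):
    prod *= np.vdot(a, b)
berry_phase = -np.angle(prod)
print(abs(berry_phase))                         # close to pi
```

Because each intermediate state enters once as a bra and once as a ket, the arbitrary phases returned by the eigensolver cancel in the product.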

  6. The GeoSteiner software package for computing Steiner trees in the plane

    DEFF Research Database (Denmark)

    Juhl, Daniel; Warme, David M.; Winter, Pawel

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner approach, allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances from the 2000-study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base, and the commercial GeoSteiner 4.0 code base.
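    The elementary building block of a Euclidean Steiner tree is the single Steiner point for three terminals: when all angles of the terminal triangle are below 120 degrees, it is the Fermat point minimizing the total distance to the terminals. GeoSteiner itself uses exact full-Steiner-tree generation and concatenation; the sketch below only illustrates the geometric primitive, using Weiszfeld iteration:

```python
import numpy as np

# Fermat (Steiner) point of three terminals via Weiszfeld iteration.
# For an equilateral triangle the optimum is the centroid, and the
# minimal total length for unit side is sqrt(3).
terminals = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])

p = np.array([0.2, 0.1])              # arbitrary starting point
for _ in range(200):
    d = np.linalg.norm(terminals - p, axis=1)
    w = 1.0 / np.maximum(d, 1e-12)    # inverse-distance weights
    p = (terminals * w[:, None]).sum(axis=0) / w.sum()

total = np.linalg.norm(terminals - p, axis=1).sum()
print(p.round(4), round(total, 4))    # centroid, total length sqrt(3)
```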

  7. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
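    The millisecond timing precision mentioned above rests on timestamping stimulus onset and response with a monotonic high-resolution clock. A minimal Python illustration of the same idea (the function name and the simulated response are ours, not part of PsyToolkit's API):

```python
import time

# Measure a reaction time by timestamping stimulus onset and response
# with a monotonic high-resolution clock.
def reaction_time_ms(respond):
    onset = time.perf_counter()
    respond()                      # in a real experiment: wait for a keypress
    return (time.perf_counter() - onset) * 1000.0

rt = reaction_time_ms(lambda: time.sleep(0.05))  # simulated ~50 ms response
print(round(rt))                                 # roughly 50
```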

  8. Pre- and post-processing of data for simulation of gyrotrons by the GYREOSS software package

    Energy Technology Data Exchange (ETDEWEB)

    Damyanova, M; Sabchevski, S [Academician Emil Djakov Institute of Electronics, Bulgarian Academy of Sciences, 72 Tzarigradso Schaussee Blvd., BG-1784 Sofia (Bulgaria); Zhelyazkov, I, E-mail: sabchevski@yahoo.co [Faculty of Physics, Sofia University, 5 James Bourchier Blvd., BG-1164 Sofia (Bulgaria)

    2010-01-01

    In this paper, we present the concept of the pre- and post-processing of data for simulation of gyrotrons by the GYREOSS software package, which is being developed now. It is based on the utilization of an appropriate code for description and input of the geometry of the tube in three dimensions, for discretization of the computational region by unstructured tetrahedral mesh and for visualization and manipulation of the results of numerical experiments.

  9. Quantitation of magnetic resonance spectroscopy signals: the jMRUI software package

    Czech Academy of Sciences Publication Activity Database

    Stefan, D.; Di Cesare, F.; Andrasescu, A.; Popa, E.; Lazariev, A.; Vescovo, E.; Štrbák, Oliver; Williams, S.; Starčuk jr., Zenon; Cabanas, M.; van Ormondt, D.; Graveron-Demilly, D.

    2009-01-01

    Vol. 20, No. 10 (2009), 104035:1-9. ISSN 0957-0233. Grant (other): EC FP6 MRTN-CT-2006-035801. Source of funding: EC framework project. Keywords: MR spectroscopy * MRS * MRSI * HRMAS-NMR * jMRUI software package * Java * plug-ins * quantitation. Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering. Impact factor: 1.317, year: 2009

  10. Crazyseismic: A MATLAB GUI‐Based Software Package for Passive Seismic Data Preprocessing

    OpenAIRE

    Yu, Chunquan; Zheng, Yingcai; Shang, Xuefeng

    2017-01-01

    We introduce an open‐source MATLAB software package, named Crazyseismic, for passive seismic data preprocessing. Built‐in core functions such as seismic phase travel‐time calculation and multichannel cross correlation significantly improve the efficiency of data processing. Compared with conventional command‐line‐style toolboxes, all functions in Crazyseismic are embedded in one single graphic user interface (GUI). The human–machine interactive nature of GUI facilitates data quality control. ...
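    The multichannel cross correlation mentioned above rests on a simple core operation: estimating the relative travel-time delay between two traces from the peak of their cross correlation. A numpy sketch with synthetic traces (not Crazyseismic code):

```python
import numpy as np

# Estimate the delay between two seismic traces from the peak of their
# cross correlation. trace_b is trace_a delayed by 7 samples; with a
# sampling interval of 0.01 s the recovered delay is 0.07 s.
rng = np.random.default_rng(0)
dt = 0.01                                   # sampling interval [s]
wavelet = rng.standard_normal(200)
trace_a = np.zeros(300); trace_a[50:250] = wavelet
trace_b = np.zeros(300); trace_b[57:257] = wavelet   # delayed copy

xc = np.correlate(trace_b, trace_a, mode="full")
lag = np.argmax(xc) - (len(trace_a) - 1)    # samples by which b lags a
print(round(lag * dt, 3))                   # 0.07
```

In a multichannel setting the same pairwise delays are measured for all channel pairs and reconciled, e.g. by least squares, to produce consistent relative travel times.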

  11. Implementation of the INSPECT software package for statistical calculation in nuclear material accountability

    International Nuclear Information System (INIS)

    Marzo, M.A.S.

    1986-01-01

    The INSPECT software package was developed at the Pacific Northwest Laboratory for statistical calculations in nuclear material accountability. The programs apply the inspection and evaluation methodology described in Part of the Safeguards Technical Manual. In this paper, the implementation of INSPECT at the Safeguards Division of CNEN and the main characteristics of INSPECT are described. Potential applications of INSPECT to nuclear material accountability are presented. (Author) [pt

  12. Deconvolution Estimation in Measurement Error Models: The R Package decon

    Science.gov (United States)

    Wang, Xiao-Feng; Wang, Bin

    2011-01-01

    Data from many scientific areas often come with measurement error. Density or distribution function estimation from contaminated data and nonparametric regression with errors-in-variables are two important topics in measurement error models. In this paper, we present a new software package decon for R, which contains a collection of functions that use the deconvolution kernel methods to deal with the measurement error problems. The functions allow the errors to be either homoscedastic or heteroscedastic. To make the deconvolution estimators computationally more efficient in R, we adapt the fast Fourier transform algorithm for density estimation with error-free data to the deconvolution kernel estimation. We discuss the practical selection of the smoothing parameter in deconvolution methods and illustrate the use of the package through both simulated and real examples. PMID:21614139
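    A short simulation makes the motivation for such deconvolution methods concrete: naively regressing on a noise-contaminated covariate w = x + u attenuates the slope by the reliability ratio var(x)/(var(x)+var(u)), which is why errors-in-variables need special treatment. This sketch demonstrates the attenuation only; it does not use the decon API:

```python
import numpy as np

# Attenuation bias from measurement error: with var(x) = var(u) = 1 the
# naive slope shrinks from the true 2.0 to 2.0 * 1/(1+1) = 1.0.
rng = np.random.default_rng(42)
n = 200_000
x = rng.normal(0.0, 1.0, n)          # true covariate
u = rng.normal(0.0, 1.0, n)          # measurement error
w = x + u                            # observed, contaminated covariate
y = 2.0 * x + rng.normal(0.0, 0.5, n)

naive_slope = np.cov(w, y)[0, 1] / np.var(w)
print(round(naive_slope, 2))         # about 1.0, not the true 2.0
```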

  13. Fast Estimation of Multinomial Logit Models: R Package mnlogit

    Directory of Open Access Journals (Sweden)

    Asad Hasan

    2016-11-01

    Full Text Available We present the R package mnlogit for estimating multinomial logistic regression models, particularly those involving a large number of categories and variables. Compared to existing software, mnlogit offers speedups of 10 - 50 times for modestly sized problems and more than 100 times for larger problems. Running in parallel mode on a multicore machine gives up to 4 times additional speedup on 8 processor cores. mnlogit achieves its computational efficiency by drastically speeding up computation of the log-likelihood function's Hessian matrix through exploiting structure in matrices that arise in intermediate calculations. This efficient exploitation of intermediate data structures allows mnlogit to utilize system memory much more efficiently, such that for most applications mnlogit requires less memory than comparable software by a factor that is proportional to the number of model categories.
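    The objective mnlogit optimizes is the multinomial logit log-likelihood, with one category serving as the base. A minimal dense-numpy version of that likelihood (mnlogit's speedups come from exploiting structure in the Hessian, which this sketch does not attempt):

```python
import numpy as np

# Multinomial logit log-likelihood with class 0 as the base category.
# beta has shape (n_features, n_classes - 1).
def mnl_loglik(beta, X, y):
    eta = np.hstack([np.zeros((X.shape[0], 1)), X @ beta])
    eta -= eta.max(axis=1, keepdims=True)          # numerical stability
    log_p = eta - np.log(np.exp(eta).sum(axis=1, keepdims=True))
    return log_p[np.arange(len(y)), y].sum()

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 3))
beta = rng.standard_normal((3, 2))
y = rng.integers(0, 3, 500)
print(mnl_loglik(beta, X, y) <= 0)   # True: a log-likelihood is never positive
```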

  14. Safety analysis report on Model UC-609 shipping package

    International Nuclear Information System (INIS)

    Sandberg, R.R.

    1977-08-01

    This Safety Analysis Report for Packaging demonstrates that model UC-609 shipping package can safely transport tritium in any of its forms. The package and its contents are described. The package when subjected to the transport conditions specified in the Code of Federal Regulations, Title 10, Part 71 is evaluated. Finally, compliance with these regulations is discussed

  15. Software Verification and Validation for Commercial Statistical Packages Utilized by the Statistical Consulting Section of SRTC

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.B.

    2001-01-16

    The purpose of this report is to provide software verification and validation (v and v) for the statistical packages utilized by the Statistical Consulting Section (SCS) of the Savannah River Technology Center (SRTC). The need for this v and v stems from the requirements of the Quality Assurance (QA) programs that are frequently applicable to the work conducted by SCS. This document is designed to comply with software QA requirements specified in the 1Q Manual Quality Assurance Procedure 20-1, Revision 6. Revision 1 of this QA plan adds JMP Version 4 to the family of (commercially-available) statistical tools utilized by SCS. JMP Version 3.2.2 is maintained as a support option due to features unique to this version of JMP that have not as yet been incorporated into Version 4. SCS documents that include JMP output should provide a clear indication of the version or versions of JMP that were used. The IBM Personal Computer 300PL and 300XL are both Pentium II based desktops. Therefore, th e software verification and validation in this report is valid interchangeably between both platforms. As new computing platforms, statistical packages, or revisions to existing packages are introduced into the Statistical Consulting Section, the appropriate problems from this report are to be re-evaluated, and this report is to be revised to address their verification and validation.

  16. A versatile software package for inter-subject correlation based analyses of fMRI.

    Science.gov (United States)

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In the inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in the corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
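    The core ISC computation described above can be sketched for a single voxel: correlate each subject's time series with every other subject's and average the pairwise coefficients. Synthetic data with a shared "stimulus-driven" signal plus subject-specific noise (a plain-numpy sketch, not the toolbox API):

```python
import numpy as np

# Inter-subject correlation for one voxel: mean pairwise Pearson
# correlation across subjects. With equal shared-signal and noise
# variance the expected pairwise correlation is 0.5.
rng = np.random.default_rng(7)
n_subj, n_time = 10, 240
shared = rng.standard_normal(n_time)                 # stimulus-driven signal
data = shared + rng.standard_normal((n_subj, n_time))

r = np.corrcoef(data)                  # subject-by-subject correlation matrix
iu = np.triu_indices(n_subj, k=1)
isc = r[iu].mean()                     # mean over all subject pairs
print(round(isc, 2))                   # near 0.5 for this noise level
```

The toolbox's re-sampling inference would then compare such ISC values against a null distribution built by permuting or circularly shifting the time series.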

  17. Models of the atomic nucleus. With interactive software

    International Nuclear Information System (INIS)

    Cook, N.D.

    2006-01-01

    This book-and-CD-software package supplies users with an interactive experience for nuclear visualization via a computer-graphical interface, similar in principle to the molecular visualizations already available in chemistry. Models of the Atomic Nucleus, a largely non-technical introduction to nuclear theory, explains the nucleus in a way that makes nuclear physics as comprehensible as chemistry or cell biology. The book/software supplements virtually any of the current textbooks in nuclear physics by providing a means for 3D visual display of the diverse models of nuclear structure. For the first time, an easy-to-master software for scientific visualization of the nucleus makes this notoriously "non-visual" field immediately "visible". After a review of the basics, the book explores and compares the competing models, and addresses how the lattice model best resolves remaining controversies. The appendix explains how to obtain the most from the software provided on the accompanying CD. (orig.)

  18. PhyloNet: a software package for analyzing and reconstructing reticulate evolutionary relationships

    Directory of Open Access Journals (Sweden)

    Nakhleh Luay

    2008-07-01

    Full Text Available Abstract Background Phylogenies, i.e., the evolutionary histories of groups of taxa, play a major role in representing the interrelationships among biological entities. Many software tools for reconstructing and evaluating such phylogenies have been proposed, almost all of which assume the underlying evolutionary history to be a tree. While trees give a satisfactory first-order approximation for many families of organisms, other families exhibit evolutionary mechanisms that cannot be represented by trees. Processes such as horizontal gene transfer (HGT), hybrid speciation, and interspecific recombination, collectively referred to as reticulate evolutionary events, result in networks, rather than trees, of relationships. Various software tools have been recently developed to analyze reticulate evolutionary relationships, which include SplitsTree4, LatTrans, EEEP, HorizStory, and T-REX. Results In this paper, we report on the PhyloNet software package, which is a suite of tools for analyzing reticulate evolutionary relationships, or evolutionary networks, which are rooted, directed, acyclic graphs, leaf-labeled by a set of taxa. These tools can be classified into four categories: (1) evolutionary network representation: reading/writing evolutionary networks in a newly devised compact form; (2) evolutionary network characterization: analyzing evolutionary networks in terms of three basic building blocks – trees, clusters, and tripartitions; (3) evolutionary network comparison: comparing two evolutionary networks in terms of topological dissimilarities, as well as fitness to sequence evolution under a maximum parsimony criterion; and (4) evolutionary network reconstruction: reconstructing an evolutionary network from a species tree and a set of gene trees. Conclusion The software package, PhyloNet, offers an array of utilities to allow for efficient and accurate analysis of evolutionary networks. The software package will help significantly in

  19. PhyloNet: a software package for analyzing and reconstructing reticulate evolutionary relationships.

    Science.gov (United States)

    Than, Cuong; Ruths, Derek; Nakhleh, Luay

    2008-07-28

    Phylogenies, i.e., the evolutionary histories of groups of taxa, play a major role in representing the interrelationships among biological entities. Many software tools for reconstructing and evaluating such phylogenies have been proposed, almost all of which assume the underlying evolutionary history to be a tree. While trees give a satisfactory first-order approximation for many families of organisms, other families exhibit evolutionary mechanisms that cannot be represented by trees. Processes such as horizontal gene transfer (HGT), hybrid speciation, and interspecific recombination, collectively referred to as reticulate evolutionary events, result in networks, rather than trees, of relationships. Various software tools have been recently developed to analyze reticulate evolutionary relationships, which include SplitsTree4, LatTrans, EEEP, HorizStory, and T-REX. In this paper, we report on the PhyloNet software package, which is a suite of tools for analyzing reticulate evolutionary relationships, or evolutionary networks, which are rooted, directed, acyclic graphs, leaf-labeled by a set of taxa. These tools can be classified into four categories: (1) evolutionary network representation: reading/writing evolutionary networks in a newly devised compact form; (2) evolutionary network characterization: analyzing evolutionary networks in terms of three basic building blocks - trees, clusters, and tripartitions; (3) evolutionary network comparison: comparing two evolutionary networks in terms of topological dissimilarities, as well as fitness to sequence evolution under a maximum parsimony criterion; and (4) evolutionary network reconstruction: reconstructing an evolutionary network from a species tree and a set of gene trees. The software package, PhyloNet, offers an array of utilities to allow for efficient and accurate analysis of evolutionary networks. The software package will help significantly in analyzing large data sets, as well as in studying the
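    The tree building blocks mentioned above come from the fact that an evolutionary network is a rooted DAG whose reticulation nodes have in-degree greater than one: keeping exactly one incoming edge per reticulation yields one of the trees displayed by the network. A toy sketch of that enumeration (our own data structure, not PhyloNet's):

```python
from itertools import product

# Toy network: root "r", leaves "x", "y", "z"; the reticulation node "h"
# has two parents, "a" and "b". Each choice of one incoming edge per
# reticulation node yields a displayed tree.
edges = [("r", "a"), ("r", "b"), ("a", "x"), ("b", "y"),
         ("a", "h"), ("b", "h"), ("h", "z")]

parents = {}                          # group incoming edges by child node
for u, v in edges:
    parents.setdefault(v, []).append((u, v))
retics = [v for v, es in parents.items() if len(es) > 1]

displayed = []
for choice in product(*(parents[v] for v in retics)):
    keep = set(choice)
    drop = {e for v in retics for e in parents[v]} - keep
    displayed.append(sorted(set(edges) - drop))

print(len(displayed))   # 2 displayed trees, one per parent of "h"
```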

  20. Modularized seismic full waveform inversion based on waveform sensitivity kernels - The software package ASKI

    Science.gov (United States)

    Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel

    2015-04-01

    We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different specific source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are held completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimizes the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D-unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process.
The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity
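    The least-squares model-update step described above can be written, in its simplest damped form, as m -> m + (J^T J + lam*I)^(-1) J^T r, where J holds the sensitivity kernels and r the data residuals. A toy-sized numpy sketch of that Gauss-Newton step (illustrative only, not ASKI code):

```python
import numpy as np

# Damped Gauss-Newton update on a toy linear problem: with noise-free
# data and a linear forward operator, the update recovers the true model.
rng = np.random.default_rng(3)
n_data, n_model = 40, 10
J = rng.standard_normal((n_data, n_model))   # sensitivity (Frechet) matrix
m_true = rng.standard_normal(n_model)
d = J @ m_true                               # noise-free synthetic data

m = np.zeros(n_model)                        # starting model
lam = 1e-8                                   # tiny damping for stability
for _ in range(3):
    r = d - J @ m                            # data residual
    m = m + np.linalg.solve(J.T @ J + lam * np.eye(n_model), J.T @ r)

print(np.allclose(m, m_true, atol=1e-5))     # True: linear problem
```

In a real waveform inversion J changes at each iteration as kernels are recomputed for the updated model, and the damping is replaced by physically motivated regularization.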

  1. Dose - a software package for the calculation of integrated exposure resulting from an accident in a nuclear power plant

    International Nuclear Information System (INIS)

    Doron, E.; Ohaion, H.; Asculai, E.

    1985-05-01

    A software package intended for the assessment of risks resulting from an accidental release of radioactive materials from a nuclear power plant is presented. The models, and the various programs based on them, are described. The work includes detailed operating instructions for the various programs, as well as instructions for the preparation of the necessary input data. Various options are described for additions and changes to the programs with the aim of extending their usefulness to more general cases with respect to meteorology and pollution sources. Finally, a sample calculation that enables the user to test the proper functioning of the whole package, as well as his own proficiency in its use, is given. (author)

  2. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  3. A MATLAB Package for Markov Chain Monte Carlo with a Multi-Unidimensional IRT Model

    Directory of Open Access Journals (Sweden)

    Yanyan Sheng

    2008-11-01

    Full Text Available Unidimensional item response theory (IRT) models are useful when each item is designed to measure some facet of a unified latent trait. In practical applications, items are not necessarily measuring the same underlying trait, and hence the more general multi-unidimensional model should be considered. This paper provides the requisite information and description of software that implements the Gibbs sampler for such models with two item parameters and a normal ogive form. The software developed is the MATLAB package IRTmu2no. The package is flexible enough to allow a user to simulate binary response data with multiple dimensions, set the number of total or burn-in iterations, specify starting values or prior distributions for model parameters, check convergence of the Markov chain, as well as obtain Bayesian fit statistics. Illustrative examples are provided to demonstrate and validate the use of the software package.
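    The measurement model underlying the package is the two-parameter normal ogive, P(correct) = Phi(a_j * theta_i - b_j). Simulating binary response data from it, the first feature listed above, can be sketched in plain numpy (this is our own sketch, not the IRTmu2no API, and it shows a single dimension only):

```python
import numpy as np
from math import erf

# Simulate binary responses from a two-parameter normal ogive model:
# P(person i answers item j correctly) = Phi(a_j * theta_i - b_j).
def Phi(x):
    return 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

rng = np.random.default_rng(11)
n_persons, n_items = 1000, 20
theta = rng.standard_normal(n_persons)          # latent traits
a = rng.uniform(0.5, 2.0, n_items)              # item discriminations
b = rng.normal(0.0, 1.0, n_items)               # item difficulties

p = np.vectorize(Phi)(np.outer(theta, a) - b)   # response probabilities
responses = (rng.random((n_persons, n_items)) < p).astype(int)
print(responses.shape, round(responses.mean(), 2))
```

The Gibbs sampler then inverts this process, drawing a, b and theta from their full conditionals given such a response matrix.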

  4. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    Science.gov (United States)

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
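    The agreement statistic used above, a single-measure two-way random-effects intraclass correlation with absolute agreement, often labelled ICC(2,1), can be computed from a standard two-way ANOVA decomposition. A plain-numpy sketch with synthetic ratings (the noise levels are illustrative, not from the study):

```python
import numpy as np

# ICC(2,1): two-way random effects, absolute agreement, single measure.
# ratings is a subjects x readers matrix.
def icc_2_1(ratings):
    n, k = ratings.shape
    grand = ratings.mean()
    row_m = ratings.mean(axis=1)                 # per-subject means
    col_m = ratings.mean(axis=0)                 # per-reader means
    msr = k * ((row_m - grand) ** 2).sum() / (n - 1)     # subjects
    msc = n * ((col_m - grand) ** 2).sum() / (k - 1)     # readers
    sse = ((ratings - row_m[:, None] - col_m[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                      # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(5)
truth = rng.normal(50.0, 10.0, 37)                   # 37 subjects, as above
ratings = truth[:, None] + rng.normal(0.0, 2.0, (37, 2))  # two readers
print(round(icc_2_1(ratings), 2))                    # high agreement, ~0.96
```

Identical ratings give ICC = 1; growing reader-specific noise or systematic offsets pull the coefficient down, which is what separates "substantial" from "fair" agreement in the study above.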

  5. Improving package structure of object-oriented software using multi-objective optimization and weighted class connections

    Directory of Open Access Journals (Sweden)

    Amarjeet

    2017-07-01

    Full Text Available The software maintenance activities performed without following the original design decisions about the package structure usually deteriorate the quality of software modularization, leading to decay of the quality of the system. One of the main reasons for such structural deterioration is inappropriate grouping of source code classes in software packages. To improve such grouping/modular structure, previous researchers formulated the software remodularization problem as an optimization problem and solved it using search-based meta-heuristic techniques. These optimization approaches aimed at improving the quality metrics values of the structure without considering the original package design decisions, often resulting in a totally new software modularization. The entirely changed software modularization becomes costly to realize as well as difficult to understand for the developers/maintainers. To alleviate this issue, we propose a multi-objective optimization approach to improve the modularization quality of an object-oriented system with minimum possible movement of classes between existing packages of the original software modularization. The optimization is performed using NSGA-II, a widely-accepted multi-objective evolutionary algorithm. In order to ensure minimum modification of the original package structure, a new approach of computing class relations using weighted strengths has been proposed here. The weights of relations among different classes are computed on the basis of the original package structure. A new objective function has been formulated using these weighted class relations. This objective function drives the optimization process toward better modularization quality while simultaneously ensuring preservation of the original structure. To evaluate the results of the proposed approach, a series of experiments are conducted over four real-world and two random software applications. 
The experimental results clearly indicate the effectiveness
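    The idea of weighting class relations by the original package structure can be sketched with a toy scoring function: intra-package links count positively, inter-package links negatively, and an extra term rewards keeping classes in their original packages. The relations, weights and scoring rule below are illustrative only, not the paper's actual objective functions:

```python
# Toy modularization score: weighted class-class relations plus a bonus
# for preserving the original package assignment of each class.
relations = {                       # (class_a, class_b): dependency strength
    ("A", "B"): 3.0, ("B", "C"): 1.0, ("C", "D"): 4.0, ("A", "D"): 0.5,
}
original = {"A": "p1", "B": "p1", "C": "p2", "D": "p2"}

def score(assignment, w_keep=0.5):
    s = 0.0
    for (a, b), strength in relations.items():
        s += strength if assignment[a] == assignment[b] else -strength
    # reward staying close to the original design decisions
    s += w_keep * sum(assignment[c] == original[c] for c in assignment)
    return s

candidate = {"A": "p1", "B": "p1", "C": "p2", "D": "p2"}  # original layout
moved = {"A": "p1", "B": "p2", "C": "p2", "D": "p2"}      # B relocated
print(score(candidate), score(moved))   # 7.5 3.0
```

In the paper's setting such structure-preservation and modularization-quality terms are kept as separate objectives and traded off by NSGA-II rather than summed into one score.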

  6. SWISTRACK - AN OPEN SOURCE, SOFTWARE PACKAGE APPLICABLE TO TRACKING OF FISH LOCOMOTION AND BEHAVIOUR

    DEFF Research Database (Denmark)

    Steffensen, John Fleng

    2010-01-01

    ... including swimming speed, acceleration and directionality of movements, as well as the examination of locomotory patterns during swimming. SwisTrack, a free and downloadable software package (available from www.sourceforge.com), is widely used for tracking robots, humans and other animals. Accordingly ... effective background subtraction algorithms and filters ensure smooth tracking of fish • Application of tags of different colour enables the software to track multiple fish without the problem of track exchange between individuals • Low processing requirements enable tracking in real-time • Further ... benefits of the software relate to its open source code. Users competent in C++ can readily modify and create their own tracking protocols suited to their own custom designed experimental setups. These benefits will be demonstrated with the presentation of supplementary visual information.
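    The first stage of such a tracking pipeline, background subtraction, can be sketched in a few lines: estimate the background as a per-pixel temporal median, then threshold the absolute difference to segment the moving animal. Synthetic frames and an illustrative threshold, not SwisTrack's C++ implementation:

```python
import numpy as np

# Background subtraction on synthetic video: a static noisy scene with a
# bright 4x4 blob moving one pixel per frame. The temporal median is a
# robust background estimate because the blob visits each pixel briefly.
rng = np.random.default_rng(2)
frames = rng.normal(100.0, 2.0, (20, 32, 32))    # static scene + sensor noise
for t in range(20):                               # blob moving to the right
    frames[t, 10:14, t:t + 4] += 80.0

background = np.median(frames, axis=0)
mask = np.abs(frames[5] - background) > 20.0      # foreground of frame 5
ys, xs = np.nonzero(mask)
print(mask.sum(), int(xs.mean()))                 # blob area and x-position
```

A tracker would then reduce each frame's mask to a centroid and link centroids over time; colour tags resolve identities when multiple fish cross paths.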

  7. Software package evaluation for the TJ-II Data Acquisition System

    International Nuclear Information System (INIS)

    Cremy, C.; Sanchez, E.; Portas, A.; Vega, J.

    1996-01-01

    The TJ-II Data Acquisition System (DAS) has to provide a user interface which will allow setup of sampling channels, discharge signal visualization and reduced data processing, all in run time. On the other hand, the DAS will provide a high-level software capability for signal analysis, processing and data visualization, either in run time or off line. A set of software packages, including Builder Xcessory, X-Designer, ILOG Builder, Toolmaster, AVS 5, AVS/Express, PV-WAVE and Iris Explorer, have been evaluated by the Data Acquisition Group of the Fusion Division. The software evaluation, summarized in this paper, has resulted in a global solution being found which meets all of the DAS requirements. (Author)

  8. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    Science.gov (United States)

    Blaurock, Carl

    2009-01-01

    densities); converting PSDs to order analysis data; extracting harmonics; initializing and simultaneously tuning a harmonic model and a wheel structural model; initializing and tuning a broadband model; and verifying the harmonic/broadband/structural model against the measurement data. Functional operation is through a MATLAB GUI that loads test data, performs the various analyses, plots evaluation data for assessment and refinement of analysis parameters, and exports the data to documentation or downstream analysis code. The harmonic models are defined as specified functions of frequency, typically speed-squared. The reaction wheel structural model is realized as mass, damping, and stiffness matrices (typically from a finite element analysis package) with the addition of a gyroscopic forcing matrix. The broadband noise model is realized as a set of speed-dependent filters. The tuning of the combined model is performed using nonlinear least squares techniques. RWDMES is implemented as a MATLAB toolbox comprising the Fit Manager for performing the model extraction, Data Manager for managing input data and output models, the Gyro Manager for modifying wheel structural models, and the Harmonic Editor for evaluating and tuning harmonic models. This software was validated using data from Goodrich E wheels, and from GSFC Lunar Reconnaissance Orbiter (LRO) wheels. The validation testing proved that RWDMES has the capability to extract accurate disturbance models from flight reaction wheels with minimal user effort.

  9. Analysing the Zenith Tropospheric Delay Estimates in On-line Precise Point Positioning (PPP) Services and PPP Software Packages

    Directory of Open Access Journals (Sweden)

    Jorge Mendez Astudillo

    2018-02-01

    As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. Estimated ZTD can be very useful for meteorological applications; an example is the estimation of water vapor content in the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also gives GNSS users and researchers insight into the dependence of PPP ZTD estimation on the processing algorithm. Observation data of eight whole days from a total of nine International GNSS Service (IGS) tracking stations spread across the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product of the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.
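    The agreement figures quoted above amount to computing the bias and RMS of the differences between a PPP ZTD series and the IGS product. A minimal sketch with hypothetical values:

```python
import numpy as np

# Illustrative comparison (values are hypothetical, not from the paper):
# PPP-derived ZTD estimates versus the IGS tropospheric product for the
# same epochs, in meters.
ztd_ppp = np.array([2.401, 2.398, 2.405, 2.410, 2.396])
ztd_igs = np.array([2.403, 2.400, 2.404, 2.407, 2.398])

diff = ztd_ppp - ztd_igs
bias = diff.mean()                  # systematic offset
rms = np.sqrt((diff**2).mean())     # overall level of agreement

# "Good agreement" in the paper's sense: better than 1 cm
print(rms < 0.01)
```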

  10. TENSOLVE: A software package for solving systems of nonlinear equations and nonlinear least squares problems using tensor methods

    Energy Technology Data Exchange (ETDEWEB)

    Bouaricha, A. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.; Schnabel, R.B. [Colorado Univ., Boulder, CO (United States). Dept. of Computer Science

    1996-12-31

    This paper describes a modular software package for solving systems of nonlinear equations and nonlinear least squares problems, using a new class of methods called tensor methods. It is intended for small to medium-sized problems, say with up to 100 equations and unknowns, in cases where it is reasonable to calculate the Jacobian matrix or approximate it by finite differences at each iteration. The software allows the user to select between a tensor method and a standard method based upon a linear model. The tensor method models F(x) by a quadratic model, where the second-order term is chosen so that the model is hardly more expensive to form, store, or solve than the standard linear model. Moreover, the software provides two different global strategies, a line search and a two-dimensional trust region approach. Test results indicate that, in general, tensor methods are significantly more efficient and robust than standard methods on small and medium-sized problems, in terms of iterations and function evaluations.
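    As a baseline for comparison, the "standard method based upon a linear model" is ordinary Newton iteration: linearize F about the current iterate and solve for the step. The sketch below (illustrative, not TENSOLVE code) applies it to a small 2x2 system with an analytic Jacobian:

```python
import numpy as np

# Standard linear-model (Newton) baseline that tensor methods augment
# with a cheap second-order term. Solves F(x) = 0 for a toy system.
def F(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0,   # unit circle
                     x[0] - x[1]])              # line y = x

def J(x):  # analytic Jacobian; a finite-difference approximation also works
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

x = np.array([1.0, 0.0])
for _ in range(20):
    step = np.linalg.solve(J(x), -F(x))   # linear model: J(x) s = -F(x)
    x = x + step
    if np.linalg.norm(F(x)) < 1e-12:
        break

print(np.round(x, 6))  # → [0.707107 0.707107]
```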

  11. Integrated performance assessment model for waste package behavior and radionuclide release

    International Nuclear Information System (INIS)

    Kossik, R.; Miller, I.; Cunnane, M.

    1992-01-01

    Golder Associates Inc. (GAI) has developed a probabilistic total system performance assessment and strategy evaluation model (RIP) which can be applied in an iterative manner to evaluate repository site suitability and guide site characterization. This paper describes one component of the RIP software, the waste package behavior and radionuclide release model. The waste package component model considers waste package failure by various modes, matrix alteration/dissolution, and radionuclide mass transfer. Model parameters can be described as functions of local environmental conditions. The waste package component model is coupled to component models for far-field radionuclide transport and disruptive events. The model has recently been applied to the proposed repository at Yucca Mountain.

  13. VIPEX (Vital-area Identification Package EXpert) Software Verification and Validation

    International Nuclear Information System (INIS)

    Jung, Woo Sik; Suh, Jae Seung

    2010-06-01

    The purposes of this report are (1) to perform a Verification and Validation (V and V) test for the VIPEX (Vital-area Identification Package EXpert) software and (2) to improve software quality through the V and V test. The VIPEX was developed at the Korea Atomic Energy Research Institute (KAERI) for the Vital Area Identification (VAI) of nuclear power plants. The distributed version of the VIPEX is 3.2.0.0. The VIPEX was revised based on the first V and V test, and the second V and V test was then performed. We performed the following tasks for the V and V test on the Windows XP and Vista operating systems: (1) testing basic functions, including fault tree editing; (2) testing all functions; and (3) investigating an upgrade from Visual Basic 6.0 to Visual Basic 2008.

  14. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    Science.gov (United States)

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness-of-fit p-values and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate that the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
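    The MLE that SLDAssay computes rests on the single-hit Poisson model, in which a well with n input cells is negative with probability exp(-c·n), where c is the target-cell concentration per input cell. The sketch below uses hypothetical data and a crude grid search rather than SLDAssay's exact methods; it only shows the idea:

```python
import math

# Minimal sketch of the single-hit Poisson model behind limiting dilution
# assays (illustrative, not SLDAssay's implementation). Data: wells tested
# and wells positive at each dilution (input cells per well).
cells = [1e6, 2e5, 4e4]          # input cells per well at each dilution
positive = [10, 4, 1]            # positive wells observed
total = [12, 12, 12]             # wells tested per dilution

def loglik(c):
    """Binomial log-likelihood of the data for concentration c."""
    ll = 0.0
    for n, k, m in zip(cells, positive, total):
        p = 1.0 - math.exp(-c * n)        # P(well positive)
        ll += k * math.log(p) + (m - k) * math.log(1.0 - p)
    return ll

# Simple log-spaced grid search for the MLE (SLDAssay instead provides
# exact and bias-corrected estimators with confidence intervals)
grid = [10**(e / 200.0) for e in range(-1600, -900)]
c_mle = max(grid, key=loglik)
print(c_mle)
```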

  15. TensorPack: a Maple-based software package for the manipulation of algebraic expressions of tensors in general relativity

    International Nuclear Information System (INIS)

    Huf, P A; Carminati, J

    2015-01-01

    In this paper we: (1) introduce TensorPack, a software package for the algebraic manipulation of tensors in covariant index format in Maple; (2) briefly demonstrate the use of the package with an orthonormal tensor proof of the shearfree conjecture for dust. TensorPack is based on the Riemann and Canon tensor software packages and uses their functions to express tensors in an indexed covariant format. TensorPack uses a string representation as input and provides functions for output in index form. It extends the functionality to basic algebra of tensors, substitution, covariant differentiation, contraction, raising/lowering indices, symmetry functions and other accessory functions. The output can be merged with text in the Maple environment to create a full working document with embedded dynamic functionality. The package offers potential for manipulation of indexed algebraic tensor expressions in a flexible software environment. (paper)

  16. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying existing software reliability models and will propose a state-of-the-art software reliability model relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) a general discussion and future research. The development of the proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
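    As one concrete example of the class of models such a survey covers, the Goel-Okumoto model describes the expected cumulative number of failures as mu(t) = a(1 - e^(-bt)). The parameters below are hypothetical:

```python
import math

# One widely studied software reliability growth model of the kind the
# report surveys: the Goel-Okumoto NHPP. Parameter values are hypothetical.
a, b = 120.0, 0.05          # total expected faults; detection rate per week

def mu(t):
    """Expected cumulative number of failures observed by time t."""
    return a * (1.0 - math.exp(-b * t))

detected = mu(30)                   # expected failures after 30 weeks of test
remaining = a - detected            # expected residual faults
print(round(detected, 1), round(remaining, 1))
```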

  18. Ignominy: a tool for software dependency and metric analysis with examples from large HEP packages

    International Nuclear Information System (INIS)

    Tuura, L.A.; Taylor, L.

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. The authors also use it to evaluate and study the quality of external packages they plan to make use of. The authors describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. The authors also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects
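    The kind of numerical metric such a dependency scanner can report may be sketched as follows (hypothetical packages, not Ignominy's actual output): fan-in, fan-out, and Martin's instability I = fan_out / (fan_in + fan_out).

```python
# Illustrative dependency metric computation (not Ignominy itself):
# from a package -> dependencies map, derive fan-in, fan-out, and
# instability, a common indicator of structural quality.
deps = {                     # hypothetical packages
    "gui": ["core", "utils"],
    "core": ["utils"],
    "utils": [],
}

fan_out = {p: len(d) for p, d in deps.items()}
fan_in = {p: 0 for p in deps}
for targets in deps.values():
    for t in targets:
        fan_in[t] += 1

instability = {p: fan_out[p] / (fan_in[p] + fan_out[p])
               for p in deps if fan_in[p] + fan_out[p] > 0}
print(instability)  # utils: heavily depended-on and maximally stable (I = 0)
```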

  19. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  20. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    Science.gov (United States)

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  1. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  2. Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software

    Directory of Open Access Journals (Sweden)

    Abel Soares Siqueira

    2016-04-01

    A very important area of research in the field of Mathematical Optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information such as CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore, tools to better process and understand optimization benchmark data have been developed. One of the most widespread is the Performance Profile graphic proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open source software that creates Performance Profile graphics. This software produces graphics in PDF using LaTeX with the PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plot libraries. It is implemented in Python 3 with support for internationalization, and is released under the General Public License Version 3 (GPLv3).
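    The Dolan and Moré performance profile underlying perprof-py is simple to compute: for each solver and problem, take the ratio of the solver's cost to the best cost on that problem, then report the fraction of problems solved within a factor tau of the best. An illustrative computation (not perprof-py code, hypothetical timings):

```python
import numpy as np

# Sketch of the Dolan-Moré performance profile that perprof-py plots.
# Rows: solvers; columns: problems; entries: CPU time (np.inf = failure).
times = np.array([[1.0, 4.0, 2.0, np.inf],
                  [2.0, 2.0, 1.0, 5.0]])

best = times.min(axis=0)            # best time per problem
ratios = times / best               # performance ratio r_{p,s}

def profile(s, tau):
    """Fraction of problems solver s solves within tau times the best."""
    return np.mean(ratios[s] <= tau)

print(profile(0, 1.0), profile(1, 2.0))
```

Plotting profile(s, tau) over a range of tau for each solver reproduces the familiar staircase curves.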

  3. SeDA: A software package for the statistical analysis of the instrument drift

    International Nuclear Information System (INIS)

    Lee, H. J.; Jang, S. C.; Lim, T. J.

    2006-01-01

    The setpoints for safety-related equipment are affected by many sources of uncertainty. ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2] suggested statistical approaches for ensuring that safety-related instrument setpoints are established and maintained within the technical specification limits [3]. However, Jang et al. [4] indicated that the preceding methodologies for setpoint drift analysis might be insufficient to manage setpoint drift on an instrumentation device and proposed new statistical analysis procedures for the management of setpoint drift, based on plant-specific as-found/as-left data. Although IHPA (Instrument History Performance Analysis) is a widely known commercial software package for analyzing instrument setpoint drift, several steps of the new procedure cannot be performed with it because it is based on the statistical approaches suggested in ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2]. In this paper we present a software package (SeDA: Setpoint Drift Analysis) that implements the new methodologies and is easy to use, as it is accompanied by powerful graphical tools. (authors)
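    The core of an as-found/as-left drift analysis can be sketched as follows (hypothetical numbers, far simpler than SeDA's procedures): the drift over a surveillance interval is the as-found value at a calibration minus the as-left value of the preceding one, and its mean and standard deviation are then compared against setpoint tolerances.

```python
import statistics

# Illustrative as-found/as-left drift calculation (not SeDA's code).
as_left  = [100.02, 100.01, 99.98, 100.03]   # setting left after each calibration
as_found = [100.05, 99.97, 100.04]           # reading found at the next calibration

# Drift pairs each as-found with the previous as-left
drift = [f - l for f, l in zip(as_found, as_left)]
mean_drift = statistics.mean(drift)
sd_drift = statistics.stdev(drift)
print(round(mean_drift, 4), round(sd_drift, 4))
```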

  4. [Development of the software package of the nuclear medicine data processor for education and research].

    Science.gov (United States)

    Maeda, Hisato; Yamaki, Noriyasu; Azuma, Makoto

    2012-01-01

    The objective of this study was to develop a personal computer-based nuclear medicine data processor for education and research in the field of nuclear medicine. We call this software package "Prominence Processor" (PP). Microsoft Windows is used as the operating system of the PP, which has a 1024 × 768 image resolution and 63 applications classified into 6 groups. The accuracy of many of the PP applications was examined. For example, in the FBP reconstruction application, comparison of the SPECT images obtained from the PP and from a GMS-5500A (Toshiba) showed no visible difference in image quality, and the normalized MSE between the two images was 0.0003. The high processing accuracy of the FBP reconstruction application, as well as of the other applications, was thus demonstrated. The PP can be used anywhere once the software package is installed on a notebook PC. It is therefore now widely used for lectures and practical training in educational settings, and for the research of radiological technologists in clinical settings.

  5. REIDAC. A software package for retrospective dose assessment in internal contamination with radionuclides

    International Nuclear Information System (INIS)

    Kurihara, Osamu; Kanai, Katsuta; Takada, Chie; Takasaki, Koji; Ito, Kimio; Momose, Takumaro; Hato, Shinji; Ikeda, Hiroshi; Oeda, Mikihiro; Kurosawa, Naohiro; Fukutsu, Kumiko; Yamada, Yuji; Akashi, Makoto

    2007-01-01

    For cases of internal contamination with radionuclides, it is necessary to perform an internal dose assessment to facilitate radiation protection. For this purpose, the ICRP has supplied the dose coefficients and the retention and excretion rates for various radionuclides. However, these dosimetric quantities are calculated under typical conditions and are not necessarily detailed enough for dose assessment situations in which specific information on the incident or/and individual biokinetic characteristics could or should be taken into account retrospectively. This paper describes a newly developed PC-based software package called Retrospective Internal Dose Assessment Code (REIDAC) that meets the needs of retrospective dose assessment. REIDAC is made up of a series of calculation programs and a software package: the former calculates the dosimetric quantities for any radionuclide being assessed, and the latter provides the user with a graphical user interface (GUI) for executing the programs, editing parameter values and displaying results. The accuracy of REIDAC was verified by comparisons with dosimetric quantities given in the ICRP publications. This paper presents the basic structure of REIDAC and its calculation methods. Sensitivity analysis of the aerosol size for 239Pu compounds and provisional calculations for wound contamination with 241Am were performed as examples of the practical application of REIDAC. (author)
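    In its simplest form, the retrospective assessment that REIDAC automates reduces to dividing a bioassay measurement by the retention or excretion fraction at the elapsed time, then multiplying the resulting intake by a dose coefficient. All values below are hypothetical, chosen only to show the arithmetic:

```python
# Minimal sketch of a retrospective internal dose assessment
# (hypothetical values, not REIDAC's models or coefficients).
measured_bq = 5.0          # bioassay measurement, Bq
m_t = 0.02                 # retention fraction at t days after intake
dose_coeff = 1.6e-5        # committed dose coefficient, Sv/Bq (hypothetical)

intake = measured_bq / m_t         # estimated intake, Bq
dose_sv = intake * dose_coeff      # committed effective dose, Sv
print(intake, dose_sv * 1000)      # dose reported in mSv
```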

  6. METEOR v1.0 - Design and structure of the software package

    International Nuclear Information System (INIS)

    Palomo, E.

    1994-01-01

    This document describes the structure and the separate modules of the software package METEOR for the statistical analysis of meteorological data series. It contains a systematic description of the subroutines of METEOR and of the required formats for input and output files. The original version of METEOR was developed by Dr. Elena Palomo, CIEMAT-IER, GIMASE. It is built by linking programs and routines written in FORTRAN 77 and adds the graphical capabilities of GNUPLOT. The toolbox was designed following criteria of modularity, flexibility and agility. All the input, output and analysis options are structured in three main menus: (i) the first is aimed at evaluating the quality of the data set; (ii) the second at pre-processing the data; and (iii) the third at performing the statistical analyses and creating the graphical outputs. The METEOR documentation consists of three documents written in Spanish: (1) METEOR v1.0: User's guide; (2) METEOR v1.0: A usage example; (3) METEOR v1.0: Design and structure of the software package. (Author)

  7. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: II. Algorithms.

    Science.gov (United States)

    Appel, R D; Vargas, J R; Palagi, P M; Walther, D; Hochstrasser, D F

    1997-12-01

    After two generations of software systems for the analysis of two-dimensional electrophoresis (2-DE) images, a third generation of such software packages has recently emerged that combines state-of-the-art graphical user interfaces with comprehensive spot data analysis capabilities. A key characteristic common to most of these software packages is that many of their tools are implementations of algorithms that resulted from research areas such as image processing, vision, artificial intelligence or machine learning. This article presents the main algorithms implemented in the Melanie II 2-D PAGE software package. The applications of these algorithms, embodied as the features of the program, are explained in an accompanying article (R. D. Appel et al.; Electrophoresis 1997, 18, 2724-2734).

  8. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. 
The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the model.
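    The multicollinearity problem described above is easy to demonstrate with synthetic data: when one metric is nearly a linear copy of another, their correlation approaches 1 and the predictor matrix becomes ill-conditioned, which is exactly what destabilizes the regression coefficients.

```python
import numpy as np

# Synthetic illustration of the LOC vs. Stmts multicollinearity problem
# (made-up data, not measurements from the avionics system).
rng = np.random.default_rng(0)
loc = rng.uniform(100, 1000, 50)              # lines of code
stmts = 0.8 * loc + rng.normal(0, 5, 50)      # statement count tracks LOC
corr = np.corrcoef(loc, stmts)[0, 1]
print(round(corr, 3))                          # very close to 1

# A large condition number of the predictor matrix signals that the
# regression coefficients will be sensitive to small data changes.
X = np.column_stack([np.ones(50), loc, stmts])
cond = np.linalg.cond(X)
print(cond > 1e3)
```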

  9. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  10. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Science.gov (United States)

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  11. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models, focusing on what the software is rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. It is based on the popular and continually evolving course on requirements specification models taught by the author.

  12. ATOMIC AND MOLECULAR PHYSICS: Modelling of a DNA packaging motor

    Science.gov (United States)

    Qian, Jun; Xie, Ping; Xue, Xiao-Guang; Wang, Peng-Ye

    2009-11-01

    During the assembly of many viruses, a powerful molecular motor packages the genome into a preassembled capsid. The Bacillus subtilis phage phi29 is an excellent model system to investigate the DNA packaging mechanism because of its highly efficient in vitro DNA packaging activity and the development of a single-molecule packaging assay. Here we make use of structural and biochemical experimental data to build a physical model of DNA packaging by the phi29 DNA packaging motor. Based on the model, various dynamic behaviours such as the packaging rate, pause frequency and slip frequency under different ATP concentrations, ADP concentrations, external loads as well as capsid fillings are studied by using Monte Carlo simulation. Good agreement is obtained between the simulated and available experimental results. Moreover, we make testable predictions that should guide future experiments related to motor function.
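    A Monte Carlo treatment of such a motor can be sketched in a few lines. The toy model below is far simpler than the paper's physical model (hypothetical rates, a Michaelis-Menten dependence on [ATP], and a fixed slip probability), but it shows the style of simulation used to study packaging rate and slip behavior:

```python
import random

# Schematic Monte Carlo of a DNA packaging motor (illustrative only).
random.seed(1)
k_max, K_m = 100.0, 30.0          # max stepping rate (1/s), Michaelis constant (uM)
atp = 250.0                        # ATP concentration, uM (hypothetical)
p_slip = 0.01                      # probability a step slips backwards

rate = k_max * atp / (K_m + atp)   # Michaelis-Menten stepping rate, steps/s
packaged_bp = 0
for _ in range(10000):             # 10000 stochastic stepping attempts
    if random.random() < p_slip:
        packaged_bp -= 10          # slip: DNA backs out ~10 bp
    else:
        packaged_bp += 2           # power stroke: package 2 bp

mean_velocity = rate * 2           # mean packaging velocity in bp/s, ignoring slips
print(packaged_bp > 0, round(mean_velocity, 1))
```

Repeating such runs while varying [ATP], [ADP], load, or capsid filling is how pause and slip statistics are accumulated in this kind of study.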

  13. Counting radon tracks in Makrofol detectors with the 'image reduction and analysis facility' (IRAF) software package

    International Nuclear Information System (INIS)

    Hernandez, F.; Gonzalez-Manrique, S.; Karlsson, L.; Hernandez-Armas, J.; Aparicio, A.

    2007-01-01

    Makrofol detectors are commonly used for long-term radon (222Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires counting the nuclear tracks produced by alpha particles on the detecting material. The 'image reduction and analysis facility' (IRAF) software package is commonly used in astronomical applications. It allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity that exists between counting stars in a dark sky and counting tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. The proposed semi-automatic method and its performance, in comparison to ocular counting, are described in detail here. In addition, the calibration factor for this procedure, 2.97 ± 0.07 kBq m⁻³ h track⁻¹ cm², has been calculated based on the results obtained from exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m⁻³ and, in two of them, above 400 Bq m⁻³. Further studies should be performed at those schools, following the European Union recommendations about radon concentrations in buildings.
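    Given a calibration factor with these units, converting a track count to a radon concentration is a one-line calculation: concentration = factor × net track density / exposure time. A worked example with a hypothetical track density and exposure:

```python
# Hypothetical application of a track-etch calibration factor
# (the reading and exposure below are made up for illustration).
factor = 2.97            # kBq m^-3 h per (track cm^-2), as reported above
track_density = 150.0    # net tracks per cm^2 counted on the detector
exposure_h = 90 * 24.0   # 90-day exposure, in hours

conc_kbq = factor * track_density / exposure_h   # kBq m^-3
conc_bq = conc_kbq * 1000.0                      # Bq m^-3
print(conc_bq, conc_bq > 200)    # compare against the 200 Bq m^-3 level
```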

  14. On integrating modeling software for application to total-system performance assessment

    International Nuclear Information System (INIS)

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere

  15. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    Science.gov (United States)

    Klumpp, A. R.

    1994-01-01

This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPAK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used to represent motion between two coordinate frames.) Single-precision and double-precision floating point arithmetic is available in addition to the standard double-precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  16. Neural Network Program Package for Prosody Modeling

    Directory of Open Access Journals (Sweden)

    J. Santarius

    2004-04-01

Full Text Available This contribution describes the programme for one part of the automatic Text-to-Speech (TTS) synthesis. Some experiments (for example [14]) documented the considerable improvement of the naturalness of synthetic speech, but this approach requires completing the input feature values by hand. This completing takes a lot of time for big files. We need to improve the prosody by other approaches which use only automatically classified features (input parameters). The artificial neural network (ANN) approach is used for the modeling of prosody parameters. The program package contains all modules necessary for the text and speech signal pre-processing, neural network training, sensitivity analysis, result processing and a module for the creation of the input data protocol for the Czech speech synthesizer ARTIC [1].

  17. Integrated software package for nuclear material safeguards in a MOX fuel fabrication facility

    International Nuclear Information System (INIS)

    Schreiber, H.J.; Piana, M.; Moussalli, G.; Saukkonen, H.

    2000-01-01

Since computerized data processing was introduced to Safeguards at large bulk handling facilities, a large number of individual software applications have been developed for nuclear material Safeguards implementation. Facility inventory and flow data are provided in computerized format for performing stratification, sample size calculation and selection of samples for destructive and non-destructive assay. Data are collected from nuclear measurement systems running in attended or unattended mode and, more recently, from remotely controlled monitoring systems. Data sets from various sources (raw data, processed data and conclusions drawn from data evaluation results) have to be evaluated for Safeguards purposes. They are reported in computerized format to the International Atomic Energy Agency headquarters, and feedback from the Agency's mainframe computer system is used to prepare and support Safeguards inspection activities. The integration of all such data originating from various sources cannot be ensured without a common data format and a database system. This paper describes the fundamental relations between data streams, individual data processing tools and data evaluation results, and the requirements for an integrated software solution to facilitate nuclear material Safeguards at a bulk handling facility. The paper also explains the basis for designing a software package to manage data streams from various data sources and for incorporating diverse data processing tools that until now have been used independently of each other and under different computer operating systems. (author)

  18. GENES - a software package for analysis in experimental statistics and quantitative genetics

    Directory of Open Access Journals (Sweden)

    Cosme Damião Cruz

    2013-06-01

Full Text Available GENES is a software package used for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation to analyze biological phenomena and is fundamental for the decision-making process and predictions of success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated into the programs MS Word, MS Excel and Paint, ensuring simplicity and effectiveness in data import and export of results, figures and data. It is also compatible with the free software R and Matlab, through the supply of useful scripts available for complementary analyses in different areas, including genome-wide selection, prediction of breeding values and use of neural networks in genetic improvement.

  19. LipiDex: An Integrated Software Package for High-Confidence Lipid Identification.

    Science.gov (United States)

    Hutchins, Paul D; Russell, Jason D; Coon, Joshua J

    2018-04-17

    State-of-the-art proteomics software routinely quantifies thousands of peptides per experiment with minimal need for manual validation or processing of data. For the emerging field of discovery lipidomics via liquid chromatography-tandem mass spectrometry (LC-MS/MS), comparably mature informatics tools do not exist. Here, we introduce LipiDex, a freely available software suite that unifies and automates all stages of lipid identification, reducing hands-on processing time from hours to minutes for even the most expansive datasets. LipiDex utilizes flexible in silico fragmentation templates and lipid-optimized MS/MS spectral matching routines to confidently identify and track hundreds of lipid species and unknown compounds from diverse sample matrices. Unique spectral and chromatographic peak purity algorithms accurately quantify co-isolation and co-elution of isobaric lipids, generating identifications that match the structural resolution afforded by the LC-MS/MS experiment. During final data filtering, ionization artifacts are removed to significantly reduce dataset redundancy. LipiDex interfaces with several LC-MS/MS software packages, enabling robust lipid identification to be readily incorporated into pre-existing data workflows. Copyright © 2018 Elsevier Inc. All rights reserved.
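LipiDex's lipid-optimized MS/MS spectral matching routines are not reproduced in the abstract; as a hedged illustration of the general idea behind spectral matching, the sketch below computes the normalized dot-product (cosine) score commonly used to compare an acquired spectrum with a library spectrum. The spectrum representation, peak values, and m/z tolerance here are assumptions, not LipiDex internals.

```python
import math

# Hedged sketch: LipiDex's actual scoring is lipid-optimized and more involved.
# This shows only the generic normalized dot product (cosine score) between two
# centroided spectra, each given as a dict mapping m/z -> intensity.

def cosine_score(query, library, mz_tol=0.01):
    """Cosine similarity between two centroided spectra with an m/z tolerance."""
    shared = 0.0
    for mz_q, int_q in query.items():
        # greedily take the closest library peak within tolerance
        best = min(library, key=lambda mz_l: abs(mz_l - mz_q), default=None)
        if best is not None and abs(best - mz_q) <= mz_tol:
            shared += int_q * library[best]
    norm_q = math.sqrt(sum(v * v for v in query.values()))
    norm_l = math.sqrt(sum(v * v for v in library.values()))
    return shared / (norm_q * norm_l) if norm_q and norm_l else 0.0

# Illustrative phosphocholine-like fragment peaks (values are made up)
spec = {184.073: 100.0, 104.107: 40.0, 86.096: 25.0}
identical = cosine_score(spec, spec)           # -> 1.0
unrelated = cosine_score(spec, {500.5: 10.0})  # -> 0.0
```

A score near 1 indicates a confident library match; co-isolated isobaric lipids depress the score, which is why peak-purity checks such as those in LipiDex matter.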

  20. A SYSTEMATIC STUDY OF SOFTWARE QUALITY MODELS

    OpenAIRE

    Dr.Vilas. M. Thakare; Ashwin B. Tomar

    2011-01-01

This paper aims to provide a basis for software quality model research through a systematic study of papers. It identifies nearly seventy software quality research papers from journals and classifies each paper by research topic, estimation approach, study context and data set. The paper's results, combined with other knowledge, provide support for recommendations in future software quality model research: to increase the area of search for relevant studies, carefully select the papers within a set ...

  1. Plutonium air transportable package Model PAT-1. Safety analysis report

    International Nuclear Information System (INIS)

    1978-02-01

    The document is a Safety Analysis Report for the Plutonium Air Transportable Package, Model PAT-1, which was developed by Sandia Laboratories under contract to the Nuclear Regulatory Commission (NRC). The document describes the engineering tests and evaluations that the NRC staff used as a basis to determine that the package design meets the requirements specified in the NRC ''Qualification Criteria to Certify a Package for Air Transport of Plutonium'' (NUREG-0360). By virtue of its ability to meet the NRC Qualification Criteria, the package design is capable of safely withstanding severe aircraft accidents. The document also includes engineering drawings and specifications for the package. 92 figs, 29 tables

  2. PCG: A software package for the iterative solution of linear systems on scalar, vector and parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, W. [Los Alamos National Lab., NM (United States); Carey, G.F. [Univ. of Texas, Austin, TX (United States)

    1994-12-31

A great need exists for high performance numerical software libraries transportable across parallel machines. This talk concerns the PCG package, which solves systems of linear equations by iterative methods on parallel computers. The features of the package are discussed, as well as the techniques used to obtain both high performance and transportability across architectures. Representative numerical results are presented for several machines, including the Connection Machine CM-5, Intel Paragon and Cray T3D parallel computers.
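The abstract names iterative methods without detailing them; as a minimal illustration of the kind of solver such a package provides, the sketch below implements the classic unpreconditioned conjugate gradient iteration for a symmetric positive-definite system. All function names, the tolerance, and the test matrix are illustrative, not taken from PCG.

```python
# Minimal sketch of the (unpreconditioned) conjugate gradient method for
# symmetric positive-definite systems A x = b. Pure Python, dense storage;
# a production package like PCG adds preconditioning and parallel kernels.

def matvec(A, x):
    """Dense matrix-vector product."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                       # residual r = b - A x (x = 0 initially)
    p = r[:]                       # first search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol ** 2:      # converged: residual small enough
            break
        beta = rs_new / rs_old
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Small SPD test system: [[4, 1], [1, 3]] x = [1, 2] has solution [1/11, 7/11]
x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

For an n-by-n SPD system, CG converges in at most n iterations in exact arithmetic, which is why it scales well on the parallel machines listed above.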

  3. DISPL: a software package for one and two spatially dimensioned kinetics-diffusion problems. [FORTRAN for IBM computers

    Energy Technology Data Exchange (ETDEWEB)

    Leaf, G K; Minkoff, M; Byrne, G D; Sorensen, D; Bleakney, T; Saltzman, J

    1978-11-01

DISPL is a software package for solving some second-order nonlinear systems of partial differential equations, including parabolic, elliptic, hyperbolic, and some mixed types such as parabolic-elliptic equations. Fairly general nonlinear boundary conditions are allowed, as well as interface conditions for problems in inhomogeneous media. The spatial domain is one- or two-dimensional with Cartesian, cylindrical, or spherical (in one dimension only) geometry. The numerical method is based on Galerkin's procedure combined with B-splines to reduce the system of PDEs to a system of ODEs. The latter system is then solved with a sophisticated ODE software package. Software features include extensive dump/restart facilities, free format input, moderate printed output capability, dynamic storage allocation, and three graphics packages. 17 figures, 9 tables.
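The "reduce the PDE to ODEs, then integrate" strategy DISPL follows can be illustrated with a much simpler discretization. The sketch below substitutes central finite differences in space and explicit Euler in time (DISPL itself uses Galerkin with B-splines and a sophisticated ODE solver) for the heat equation u_t = u_xx on [0,1] with homogeneous Dirichlet boundaries; grid size and step count are illustrative.

```python
import math

# Simplified method-of-lines sketch: discretize space to get an ODE system
# du_i/dt = (u_{i-1} - 2 u_i + u_{i+1}) / h^2, then step it in time.
# Not DISPL's Galerkin/B-spline discretization; illustration only.

def heat_solve(n=20, dt=1e-4, steps=1000):
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]          # interior grid points
    u = [math.sin(math.pi * xi) for xi in x]     # initial condition
    for _ in range(steps):
        lap = [((u[i - 1] if i > 0 else 0.0) - 2 * u[i]
                + (u[i + 1] if i < n - 1 else 0.0)) / h ** 2
               for i in range(n)]
        u = [ui + dt * li for ui, li in zip(u, lap)]   # explicit Euler step
    return x, u

x, u = heat_solve()
# For this initial condition the exact solution is sin(pi x) * exp(-pi^2 t)
t_final = 1e-4 * 1000
exact = [math.sin(math.pi * xi) * math.exp(-math.pi ** 2 * t_final) for xi in x]
```

The explicit step requires dt < h²/2 for stability, one reason packages like DISPL delegate the time integration to robust (often implicit) ODE solvers.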

  4. Package

    Directory of Open Access Journals (Sweden)

    Arsić Zoran

    2013-01-01

Full Text Available It is the duty of the seller to pack the goods in a manner which ensures their safe arrival and enables their handling in transit and at the place of destination. The problem of packing is relevant in two main respects. First, the buyer is in certain circumstances entitled to refuse acceptance of the goods if they are not properly packed. Second, the package is relevant to the calculation of price and freight based on weight. In the case of export trade, the package should conform to the legislation of the country of destination. The impact of packaging on the environment is regulated by the environmental protection regulations of the Republic of Serbia.

  5. Software package to automate the design and production of translucent building structures made of pvc

    Directory of Open Access Journals (Sweden)

    Petrova Irina Yur’evna

    2016-08-01

Full Text Available The article describes the features of the design and production of translucent building structures made of PVC. The automation systems for this process currently on the market are analyzed, and their advantages and disadvantages are identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent building structures made of PVC is formulated, and the basic entities involved in those business processes are identified. The necessary functions for the main application and for the dealers' application are specified. The main application is based on the technological platform 1C: Enterprise 8.2. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these software products have client versions that are free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The implementation features of the developed software complex are described and the relevant charts are given. The scheme of system deployment and the protocols of data exchange between the 1C server, the 1C client and a dealer are presented, and the functions supported by the 1C module and the .NET module are described. The article describes the content of the class library developed for the .NET module, and specifies how the two applications are integrated into a single software package. The features of the GUI organization are described with corresponding screenshots. Possible ways of further development of the described software complex are presented, and a conclusion is drawn about its competitiveness and the expediency of further research.

  6. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, age and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  7. EPA's Benchmark Dose Modeling Software

    Science.gov (United States)

The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...

  8. A software package for evaluating the performance of a star sensor operation

    Science.gov (United States)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-02-01

We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and therefore can be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the codes in the form of functions written in C.
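The centroiding step named above estimates each star's sub-pixel position as the intensity-weighted mean of pixel coordinates. The sketch below shows only that core computation; StarSense's real implementation adds thresholding and windowing, and the test image here is an assumption.

```python
# Minimal sketch of star centroiding: the star's position is the intensity-
# weighted centre of mass of its pixel window. Illustration only, not the
# StarSense code described in the abstract.

def centroid(image):
    """Return the (x, y) intensity-weighted centroid of a 2-D pixel array."""
    total = float(sum(sum(row) for row in image))
    cy = sum(y * sum(row) for y, row in enumerate(image)) / total
    cx = sum(x * val for row in image for x, val in enumerate(row)) / total
    return cx, cy

# A symmetric 3x3 star image centred on pixel (1, 1)
star = [[0, 1, 0],
        [1, 4, 1],
        [0, 1, 0]]
cx, cy = centroid(star)  # -> (1.0, 1.0)
```

Because the centroid uses all pixel intensities, it locates stars to a fraction of a pixel, which is what makes arcsecond-level attitude accuracy possible with a small FOV camera.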

  9. SemiMarkov: An R Package for Parametric Estimation in Multi-State Semi-Markov Models

    OpenAIRE

    Listwon, Agnieszka; Saint-Pierre, Philippe

    2015-01-01

Multi-state models provide a relevant tool for studying the observations of a continuous-time process at arbitrary times. Markov models are often considered even though semi-Markov models are better adapted in various situations. Such models are still not frequently applied, mainly due to a lack of available software. We have developed the R package SemiMarkov to fit homogeneous semi-Markov models to longitudinal data. The package performs maximum likelihood estimation in a parametric framework where the d...

  10. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles.
Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  11. MPA Portable: A Stand-Alone Software Package for Analyzing Metaproteome Samples on the Go.

    Science.gov (United States)

    Muth, Thilo; Kohrs, Fabian; Heyer, Robert; Benndorf, Dirk; Rapp, Erdmann; Reichl, Udo; Martens, Lennart; Renard, Bernhard Y

    2018-01-02

Metaproteomics, the mass spectrometry-based analysis of proteins from multispecies samples, faces severe challenges concerning data analysis and results interpretation. To overcome these shortcomings, we here introduce the MetaProteomeAnalyzer (MPA) Portable software. In contrast to the original server-based MPA application, this newly developed tool no longer requires computational expertise for installation and is now independent of any relational database system. In addition, MPA Portable now supports state-of-the-art database search engines and a convenient command line interface for high-performance data processing tasks. While search engine results can easily be combined to increase the protein identification yield, an additional two-step workflow is implemented to provide sufficient analysis resolution for further postprocessing steps, such as protein grouping as well as taxonomic and functional annotation. Our new application has been developed with a focus on intuitive usability, adherence to data standards, and adaptation to Web-based workflow platforms. The open source software package can be found at https://github.com/compomics/meta-proteome-analyzer.

  12. The instrument control software package for the Habitable-Zone Planet Finder spectrometer

    Science.gov (United States)

    Bender, Chad F.; Robertson, Paul; Stefansson, Gudmundur Kari; Monson, Andrew; Anderson, Tyler; Halverson, Samuel; Hearty, Frederick; Levi, Eric; Mahadevan, Suvrath; Nelson, Matthew; Ramsey, Larry; Roy, Arpita; Schwab, Christian; Shetrone, Matthew; Terrien, Ryan

    2016-08-01

    We describe the Instrument Control Software (ICS) package that we have built for The Habitable-Zone Planet Finder (HPF) spectrometer. The ICS controls and monitors instrument subsystems, facilitates communication with the Hobby-Eberly Telescope facility, and provides user interfaces for observers and telescope operators. The backend is built around the asynchronous network software stack provided by the Python Twisted engine, and is linked to a suite of custom hardware communication protocols. This backend is accessed through Python-based command-line and PyQt graphical frontends. In this paper we describe several of the customized subsystem communication protocols that provide access to and help maintain the hardware systems that comprise HPF, and show how asynchronous communication benefits the numerous hardware components. We also discuss our Detector Control Subsystem, built as a set of custom Python wrappers around a C-library that provides native Linux access to the SIDECAR ASIC and Hawaii-2RG detector system used by HPF. HPF will be one of the first astronomical instruments on sky to utilize this native Linux capability through the SIDECAR Acquisition Module (SAM) electronics. The ICS we have created is very flexible, and we are adapting it for NEID, NASA's Extreme Precision Doppler Spectrometer for the WIYN telescope; we will describe this adaptation, and describe the potential for use in other astronomical instruments.

  13. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    Science.gov (United States)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be

  14. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

Full Text Available Hadoop MapReduce is the programming model for designing automatically scalable distributed computing applications. It provides developers an effective environment to attain automatic parallelization. However, most existing manufacturing systems are difficult and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of existing systems, we design a framework in this thesis, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface for users to fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this thesis focuses on multiuser workloads, but the default Hadoop scheduling scheme, i.e., FIFO, would increase delay under multiuser scenarios. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share jobs among machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework enormously improves time performance compared with the original package.
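The abstract does not specify the internals of Job-Sharing Scheduling; as a hedged illustration of why FIFO delays multiuser workloads, the toy scheduler below contrasts strict submission order with round-robin interleaving across per-user queues. All user and job names are hypothetical.

```python
from collections import deque

# Illustrative contrast only, not the paper's Job-Sharing Scheduling mechanism:
# FIFO runs jobs strictly in submission order, so one user's backlog delays
# everyone else; round-robin interleaving shares the cluster across users.

def fifo_order(jobs):
    """Jobs run strictly in submission order (default Hadoop FIFO)."""
    return [name for _, name in jobs]

def round_robin_order(jobs):
    """Interleave jobs one at a time across per-user queues."""
    queues = {}
    for user, name in jobs:
        queues.setdefault(user, deque()).append(name)
    order = []
    while any(queues.values()):
        for user in list(queues):
            if queues[user]:
                order.append(queues[user].popleft())
    return order

# alice submits three jobs before bob submits one
submitted = [("alice", "a1"), ("alice", "a2"), ("alice", "a3"), ("bob", "b1")]
fifo = fifo_order(submitted)          # bob's job waits behind all of alice's
fair = round_robin_order(submitted)   # bob's job runs second
```

Under FIFO, bob's single job is the last to start; under the fair interleaving it starts second, which is the kind of multiuser delay reduction the proposed mechanism targets.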

  15. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture. This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system. Essential features of the model have been implemented in a research prototype, Ragnarok. Two years of experience using Ragnarok in three real, small- to medium-sized projects is reported. The conclusion is that the presented model is viable, feels 'natural' for developers, and provides good support

  16. ddradseqtools: a software package for in silico simulation and testing of double-digest RADseq experiments.

    Science.gov (United States)

    Mora-Márquez, F; García-Olivares, V; Emerson, B C; López de Heredia, U

    2017-03-01

    Double-digested RADseq (ddRADseq) is a NGS methodology that generates reads from thousands of loci targeted by restriction enzyme cut sites, across multiple individuals. To be statistically sound and economically optimal, a ddRADseq experiment has a preliminary design stage that needs to consider issues related to the selection of enzymes, particular features of the genome of the focal species, possible modifications to the library construction protocol, coverage needed to minimize missing data, and the potential sources of error that may impact upon the coverage. We present ddradseqtools, a software package to help ddRADseq experimental design by (i) the generation of in silico double-digested fragments; (ii) the construction of modified ddRADseq libraries using adapters with either one or two indexes and degenerate base regions (DBRs) to quantify PCR duplicates; and (iii) the initial steps of the bioinformatics preprocessing of reads. ddradseqtools generates single-end (SE) or paired-end (PE) reads that may bear SNPs and/or indels. The effect of allele dropout and PCR duplicates on coverage is also simulated. The resulting output files can be submitted to pipelines of alignment and variant calling, to allow the fine-tuning of parameters. The software was validated with specific tests for the correct operability of the program. The correspondence between in silico settings and parameters from ddRADseq in vitro experiments was assessed to provide guidelines for the reliable performance of the software. ddradseqtools is cost-efficient in terms of execution time, and can be run on computers with standard CPU and RAM configuration. © 2016 John Wiley & Sons Ltd.

  17. A flexible modelling software for data acquisition

    International Nuclear Information System (INIS)

    Shu Yantai; Chen Yanhui; Yang Songqi; Liu Genchen

    1992-03-01

A flexible modelling software package for data acquisition is based on an event-driven simulator. It can be used to simulate a wide variety of systems which can be modelled as open queuing networks. The main feature of the software is its flexibility in evaluating the performance of various data acquisition systems, whether pulsed or not. The flexible features of this software are as follows: the user can choose the number of processors in the model and the route which every job takes through the model; the service rate of a processor is automatically adapted; the simulator has a pipeline mechanism; a job can be divided into several segments; and a processor may be used as a compression component, etc. Some modelling techniques and applications of this software in plasma physics laboratories are also presented
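The event-driven simulation idea behind such a package can be sketched with a single FIFO server: arrivals and departures are kept in a time-ordered event heap and processed in order. This is an illustration under assumed deterministic timings, not the described package, which models whole open queuing networks.

```python
import heapq

# Hedged sketch of an event-driven queue simulator: events live in a
# time-ordered heap; processing an arrival schedules the job's departure
# from a single FIFO server. Deterministic times keep the result checkable.

def simulate_single_server(arrivals, service_time):
    """Return each job's departure time from one FIFO server."""
    events = [(t, "arrival", i) for i, t in enumerate(arrivals)]
    heapq.heapify(events)
    free_at = 0.0                  # time the server next becomes idle
    departures = {}
    while events:
        t, kind, i = heapq.heappop(events)
        if kind == "arrival":
            start = max(t, free_at)            # wait if the server is busy
            free_at = start + service_time
            heapq.heappush(events, (free_at, "departure", i))
        else:
            departures[i] = t
    return [departures[i] for i in range(len(arrivals))]

# Three jobs arrive at t = 0, 1, 5; each needs 2 time units of service
deps = simulate_single_server([0.0, 1.0, 5.0], service_time=2.0)  # -> [2.0, 4.0, 7.0]
```

Extending this pattern to many processors with routing between them gives exactly the open queuing network evaluation the abstract describes.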

  18. topicmodels: An R Package for Fitting Topic Models

    Directory of Open Access Journals (Sweden)

    Bettina Grun

    2011-05-01

    Full Text Available Topic models allow the probabilistic modeling of term frequency occurrences in documents. The fitted model can be used to estimate the similarity between documents as well as between a set of specified keywords using an additional layer of latent variables which are referred to as topics. The R package topicmodels provides basic infrastructure for fitting topic models based on data structures from the text mining package tm. The package includes interfaces to two algorithms for fitting topic models: the variational expectation-maximization algorithm provided by David M. Blei and co-authors and an algorithm using Gibbs sampling by Xuan-Hieu Phan and co-authors.

  19. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  20. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  1. Power Electronic Packaging Design, Assembly Process, Reliability and Modeling

    CERN Document Server

    Liu, Yong

    2012-01-01

    Power Electronic Packaging presents an in-depth overview of power electronic packaging design, assembly, reliability and modeling. Since there is a drastic difference between IC fabrication and power electronic packaging, the book systematically introduces typical power electronic packaging design, assembly, reliability and failure analysis and material selection so readers can clearly understand each task's unique characteristics. Power electronic packaging is one of the fastest growing segments in the power electronic industry, due to the rapid growth of power integrated circuit (IC) fabrication, especially for applications like portable, consumer, home, computing and automotive electronics. This book also covers how advances in both semiconductor content and advanced power package design have driven advances in power device capability in recent years. The author extrapolates the most recent trends in the book's areas of focus to highlight where further improvement in materials and techniques can d...

  2. A Predictive Safety Management System Software Package Based on the Continuous Hazard Tracking and Failure Prediction Methodology

    Science.gov (United States)

    Quintana, Rolando

    2003-01-01

    The goal of this research was to integrate a previously validated and reliable safety model, called the Continuous Hazard Tracking and Failure Prediction Methodology (CHTFPM), into a software application. This led to the development of a predictive safety management information system (PSMIS). This means that the theory or principles of the CHTFPM were incorporated into a software package; hence, the PSMIS is referred to as the CHTFPM management information system (CHTFPM MIS). The purpose of the PSMIS is to reduce the time and manpower required to perform predictive studies, as well as to facilitate the handling of the enormous quantities of information involved in this type of study. The CHTFPM theory encompasses the philosophy of looking at safety engineering from a new perspective: a proactive, rather than reactive, viewpoint. That is, corrective measures are taken before a problem occurs instead of after it has happened. The CHTFPM is thus a predictive safety methodology because it foresees or anticipates accidents, system failures and unacceptable risks, so corrective action can be taken to prevent these unwanted outcomes. Consequently, the safety and reliability of systems or processes can be further improved by taking proactive and timely corrective actions.

  3. Modeling and Selection of Software Service Variants

    OpenAIRE

    Wittern, John Erik

    2015-01-01

    Providers and consumers have to deal with variants, meaning alternative instances of a service's design, implementation, deployment, or operation, when developing or delivering software services. This work presents service feature modeling to deal with the associated challenges, comprising a language to represent software service variants and a set of methods for modeling and subsequent variant selection. This work's evaluation includes a proof-of-concept implementation and two real-life use cases.

  4. ANALYSIS OF CELLULAR REACTION TO IFN-γ STIMULATION BY A SOFTWARE PACKAGE GeneExpressionAnalyser

    Directory of Open Access Journals (Sweden)

    A. V. Saetchnikov

    2014-01-01

    Full Text Available The software package GeneExpressionAnalyser for the analysis of DNA microarray experimental data has been developed. The algorithms for data analysis, differentially expressed genes, and the biological functions of the cell are described. The efficiency of the developed package is tested on published experimental data devoted to time-course research on the changes in human cells under the influence of IFN-γ on melanoma. The developed software has a number of advantages over existing software: it is free, has a simple and intuitive graphical interface, allows the analysis of different types of DNA microarrays, contains a set of methods for complete data analysis, and performs effective gene annotation for a selected list of genes.

  5. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images

    NARCIS (Netherlands)

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F.; Lobbezoo, Frank; Aarab, Ghizlane

    2017-01-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G(®) (QR Systems, Verona, Italy) CBCT data sets were

  6. SOFTWARE SOLUTIONS FOR ARDL MODELS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2015-07-01

    Full Text Available VAR-type models can be used only for stationary time series. Causality analyses through econometric models require the series to have the same order of integration. Usually, when constraining the series to comply with these restrictions (e.g. by differencing), economic interpretation of the outcomes may become difficult. A recent solution for mitigating these problems is the use of ARDL (autoregressive distributed lag) models. We present the implementation of these models in EViews and test the impact of the exchange rate on the consumer price index.
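
    The abstract demonstrates ARDL estimation in EViews; the same model can be sketched directly, since an ARDL(1,1) is just OLS on lagged regressors: y_t = c + a·y_{t-1} + b0·x_t + b1·x_{t-1} + e_t. The numpy example below uses synthetic series (the "exchange rate"/"price index" labels are illustrative, not the paper's data):

```python
# Estimate an ARDL(1,1) by OLS on synthetic data and recover the
# coefficients used to generate it.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.cumsum(rng.normal(size=n))        # I(1) regressor, e.g. an exchange rate
y = np.empty(n)                          # response, e.g. a price index
y[0] = 0.0
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.3 * x[t] + 0.2 * x[t - 1] + rng.normal(scale=0.1)

# Regressor matrix [1, y_{t-1}, x_t, x_{t-1}]; solve least squares.
X = np.column_stack([np.ones(n - 1), y[:-1], x[1:], x[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
# coef should be close to [c, a, b0, b1] = [0, 0.5, 0.3, 0.2]
```

    Note the appeal of ARDL mentioned in the abstract: the level variables enter directly, so no differencing is needed and the coefficients keep their economic interpretation.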

  7. Graphical modelling software in R - status

    DEFF Research Database (Denmark)

    Dethlefsen, Claus; Højsgaard, Søren; Lauritzen, Steffen L.

    2007-01-01

    , and Kreiner 1995), MIM (Edwards  2000), and Tetrad (Glymour, Scheines, Spirtes, and Kelley 1987). The gR initiative (Lauritzen 2002) aims at making graphical models available in R (R Development Core Team 2006). A small grant from the Danish Science Foundation supported this initiative. We will summarize...... the results of the initiative so far. Specifically we will illustrate some of the R packages for graphical modelling currently on CRAN and discuss their strengths and weaknesses....

  8. The gRbase Package for Graphical Modelling in R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Dethlefsen, Claus

    We have developed a package, called , consisting of a number of classes and associated methods to support the analysis of data using graphical models. It is developed for the open source language, R, and is available for several platforms. The package is intended to be widely extendible...... and flexible so that package developers may implement further types of graphical models using the available methods. contains methods for representing data, specification of models using a formal language, and is linked to , an interactive graphical user interface for manipulating graphs. We show how...

  9. Constraint Network Analysis (CNA): a Python software package for efficiently linking biomacromolecular structure, flexibility, (thermo-)stability, and function.

    Science.gov (United States)

    Pfleger, Christopher; Rathi, Prakash Chandra; Klein, Doris L; Radestock, Sebastian; Gohlke, Holger

    2013-04-22

    For deriving maximal advantage from information on biomacromolecular flexibility and rigidity, results from rigidity analyses must be linked to biologically relevant characteristics of a structure. Here, we describe the Python-based software package Constraint Network Analysis (CNA) developed for this task. CNA functions as a front- and backend to the graph-based rigidity analysis software FIRST. CNA goes beyond the mere identification of flexible and rigid regions in a biomacromolecule in that it (I) provides a refined modeling of thermal unfolding simulations that also considers the temperature-dependence of hydrophobic tethers, (II) allows performing rigidity analyses on ensembles of network topologies, either generated from structural ensembles or by using the concept of fuzzy noncovalent constraints, and (III) computes a set of global and local indices for quantifying biomacromolecular stability. This leads to more robust results from rigidity analyses and extends the application domain of rigidity analyses in that phase transition points ("melting points") and unfolding nuclei ("structural weak spots") are determined automatically. Furthermore, CNA robustly handles small-molecule ligands in general. Such advancements are important for applying rigidity analysis to data-driven protein engineering and for estimating the influence of ligand molecules on biomacromolecular stability. CNA maintains the efficiency of FIRST such that the analysis of a single protein structure takes a few seconds for systems of several hundred residues on a single core. These features make CNA an interesting tool for linking biomacromolecular structure, flexibility, (thermo-)stability, and function. CNA is available from http://cpclab.uni-duesseldorf.de/software for nonprofit organizations.

  10. VIBA-LAB2: a virtual ion beam analysis laboratory software package incorporating elemental map simulations

    International Nuclear Information System (INIS)

    Zhou, S.J.; Orlic, I.; Sanchez, J.L.; Watt, F.

    1999-01-01

    The software package VIBA-lab1, which incorporates PIXE and RBS energy spectra simulation has now been extended to include the simulation of elemental maps from 3D structures. VIBA-lab1 allows the user to define a wide variety of experimental parameters, e.g. energy and species of incident ions, excitation and detection geometry, etc. When the relevant experimental parameters as well as target composition are defined, the program can then simulate the corresponding PIXE and RBS spectra. VIBA-LAB2 has been written with applications in nuclear microscopy in mind. A set of drag-and-drop tools has been incorporated to allow the user to define a three-dimensional sample object of mixed elemental composition. PIXE energy spectra simulations are then carried out on pixel-by-pixel basis and the corresponding intensity distributions or elemental maps can be computed. Several simulated intensity distributions for some 3D objects are demonstrated, and simulations obtained from a simple IC are compared with experimental results

  11. EPILAB: a software package for studies on the prediction of epileptic seizures.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Feldwisch-Drentrup, H; Valderrama, M; Costa, R P; Alvarado-Rojas, C; Nikolopoulos, S; Le Van Quyen, M; Timmer, J; Schelter, B; Dourado, A

    2011-09-15

    A Matlab®-based software package, EPILAB, was developed for supporting researchers in performing studies on the prediction of epileptic seizures. It provides an intuitive and convenient graphical user interface. Fundamental concepts that are crucial for epileptic seizure prediction studies were implemented. This includes, for example, the development and statistical validation of prediction methodologies in long-term continuous recordings. Seizure prediction is usually based on electroencephalography (EEG) and electrocardiography (ECG) signals. EPILAB is able to process both EEG and ECG data stored in different formats. More than 35 time and frequency domain measures (features) can be extracted based on univariate and multivariate data analysis. These features can be post-processed and used for prediction purposes. The predictions may be conducted based on optimized thresholds or by applying classification methods such as artificial neural networks, cellular neural networks, and support vector machines. EPILAB proved to be an efficient tool for seizure prediction, and aims to be a way to communicate, evaluate, and compare results and data among the seizure prediction community. Copyright © 2011 Elsevier B.V. All rights reserved.
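
    The feature-then-threshold pipeline the abstract describes can be sketched in a few lines (EPILAB itself is MATLAB-based; the window length, the signal-power feature, and the threshold below are illustrative assumptions):

```python
# Extract one feature per signal window, then raise an alarm when the
# feature crosses a fixed threshold -- the simplest of the prediction
# schemes mentioned in the abstract.
import numpy as np

def windowed_features(signal, win):
    """Split the signal into non-overlapping windows and return one
    feature per window (here: mean squared amplitude, i.e. power)."""
    n = len(signal) // win
    windows = signal[: n * win].reshape(n, win)
    return (windows ** 2).mean(axis=1)

rng = np.random.default_rng(1)
eeg = rng.normal(size=4000)         # synthetic "EEG" channel
eeg[3000:] *= 3.0                   # amplitude jump mimics a pre-ictal change
power = windowed_features(eeg, 200) # 20 windows
alarm = power > 2.0                 # fixed-threshold "predictor"
```

    Real studies replace the fixed threshold with optimized thresholds or trained classifiers (e.g. support vector machines) and validate alarms statistically against chance predictors.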

  12. MulRF: a software package for phylogenetic analysis using multi-copy gene trees.

    Science.gov (United States)

    Chaudhary, Ruchi; Fernández-Baca, David; Burleigh, John Gordon

    2015-02-01

    MulRF is a platform-independent software package for phylogenetic analysis using multi-copy gene trees. It seeks the species tree that minimizes the Robinson-Foulds (RF) distance to the input trees using a generalization of the RF distance to multi-labeled trees. The underlying generic tree distance measure and fast running time make MulRF useful for inferring phylogenies from large collections of gene trees, in which multiple evolutionary processes as well as phylogenetic error may contribute to gene tree discord. MulRF implements several features for customizing the species tree search and assessing the results, and it provides a user-friendly graphical user interface (GUI) with tree visualization. The species tree search is implemented in C++ and the GUI in Java Swing. MulRF's executable as well as sample datasets and manual are available at http://genome.cs.iastate.edu/CBL/MulRF/, and the source code is available at https://github.com/ruchiherself/MulRFRepo. ruchic@ufl.edu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
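
    MulRF generalizes the Robinson-Foulds (RF) distance to multi-labeled trees; as background, the classic RF distance between two singly-labeled trees is just the size of the symmetric difference of their clade sets, sketched here (the clade-set tree representation is a simplification for illustration):

```python
# Robinson-Foulds distance between two rooted trees, each represented
# as a set of non-trivial clades (frozensets of leaf labels).
def rf_distance(clades_a, clades_b):
    """Number of clades present in exactly one of the two trees."""
    return len(clades_a ^ clades_b)

# Trees on leaves {A,B,C,D}: ((A,B),(C,D)) vs ((A,C),(B,D)).
t1 = {frozenset("AB"), frozenset("CD")}
t2 = {frozenset("AC"), frozenset("BD")}
d = rf_distance(t1, t2)   # every non-trivial clade differs, so d = 4
```

    MulRF's search then seeks the species tree minimizing the sum of such (generalized) distances to all input gene trees.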

  13. TreeTime: an extensible C++ software package for Bayesian phylogeny reconstruction with time-calibration.

    Science.gov (United States)

    Himmelmann, Lin; Metzler, Dirk

    2009-09-15

    For the estimation of phylogenetic trees from molecular data, it is worthwhile to take prior paleontologic knowledge into account, if available. To calibrate the branch lengths of the tree with times assigned to geo-historical events or fossils, it is necessary to select a relaxed molecular clock model to specify how mutation rates can change along the phylogeny. We present the software TreeTime for Bayesian phylogeny estimation. It can take prior information about the topology of the tree and about branching times into account. Several relaxed molecular clock models are implemented in TreeTime. TreeTime is written in C++ and designed to be efficient and extensible. TreeTime is freely available from http://evol.bio.lmu.de/statgen/software/treetime under the terms of the GNU General Public Licence (GPL, version 3 or later).

  14. Pixelman: a multi-platform data acquisition and processing software package for Medipix2, Timepix and Medipix3 detectors

    International Nuclear Information System (INIS)

    Turecek, D; Holy, T; Jakubek, J; Pospisil, S; Vykydal, Z

    2011-01-01

    The semiconductor pixel detectors Medipix2, Timepix and Medipix3 (256×256 square pixels, 55×55 μm each) are superior imaging devices in terms of spatial resolution, linearity and dynamic range. This makes them suitable for various applications such as radiography, neutronography, micro-tomography and X-ray dynamic defectoscopy. In order to control and manage such complex measurements, a multi-platform software package for acquisition and data processing with a Java graphical user interface has been developed. The functionality of the original version of the Pixelman package has been upgraded and extended to include the new Medipix devices. The software package can be run on Microsoft Windows, Linux and Mac OS X operating systems. The architecture is very flexible and the functionality can be extended by plugins in C++, Java or combinations of both. The software package may be used as a distributed acquisition system using computers with different operating systems over a local network or the Internet.

  15. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects, and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R package for evaluating model performance by visualizing and exploring different aspects of hydrological time series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be used in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function for the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
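
    The two objective functions named in the abstract have standard closed forms: NSE = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))², and KGE = 1 − √((r − 1)² + (α − 1)² + (β − 1)²) with correlation r, variability ratio α and bias ratio β. A minimal numpy sketch (visCOS itself is an R package):

```python
# Nash-Sutcliffe and Kling-Gupta efficiencies for observed vs simulated
# discharge series; both equal 1 for a perfect fit.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]     # linear correlation
    alpha = sim.std() / obs.std()       # variability ratio
    beta = sim.mean() / obs.mean()      # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
scores = nse(obs, obs), kge(obs, obs)   # perfect fit: both close to 1
```

    Evaluating these per hydrological year or season, as visCOS does, just means applying the same functions to the corresponding sub-series.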

  16. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  17. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  18. Management models in the NZ software industry

    Directory of Open Access Journals (Sweden)

    Holger Spill

    Full Text Available This research interviewed eight innovative New Zealand software companies to find out how they manage new product development. It looked at how management used standard techniques of software development to manage product uncertainty through the theoretical lens of the Cyclic Innovation Model. The study found that while there is considerable variation, the management of innovation was largely determined by the level of complexity. Organizations with complex innovative software products had a more iterative software development style, more flexible internal processes and swifter decision-making. Organizations with less complexity in their products tended to use more formal structured approaches. Overall complexity could be inferred with reference to four key factors within the development environment.

  19. Software engineering the mixed model for genome-wide association studies on large samples.

    Science.gov (United States)

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
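
    The key computation the review discusses is the mixed model y = Xβ + Zu + e with Var(u) proportional to the kinship matrix K. Once variance components are estimated, each marker's effect is a generalized least squares (GLS) fit against the phenotype covariance V. A minimal numpy sketch (the identity kinship, variance values and effect size are illustrative assumptions):

```python
# One-marker GLS association test under a mixed model with known
# variance components: V = sg2*K + se2*I, beta = (X'V^-1 X)^-1 X'V^-1 y.
import numpy as np

rng = np.random.default_rng(2)
n = 100
K = np.eye(n)                     # kinship matrix (identity: unrelated toy sample)
sg2, se2 = 0.5, 0.5               # assumed genetic / residual variances
V = sg2 * K + se2 * np.eye(n)     # phenotype covariance
Vinv = np.linalg.inv(V)

snp = rng.binomial(2, 0.3, size=n).astype(float)   # genotypes coded 0/1/2
y = 0.4 * snp + rng.normal(size=n)                 # phenotype, true effect 0.4

X = np.column_stack([np.ones(n), snp])             # intercept + marker
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)  # GLS estimate
# beta[1] estimates the marker effect
```

    The efficiency tricks the review surveys (e.g. spectral decomposition of K, or replacing raw phenotypes with residuals/BLUPs) exist precisely to avoid recomputing the V inverse for every one of millions of markers.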

  20. inventory management, VMI, software agents, MDV model

    Directory of Open Access Journals (Sweden)

    Waldemar Wieczerzycki

    2012-03-01

    Full Text Available Background: As is well known, the implementation of logistics management instruments is only possible with the use of the latest information technology. So-called agent technology is one of the most promising solutions in this area. Its essence consists in an entirely new way of distributing software on the computer network platform, in which computers exchange among themselves not only data but also software modules, called agents. The first aim is to propose an alternative method of implementing vendor-managed inventory with the use of intelligent software agents, which are able not only to transfer information but also to make autonomous decisions based on the privileges given to them. The second aim of this research was to propose a new model of a software agent combining high mobility with high intelligence. Methods: After a brief discussion of the nature of agent technology, the most important benefits of using it to build platforms to support business are given. Then the original model of a polymorphic software agent, called the Multi-Dimensionally Versioned Software Agent (MDV), is presented, which is oriented toward the specifics of IT applications in business. The MDV agent is polymorphic, which allows transmitting through the network only the most relevant parts of its code, and only when necessary. Consequently, the network nodes exchange small amounts of software code, which ensures high mobility of software agents and thus highly efficient operation of IT platforms built on the proposed model. Next, the adaptation of MDV software agents to the implementation of a well-known logistics management instrument, VMI (Vendor Managed Inventory), is illustrated.
Results: The key benefits of this approach are identified, among which one can distinguish: reduced costs, higher flexibility and efficiency, new functionality - especially addressed to business negotiation, full automation

  1. Quantitative myocardial-perfusion SPECT: comparison of three state-of-the-art software packages.

    Science.gov (United States)

    Wolak, Arik; Slomka, Piotr J; Fish, Mathews B; Lorenzo, Santiago; Acampa, Wanda; Berman, Daniel S; Germano, Guido

    2008-01-01

    We aimed to compare the automation and diagnostic performance in the detection of coronary artery disease (CAD) of the 4DMSPECT (4DM), Emory Cardiac Toolbox (EMO), and QPS systems for automated quantification of myocardial perfusion. We studied 328 patients referred for rest/stress Tc-99m sestamibi imaging, 140 low-likelihood patients and 188 with angiography. Contours were corrected when necessary. All other processing was fully automated. A 17-segment analysis was performed, and a summed stress score (SSS) > or =4 was considered abnormal. The average SSSs (±SD) for 4DM, EMO, and QPS were 10.5 ± 9.4, 11.1 ± 8.3, and 10.1 ± 8.9, respectively (P = .02 for QPS versus EMO). The receiver operating characteristic areas under the curve for the detection of CAD (±SEM) were 0.84 ± 0.03, 0.76 ± 0.04, and 0.88 ± 0.03 for 4DM, EMO, and QPS, respectively (P = .001 for QPS versus EMO, and P = .03 for 4DM versus EMO). Normalcy rate was higher for QPS and 4DM versus EMO, at 91% and 94% versus 77%, respectively (P = .02). Sensitivity was higher for QPS (87%) versus 4DM (80%) (P = .045). Specificity was higher for QPS (71%) versus EMO (49%) (P = .01). The accuracy rate was higher for QPS versus 4DM and EMO, at 83% versus 77% and 76%, respectively (P = .05). There are differences in myocardial-perfusion quantification, diagnostic performance, and degree of automation of software packages.

  2. Inventory of data bases, graphics packages, and models in Department of Energy laboratories

    International Nuclear Information System (INIS)

    Shriner, C.R.; Peck, L.J.

    1978-11-01

    A central inventory of energy-related environmental bibliographic and numeric data bases, graphics packages, integrated hardware/software systems, and models was established at Oak Ridge National Laboratory in an effort to make these resources at Department of Energy (DOE) laboratories better known and available to researchers and managers. This inventory will also serve to identify and avoid duplication among laboratories. The data were collected at each DOE laboratory, then sent to ORNL and merged into a single file. This document contains the data from the merged file. The data descriptions are organized under major data types: data bases, graphics packages, integrated hardware/software systems, and models. The data include descriptions of subject content, documentation, and contact persons. Also provided are computer data such as media on which the item is available, size of the item, computer on which the item executes, minimum hardware configuration necessary to execute the item, software language(s) and/or data base management system utilized, and character set used. For the models, additional data are provided to define the model more accurately. These data include a general statement of algorithms, computational methods, and theories used by the model; organizations currently using the model; the general application area of the model; sources of data utilized by the model; model validation methods, sensitivity analysis, and procedures; and general model classification. Data in this inventory will be available for on-line data retrieval on the DOE/RECON system

  3. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow for practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  4. Architecture design in global and model-centric software development

    NARCIS (Netherlands)

    Heijstek, Werner

    2012-01-01

    This doctoral dissertation describes a series of empirical investigations into representation, dissemination and coordination of software architecture design in the context of global software development. A particular focus is placed on model-centric and model-driven software development.

  5. Continuous time structural equation modeling with R package ctsem

    NARCIS (Netherlands)

    Driver, C.C.; Oud, J.H.L.; Völkle, M.C.

    2017-01-01

    We introduce ctsem, an R package for continuous time structural equation modeling of panel (N > 1) and time series (N = 1) data, using full information maximum likelihood. Most dynamic models (e.g., cross-lagged panel models) in the social and behavioural sciences are discrete time models. An
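    The key idea behind continuous time modeling is that the discrete-time autoregressive effect is implied by a continuous-time drift parameter and the length of the observation interval. For a univariate process the relation is a(Δt) = exp(drift · Δt). The sketch below illustrates this in Python (ctsem itself is an R package; this is not its API):

```python
import math

def discrete_autoregression(drift, dt):
    """Autoregressive effect implied by a continuous-time drift
    parameter over an observation interval dt: a(dt) = exp(drift * dt)."""
    return math.exp(drift * dt)

# With a negative drift, the implied autoregressive effect decays
# smoothly as the interval grows -- something a discrete-time model
# with a single fixed lag cannot represent.
print(round(discrete_autoregression(-0.5, 1.0), 4))  # 0.6065
print(round(discrete_autoregression(-0.5, 2.0), 4))  # 0.3679
```

In the multivariate case the same relation holds with a matrix exponential of the drift matrix, which is what allows panel data with unequal observation intervals to be modeled consistently.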

  6. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  7. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

    Directory of Open Access Journals (Sweden)

    Михаил Юрьевич Чернышов

    2013-12-01

    Full Text Available A software complex (SC elaborated by the authors on the basis of the language LMPL and representing a software tool intended for synthesis of applied software models and meta-models constructed on the basis of mathematical programming (MP principles is described. LMPL provides for an explicit form of declarative representation of MP-models, presumes automatic constructing and transformation of models and the capability of adding external software packages. The following software versions of the SC have been implemented: (1) a SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling) and (2) a SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

  8. The khmer software package: enabling efficient nucleotide sequence analysis [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Michael R. Crusoe

    2015-09-01

    Full Text Available The khmer package is a freely available software library for working efficiently with fixed length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at https://github.com/dib-lab/khmer/.
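    The basic operation khmer accelerates is counting the overlapping fixed-length words in a sequence. A minimal exact version in Python is shown below for illustration; note that khmer itself uses probabilistic data structures (rather than an exact hash table, as here) to keep memory use low on large datasets:

```python
from collections import Counter

def count_kmers(seq, k):
    """Count all overlapping fixed-length DNA words (k-mers) in seq.
    A sequence of length L contains L - k + 1 overlapping k-mers."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = count_kmers("ATATAT", 3)
print(counts["ATA"], counts["TAT"])  # 2 2
```

Exact counting like this scales poorly in memory, which is why probabilistic counting structures of the kind khmer implements matter for genome-scale data.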

  9. Multi-State Models for Panel Data: The msm Package for R

    Directory of Open Access Journals (Sweden)

    Christopher H. Jackson

    2011-01-01

    Full Text Available Panel data are observations of a continuous-time process at arbitrary times, for example, visits to a hospital to diagnose disease status. Multi-state models for such data are generally based on the Markov assumption. This article reviews the range of Markov models and their extensions which can be fitted to panel-observed data, and their implementation in the msm package for R. Transition intensities may vary between individuals, or with piecewise-constant time-dependent covariates, giving an inhomogeneous Markov model. Hidden Markov models can be used for multi-state processes which are misclassified or observed only through a noisy marker. The package is intended to be straightforward to use, flexible and comprehensively documented. Worked examples are given of the use of msm to model chronic disease progression and screening. Assessment of model fit, and potential future developments of the software, are also discussed.
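    The Markov assumption underlying such models means transition probabilities over an arbitrary interval t come from the matrix exponential of the transition intensity matrix Q, P(t) = exp(Qt). For a two-state model this has a closed form, sketched below in Python for illustration (msm itself is an R package and handles the general case numerically):

```python
import math

def two_state_transition(q12, q21, t):
    """Transition probability matrix P(t) = expm(Q*t) for a two-state
    continuous-time Markov model with intensities q12 (1->2) and
    q21 (2->1), using the closed-form solution."""
    s = q12 + q21
    e = math.exp(-s * t)
    p11 = (q21 + q12 * e) / s
    p22 = (q12 + q21 * e) / s
    return [[p11, 1.0 - p11], [1.0 - p22, p22]]

# Probability a subject in state 1 is still in state 1 after 5 time units.
P = two_state_transition(0.2, 0.1, 5.0)
print(round(P[0][0], 3))  # 0.482
```

This is what lets panel data observed at arbitrary visit times be fitted: each observed interval contributes P(t) evaluated at that interval's length.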

  10. Model Driven Software Development for Agricultural Robotics

    DEFF Research Database (Denmark)

    Larsen, Morten

    ...processing, control engineering, etc. This thesis proposes a Model-Driven Software Development based approach to model, analyse and partially generate the software implementation of an agricultural robot. Furthermore, guidelines for modelling the architecture of an agricultural robot are provided... The design and development of agricultural robots consists of mechanical, electrical and software components. All these components must be designed and combined such that the overall goal of the robot is fulfilled. The design and development of these systems require collaboration between... multiple engineering disciplines. To this end, architectural specifications can serve as means for communication between different engineering disciplines. Such specifications aid in establishing the interface between the different components, belonging to different domains such as image...

  11. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    OpenAIRE

    Dr.A.R.Pon Periyasamy; Mrs A.Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products. The performance should be excellent, without any defects. Software quality metrics are a subset of software metrics that focus on the quality aspects of the product, process, and project. The software defect prediction model helps in early detection of defects and contributes to t...

  12. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  13. CORROSION OF LEAD SHIELDING IN MODEL 9975 PACKAGE

    Energy Technology Data Exchange (ETDEWEB)

    Subramanian, K

    2006-03-15

    Experiments were performed to determine the corrosion rate of lead when exposed to off-gas or degradation products of organic materials used in the model 9975 package.[1] The experiments were completed within the framework of a parametric test matrix with variables of organic configuration, temperature, humidity and the effect of durations of exposure on the corrosion of lead in the 9975 package. The room temperature vulcanizing (RTV) sealant was the most corrosive organic species in the testing, followed by the polyvinyl acetate (PVAc) glue. The Celotex© material uniquely induced measurable corrosion only in situations with condensed water, and to a much lesser extent than the PVAc glue and RTV. The coupons exhibited faster corrosion at higher temperatures than at room temperature. There was a particularly pronounced effect of condensed water, as the coupons exposed in the cells with condensed water exhibited much higher corrosion rates. In the 9975 package, the PVAc glue was determined to be the most aggressive due to its proximity in the design. The condition considered most representative of the package conditions is that of the coupon exposed to the Celotex©/glue organic exposed in the ambient humidity conditions. The corrosion rate of 2 mpy measured in the laboratory experiments for this condition is considered to be a bounding condition to the 9975 package conditions when the laboratory results are extrapolated to actual package conditions, and is recommended as a conservative estimate for package performance calculations.
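    Corrosion rates in US reports are commonly quoted in mpy (mils, i.e. thousandths of an inch, per year). Converting to metric units is a one-line calculation, shown here for readers working in mm/yr (1 mil = 0.0254 mm):

```python
def mpy_to_mm_per_year(mpy):
    """Convert a corrosion rate in mils (thousandths of an inch)
    per year to millimetres per year. 1 mil = 0.0254 mm."""
    return mpy * 0.0254

# The bounding rate of 2 mpy from the experiments above:
print(round(mpy_to_mm_per_year(2.0), 4))  # 0.0508
```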

  14. A Model for Assessing the Liability of Seemingly Correct Software

    Science.gov (United States)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.
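    Hamlet's probable correctness model, as commonly stated, gives the confidence that the per-execution failure probability is below a bound θ after T independent, failure-free tests: C = 1 − (1 − θ)^T. A short sketch (an illustration of the base relation, not the paper's full risk model):

```python
def probable_correctness(theta, t):
    """Confidence that the failure probability per execution is below
    theta, after t independent tests with no observed failures:
    C = 1 - (1 - theta)**t (Hamlet's probable correctness model)."""
    return 1.0 - (1.0 - theta) ** t

# Even ~4600 failure-free tests bound a 1-in-1000 failure rate with
# only ~99% confidence -- illustrating why testing alone struggles to
# certify ultra-reliable, life-critical software.
print(round(probable_correctness(1e-3, 4600), 2))  # 0.99
```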

  15. Memoised Garbage Collection for Software Model Checking

    NARCIS (Netherlands)

    Nguyen, V.Y.; Ruys, T.C.; Kowalewski, S.; Philippou, A.

    Virtual machine based software model checkers like JPF and MoonWalker spend up to half of their verification time on garbage collection. This is no surprise as after nearly each transition the heap has to be cleaned from garbage. To improve this, this paper presents the Memoised Garbage Collection

  16. Documentation package for the RFID temperature monitoring system (of Model 9977 packages at NTS).

    Energy Technology Data Exchange (ETDEWEB)

    Chen, K.; Tsai, H.; Decision and Information Sciences

    2009-02-20

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval to a maximum of five years is based on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory to develop a Radio-Frequency Identification (RFID) temperature monitoring system for use by the facility personnel at DAF/NTS. The RFID temperature monitoring system consists of the Mk-1 RFID tags, a reader, and a control computer mounted on a mobile platform that can operate as a stand-alone system, or it can be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and be certified for operation of the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS. The documentation package will be used for training and certification. The table of contents is: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The

  17. PDBStat: a universal restraint converter and restraint analysis software package for protein NMR

    International Nuclear Information System (INIS)

    Tejero, Roberto; Snyder, David; Mao, Binchen; Aramini, James M.; Montelione, Gaetano T.

    2013-01-01

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data

  18. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or for handling case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in
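    The core Monte Carlo propagation loop is simple to state: sample the uncertain inputs from their probability distributions, run the deterministic model on each sample, and summarise the spread of the outputs. A minimal Python sketch of that loop follows (spup itself is an R package and adds stratified and Latin hypercube sampling, spatial correlation, and visualization; the model and distribution below are hypothetical):

```python
import random
import statistics

def propagate(model, sample_input, n=10_000, seed=42):
    """Monte Carlo uncertainty propagation: draw n realisations of
    the uncertain input, run the deterministic model on each, and
    summarise the output distribution by its mean and spread."""
    random.seed(seed)
    outputs = [model(sample_input()) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical example: runoff as a nonlinear function of rainfall,
# with rainfall uncertainty described by a normal distribution.
mean, sd = propagate(lambda r: 0.8 * r ** 1.2,
                     lambda: random.gauss(50.0, 5.0))
print(mean > 0 and sd > 0)  # True
```

Because the model is nonlinear, the output mean differs from the model evaluated at the input mean, which is exactly the kind of effect MC propagation captures and simple error formulas miss.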

  19. wrv: An R Package for Groundwater Flow Model Construction, Wood River Valley Aquifer System, Idaho

    Science.gov (United States)

    Fisher, J. C.

    2014-12-01

    Groundwater models are one of the main tools used in the hydrogeological sciences to assess resources and to simulate possible effects from future water demands and changes in climate. The hydrological inputs to groundwater models can be numerous and can vary in both time and space. Difficulties associated with model construction are often related to extensive datasets and cumbersome data processing tasks. To mitigate these difficulties, a graphical user interface (GUI) is often employed to aid the input of data for creating models. Unfortunately, GUI software presents an obstacle to reproducibility, a cornerstone of research. The considerable effort required to document processing steps in a GUI program, and the rapid obsoleteness of these steps with subsequent versions of the software, has prompted modelers to explicitly write down processing steps as source code to make them 'easily' reproducible. This research describes the R package wrv, a collection of datasets and functions for pre- and post-processing the numerical groundwater flow model of the Wood River Valley aquifer system, south-central Idaho. R largely facilitates reproducible modeling with the package vignette; a document that is a combination of content and source code. The code is run when the vignette is built, and all data analysis output (such as figures and tables) is created on the fly and inserted into the final document. The wrv package includes two vignettes that explain and run steps that (1) create package datasets from raw data files located on a publicly accessible repository, and (2) create and run the groundwater flow model. MODFLOW-USG, the numerical groundwater model used in this study, is executed from the vignette, and model output is returned for exploratory analyses. The ability of R to perform all processing steps in a single workflow is attributed to its comprehensive list of features; that include geographic information system and time series functionality.

  20. Mass Transfer Model for a Breached Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    C. Hsu; J. McClure

    2004-07-26

    The degradation of waste packages, which are used for the disposal of spent nuclear fuel in the repository, can result in configurations that may increase the probability of criticality. A mass transfer model is developed for a breached waste package to account for the entrainment of insoluble particles. In combination with radionuclide decay, soluble advection, and colloidal transport, a complete mass balance of nuclides in the waste package becomes available. The entrainment equations are derived from dimensionless parameters such as drag coefficient and Reynolds number and based on the assumption that insoluble particles are subjected to buoyant force, gravitational force, and drag force only. Particle size distributions are utilized to calculate entrainment concentration along with geochemistry model abstraction to calculate soluble concentration, and colloid model abstraction to calculate colloid concentration and radionuclide sorption. Results are compared with base case geochemistry model, which only considers soluble advection loss.
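    The force balance named above (buoyancy, gravity, and drag) determines whether a particle settles or is entrained by upward flow. In the low-Reynolds-number limit the balance reduces to Stokes' law for the terminal settling velocity; the sketch below shows that limiting case only, not the report's general drag-coefficient treatment, and the particle and fluid properties are illustrative values:

```python
def stokes_settling_velocity(radius, rho_p, rho_f, mu, g=9.81):
    """Terminal velocity from balancing gravity, buoyancy and Stokes
    drag on a sphere (valid at low Reynolds number):
        v = 2 r^2 (rho_p - rho_f) g / (9 mu)."""
    return 2.0 * radius ** 2 * (rho_p - rho_f) * g / (9.0 * mu)

# A 1-micron mineral particle in water: entrained whenever the upward
# fluid velocity exceeds this settling velocity.
v = stokes_settling_velocity(1e-6, 2650.0, 1000.0, 1e-3)
print(v > 0)  # True
```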

  1. Mass Transfer Model for a Breached Waste Package

    International Nuclear Information System (INIS)

    Hsu, C.; McClure, J.

    2004-01-01

    The degradation of waste packages, which are used for the disposal of spent nuclear fuel in the repository, can result in configurations that may increase the probability of criticality. A mass transfer model is developed for a breached waste package to account for the entrainment of insoluble particles. In combination with radionuclide decay, soluble advection, and colloidal transport, a complete mass balance of nuclides in the waste package becomes available. The entrainment equations are derived from dimensionless parameters such as drag coefficient and Reynolds number and based on the assumption that insoluble particles are subjected to buoyant force, gravitational force, and drag force only. Particle size distributions are utilized to calculate entrainment concentration along with geochemistry model abstraction to calculate soluble concentration, and colloid model abstraction to calculate colloid concentration and radionuclide sorption. Results are compared with base case geochemistry model, which only considers soluble advection loss

  2. Parallelization of an existing high energy physics event reconstruction software package

    International Nuclear Information System (INIS)

    Schiefer, R.; Francis, D.

    1996-01-01

    Software parallelization allows an efficient use of available computing power to increase the performance of applications. In a case study the authors have investigated the parallelization of high energy physics event reconstruction software in terms of costs (effort, computing resource requirements), benefits (performance increase) and the feasibility of a systematic parallelization approach. Guidelines facilitating a parallel implementation are proposed for future software development

  3. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Science.gov (United States)

    Swales, Dustin J.; Pincus, Robert; Bodas-Salcedo, Alejandro

    2018-01-01

    The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP) gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  4. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Directory of Open Access Journals (Sweden)

    D. J. Swales

    2018-01-01

    Full Text Available The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  5. A Descriptive Evaluation of Software Sizing Models

    Science.gov (United States)

    1987-09-01

    ...compensate for a lack of understanding of a software job to be done. 1.3 REPORT OUTLINE: The guiding principle for model selection for this paper was...

    MODEL SIZE ESTIMATES FOR THE CAiSS SENSITIVITY MODEL
    MODEL      SLOC
    ESD        37,600+
    SPQR       35,910
    BYL        22,402
    PRICE SZ   21,410
    ASSET-R    11,943
    SSM        11,700

    ...Date: defaults to the current date; required to perform an estimate. Quantitative inputs: each of the nine quantitative...

  6. Free tools to model software (Herramientas libres para modelar software)

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    Full Text Available An observation on free software and its role in software development processes with 4G tools, carried out by organizations or individuals without astronomical capital and without the monopolistic mindset of dominating the market with costly products that make their vendors multimillionaires yet offer no real guarantee, nor even the possibility of knowing the software that has been paid for, much less of modifying it if it does not meet our expectations.

  7. Quantitative comparison and evaluation of two commercially available, two-dimensional electrophoresis image analysis software packages, Z3 and Melanie.

    Science.gov (United States)

    Raman, Babu; Cheung, Agnes; Marten, Mark R

    2002-07-01

    While a variety of software packages are available for analyzing two-dimensional electrophoresis (2-DE) gel images, no comparisons between these packages have been published, making it difficult for end users to determine which package would best meet their needs. The goal here was to develop a set of tests to quantitatively evaluate and then compare two software packages, Melanie 3.0 and Z3, in three of the fundamental steps involved in 2-DE image analysis: (i) spot detection, (ii) gel matching, and (iii) spot quantitation. To test spot detection capability, automatically detected protein spots were compared to manually counted, "real" protein spots. Spot matching efficiency was determined by comparing distorted (both geometrically and nongeometrically) gel images with undistorted original images, and quantitation tests were performed on artificial gels with spots of varying Gaussian volumes. In spot detection tests, Z3 performed better than Melanie 3.0 and required minimal user intervention to detect approximately 89% of the actual protein spots and relatively few extraneous spots. Results from gel matching tests depended on the type of image distortion used. For geometric distortions, Z3 performed better than Melanie 3.0, matching 99% of the spots, even for extreme distortions. For nongeometrical distortions, both Z3 and Melanie 3.0 required user intervention and performed comparably, matching 95% of the spots. In spot quantitation tests, both Z3 and Melanie 3.0 predicted spot volumes relatively well for spot ratios less than 1:6. For higher ratios, Melanie 3.0 did much better. In summary, results suggest Z3 requires less user intervention than Melanie 3.0, thus simplifying differential comparison of 2-DE gel images. Melanie 3.0, however, offers many more optional tools for image editing, spot detection, data reporting and statistical analysis than Z3. All image files used for these tests and updated information on the software are available on the internet
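    The spot detection test described above amounts to matching automatically detected spot centres against manually counted "real" spots and scoring the agreement. A small sketch of such a scoring function (the tolerance and coordinates are illustrative, not taken from the paper):

```python
def detection_scores(detected, real, tol=2.0):
    """Match detected spot centres to manually counted 'real' spots
    within a pixel tolerance, and report precision (fraction of
    detections that are real) and recall (fraction of real spots found)."""
    matched, hits = set(), 0
    for dx, dy in detected:
        for i, (rx, ry) in enumerate(real):
            if i not in matched and abs(dx - rx) <= tol and abs(dy - ry) <= tol:
                matched.add(i)
                hits += 1
                break
    precision = hits / len(detected) if detected else 0.0
    recall = hits / len(real) if real else 0.0
    return precision, recall

real = [(10, 10), (20, 20), (30, 30)]        # manually counted spots
detected = [(10, 11), (21, 20), (50, 50)]    # one spurious detection
p, r = detection_scores(detected, real)
print(round(p, 2), round(r, 2))  # 0.67 0.67
```

A detector like Z3's, which found ~89% of real spots with few extraneous ones, would score high on both recall and precision under this kind of test.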

  8. A new comprehensive model and simulation package for fluidized bed spray granulation processes

    Energy Technology Data Exchange (ETDEWEB)

    Ihlow, M.; Drechsler, J.; Peglow, M.; Henneberg, M. [AVA GbRmbH, Steinfeldstrasse 5, D-39179 Barleben-Magdeburg (Germany); Moerl, L. [Universitaet Magdeburg, Institut fuer Apparate- und Umwelttechnik, Postfach 4120, D-39016 Magdeburg (Germany)

    2004-11-01

    The model introduced in this paper makes it possible to calculate the expected product and bed material particle size distribution, as well as the dynamic behavior of process state variables. It was integrated in the developed software package FBSim and it could be shown that even a complex process like the fluidized bed spray granulation can be simulated in its dynamic behavior, if existing models are extended and coupled. The implemented model achieves a coupled solution of the population balance and heat and mass transfer equations. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
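    At the heart of such a simulation is the population balance for the particle size distribution. For pure, size-independent growth it reduces to an advection equation in particle size, which can be discretized with an explicit upwind scheme. The sketch below shows only that single building block (the coupling to heat and mass transfer in FBSim is not reproduced; all values are illustrative):

```python
def growth_step(n, G, dx, dt):
    """One explicit upwind step of the population balance for pure
    size-independent growth, dn/dt = -G * dn/dx, over size classes of
    width dx. Particle number is conserved except for outflow from
    the largest class."""
    # Number flux crossing each class boundary (zero inflow at the smallest size).
    flux = [0.0] + [G * ni for ni in n]
    return [ni + dt / dx * (flux[i] - flux[i + 1]) for i, ni in enumerate(n)]

# Start with all particles in the smallest size class and let them grow.
n = [100.0, 0.0, 0.0, 0.0]
for _ in range(3):
    n = growth_step(n, G=0.1, dx=1.0, dt=1.0)
print(round(sum(n), 6))  # 100.0  (number conserved while mass stays in the grid)
```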

  9. Artificial Intelligence Software Engineering (AISE) model

    Science.gov (United States)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  10. PmagPy: Software package for paleomagnetic data analysis and a bridge to the Magnetics Information Consortium (MagIC) Database

    Science.gov (United States)

    Tauxe, L.; Shaar, R.; Jonestrask, L.; Swanson-Hysell, N. L.; Minnett, R.; Koppers, A. A. P.; Constable, C. G.; Jarboe, N.; Gaastra, K.; Fairchild, L.

    2016-06-01

    The Magnetics Information Consortium (MagIC) database provides an archive with a flexible data model for paleomagnetic and rock magnetic data. The PmagPy software package is a cross-platform and open-source set of tools written in Python for the analysis of paleomagnetic data that serves as one interface to MagIC, accommodating various levels of user expertise. PmagPy facilitates thorough documentation of sampling, measurements, data sets, visualization, and interpretation of paleomagnetic and rock magnetic experimental data. Although not the only route into the MagIC database, PmagPy makes preparation of newly published data sets for contribution to MagIC as a byproduct of normal data analysis and allows manipulation as well as reanalysis of data sets downloaded from MagIC with a single software package. The graphical user interface (GUI), Pmag GUI enables use of much of PmagPy's functionality, but the full capabilities of PmagPy extend well beyond that. Over 400 programs and functions can be called from the command line interface mode, or from within the interactive Jupyter notebooks. Use of PmagPy within a notebook allows for documentation of the workflow from the laboratory to the production of each published figure or data table, making research results fully reproducible. The PmagPy design and its development using GitHub accommodates extensions to its capabilities through development of new tools by the user community. Here we describe the PmagPy software package and illustrate the power of data discovery and reuse through a reanalysis of published paleointensity data which illustrates how the effectiveness of selection criteria can be tested.
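    A representative calculation from this domain is the Fisher (1953) mean of a set of paleomagnetic directions: convert each (declination, inclination) pair to a unit vector, sum the vectors, and take the direction and length of the resultant. The sketch below shows the standard statistics directly rather than PmagPy's own function signatures; the directions are made-up test data:

```python
import math

def fisher_mean(directions):
    """Fisher mean of directions given as (declination, inclination)
    pairs in degrees. Returns mean declination, mean inclination, and
    the precision parameter kappa ~ (N - 1) / (N - R)."""
    xs = ys = zs = 0.0
    for dec, inc in directions:
        d, i = math.radians(dec), math.radians(inc)
        xs += math.cos(i) * math.cos(d)
        ys += math.cos(i) * math.sin(d)
        zs += math.sin(i)
    r = math.sqrt(xs * xs + ys * ys + zs * zs)  # resultant length R
    n = len(directions)
    mean_dec = math.degrees(math.atan2(ys, xs)) % 360.0
    mean_inc = math.degrees(math.asin(zs / r))
    kappa = (n - 1) / (n - r) if n > r else float("inf")
    return mean_dec, mean_inc, kappa

dec, inc, kappa = fisher_mean([(358, 40), (2, 42), (0, 38)])
print(round(dec) % 360, round(inc))  # 0 40
```

Tightly clustered directions give a resultant length R close to N and hence a large kappa, which is how clustering quality is quantified.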

  11. Development and Evaluation of an Open-Source Software Package “CGITA” for Quantifying Tumor Heterogeneity with Molecular Images

    Directory of Open Access Journals (Sweden)

    Yu-Hua Dean Fang

    2014-01-01

    Full Text Available Background. The quantification of tumor heterogeneity with molecular images, by analyzing the local or global variation in the spatial arrangement of pixel intensities with texture analysis, holds great clinical potential for treatment planning and prognosis. To address the lack of software available in the public domain for computing tumor heterogeneity, we developed a software package, namely the Chang-Gung Image Texture Analysis (CGITA) toolbox, and provide it to the research community as a free, open-source project. Methods. With a user-friendly graphical interface, CGITA provides users with an easy way to compute more than seventy heterogeneity indices. To test and demonstrate the usefulness of CGITA, we used a small cohort of eighteen locally advanced oral cavity (ORC) cancer patients treated with definitive radiotherapy. Results. In our case study of ORC data, we found that more than ten of the currently implemented heterogeneity indices outperformed SUVmean for outcome prediction in the ROC analysis, with a higher area under the curve (AUC). Heterogeneity indices provide an area under the curve of up to 0.9, compared with 0.6 and 0.52 for SUVmean and TLG, respectively. Conclusions. CGITA is a free and open-source software package to quantify tumor heterogeneity from molecular images. CGITA is available free for academic use at http://code.google.com/p/cgita.
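    As an illustration of the kind of index CGITA computes, here is a gray-level co-occurrence entropy, one of the classic texture measures; the function and the toy quantized images are a hypothetical sketch, not CGITA code:

```python
import math
from collections import Counter

def glcm_entropy(img):
    """Entropy of the gray-level co-occurrence matrix for horizontal
    neighbour pairs. `img` is a 2-D list of quantized gray levels -- a
    crude stand-in for a quantized PET uptake map inside a tumor contour."""
    pairs = Counter()
    for row in img:
        for a, b in zip(row, row[1:]):
            pairs[(a, b)] += 1
    total = sum(pairs.values())
    # Shannon entropy of the normalized co-occurrence probabilities:
    # homogeneous tumors score low, heterogeneous ones score high.
    return -sum((c / total) * math.log2(c / total) for c in pairs.values())
```

A perfectly uniform region yields entropy 0; a checkerboard of two levels yields higher entropy, which is the sense in which such indices quantify heterogeneity.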

  12. MODEL 9975 SHIPPING PACKAGE FABRICATION PROBLEMS AND SOLUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    May, C; Allen Smith, A

    2008-05-07

    The Model 9975 Shipping Package is the latest in a series (9965, 9968, etc.) of radioactive material shipping packages that have been the mainstay for shipping radioactive materials for several years. The double containment vessels are relatively simple designs using pipe and pipe cap in conjunction with the Chalfont closure to provide a leak-tight vessel. The fabrication appears simple in nature, but the history of fabrication tells us there are pitfalls in the different fabrication methods and sequences. This paper will review the problems that have arisen during fabrication and precautions that should be taken to meet specifications and tolerances. The problems and precautions can also be applied to the Models 9977 and 9978 Shipping Packages.

  13. SOFTWARE DEVELOPMENT MODEL FOR ETHNOBILINGUAL DICTIONARIES

    Directory of Open Access Journals (Sweden)

    Melchora Morales-Sánchez

    2010-09-01

    Full Text Available An integral software-development model for a dictionary that stores and retrieves textual and visual material and, most importantly, incorporates audio of the oral language, taking into account both the characterization of the indigenous cultural reality and the technical aspects of software construction. The model consists of the following phases: context description, lexicographic design, computer and multimedia design, and construction and testing of the application. There is no doubt that the contact of the Spanish language with the variety of languages spoken throughout Latin America has produced the most diverse and extensive forms of communication, and that, within their communities, speakers are interested in preserving their mother tongue so that people identify with their own roots and transmit this legacy to the next generations. The model is designed for developing dictionary software under constraints typical of the indigenous reality: a low budget, operation on computers with limited resources, and human resources with minimal training. It is exemplified with the development of a dictionary of Spanish and the Chatino spoken in the town of Santos Reyes Nopala, Oaxaca, in the coastal region of Mexico.

  14. SIMSAS - a window based software package for simulation and analysis of multiple small-angle scattering data

    International Nuclear Information System (INIS)

    Jayaswal, B.; Mazumder, S.

    1998-09-01

    Small-angle scattering data from strongly scattering systems, e.g. porous materials, cannot be analysed by invoking the single-scattering approximation, as specimens needed to replicate the bulk matrix in its essential properties are too thick for the approximation to be valid. The presence of multiple scattering is indicated by the breakdown of the functional invariance of the observed scattering profile under variation of sample thickness and/or wavelength of the probing radiation. This article delineates how failing to account for multiple scattering affects the results of analysis, and how to correct the data for its effect. It presents an algorithm to extract the single-scattering profile from small-angle scattering data affected by multiple scattering. The algorithm can process the scattering data and deduce the single-scattering profile on an absolute scale. A software package, SIMSAS, is introduced for executing this inversion step. This package is useful both for simulating and for analysing multiple small-angle scattering data. (author)

  15. A Kinetic Model for Predicting the Relative Humidity in Modified Atmosphere Packaging and Its Application in Lentinula edodes Packages

    Directory of Open Access Journals (Sweden)

    Li-xin Lu

    2013-01-01

    Full Text Available Adjusting and controlling the relative humidity (RH) inside a package is crucial for ensuring the quality of modified atmosphere packaging (MAP) of fresh produce. In this paper, an improved kinetic model for predicting the RH in MAP was developed. The model was based on heat exchange and gas mass transport across the package, heat convection of gases inside the package, and mass and heat balances accounting for the respiration and transpiration behavior of the fresh produce. The model was then applied to predict the RH in MAP of fresh Lentinula edodes (a type of Chinese mushroom). The model equations were solved numerically using the Adams-Moulton method to predict the RH in model packages. In general, the model predictions agreed well with the experimental data, except that the predictions were slightly high in the initial period. The effect of the initial gas composition on the RH in the packages was notable: in MAP with lower oxygen and higher carbon dioxide concentrations, the RH rose more slowly and saturated later during storage. The influence of the initial gas composition on the temperature inside the package was less notable.
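    The numerical scheme named in the abstract can be illustrated on a deliberately simplified humidity balance; the first-order model dRH/dt = k(RH_sat − RH) and all constants below are assumptions for the sketch, not the paper's coupled heat/mass model:

```python
def simulate_rh(rh0, rh_sat, k, dt, steps):
    """Integrate dRH/dt = k * (RH_sat - RH) with an explicit-Euler
    predictor and a one-step Adams-Moulton (trapezoidal) corrector.
    The single first-order balance and the constants used in the test
    are illustrative stand-ins for the paper's coupled model."""
    def f(rh):
        return k * (rh_sat - rh)
    history = [rh0]
    rh = rh0
    for _ in range(steps):
        pred = rh + dt * f(rh)                   # predictor (explicit Euler)
        rh = rh + dt / 2.0 * (f(rh) + f(pred))   # Adams-Moulton corrector
        history.append(rh)
    return history
```

With a smaller k (slower transpiration relative to transport), the RH curve saturates later, which is the qualitative behaviour the abstract reports for low-O2/high-CO2 atmospheres.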

  16. Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V&V guideline packages and procedures. Volume 5

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating Guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, to form three Classes). A V&V Guideline package is provided for each combination of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the Guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The Guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.
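    The three-factor selection structure described above can be sketched as a simple lookup; the factor names and values below are a hypothetical encoding, with the actual guideline package contents residing in Volume 7:

```python
# Hypothetical encoding of the report's three selection factors; the real
# guideline package contents are in Volume 7 ("User's Manual").
STAGES = ("requirements", "design", "implementation")
COMPONENTS = ("knowledge_base", "inference_engine", "conventional")
INTEGRITY_CLASSES = (1, 2, 3)  # stringency classes from complexity/integrity

def guideline_key(stage, component, integrity_class):
    """Validate one combination of the three factors and return the key
    under which its V&V Guideline package would be filed (27 in all)."""
    if stage not in STAGES:
        raise ValueError("unknown life-cycle stage: %r" % (stage,))
    if component not in COMPONENTS:
        raise ValueError("unknown component type: %r" % (component,))
    if integrity_class not in INTEGRITY_CLASSES:
        raise ValueError("unknown integrity class: %r" % (integrity_class,))
    return (stage, component, integrity_class)
```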

  17. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety-Critical Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2006-01-01

    As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For estimating the reliability of safety-critical software (the software used in safety-critical digital systems), Bayesian Belief Networks (BBNs) appear to be the most widely used approach. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, the reliability of the software can be estimated directly using various software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally held that software reliability growth models cannot be applied to safety-critical software, owing to the small number of failures expected from the testing of such software, we try to find the possibilities, and the corresponding limitations, of applying software reliability growth models to safety-critical software.
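    The Goel-Okumoto model mentioned above has a closed-form mean value function, which a minimal sketch can evaluate (parameters below are illustrative, not tied to any particular safety-critical system):

```python
import math

def goel_okumoto_mean(t, a, b):
    """Expected cumulative number of failures by testing time t under the
    Goel-Okumoto NHPP model: m(t) = a * (1 - exp(-b * t)), where a is the
    expected total number of faults and b the per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def residual_faults(t, a, b):
    """Expected faults remaining after testing for time t. For
    safety-critical software the few observed failures make a and b hard
    to estimate, which is the limitation the paper examines."""
    return a - goel_okumoto_mean(t, a, b)
```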

  18. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    Over the past three years, the Idaho National Engineering Laboratory (INEL) has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.
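    Cutset handling of the kind described in item (3) can be illustrated with the standard minimal-cutset upper bound; this is a hypothetical sketch of the quantification idea, not SAPHIRE code:

```python
def cutset_probability(cutsets, basic_events):
    """Minimal cutset upper bound: P ~ 1 - prod(1 - P(C_i)), where P(C_i)
    is the product of the basic-event probabilities in cutset C_i
    (basic events assumed independent). `cutsets` is a list of tuples of
    basic-event names; `basic_events` maps names to probabilities."""
    prod = 1.0
    for cs in cutsets:
        p_cs = 1.0
        for ev in cs:
            p_cs *= basic_events[ev]
        prod *= 1.0 - p_cs
    return 1.0 - prod
```

Because all cutsets are retained rather than truncated, a tool built this way can requantify a sequence after any logic or data change, which is the flexibility item (4) describes.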

  19. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model-based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model-based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronic systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow; software code is then generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, and different interconnection and execution disciplines for event-based and time-based controllers, to meet the demands for more functionality at ever lower prices and under tighter constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based framework, developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing so is to integrate the model, via wrapper files, back into Simulink S-functions and use Simulink's extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system back into Simulink S-functions.

  20. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model-exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the types of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.
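    The equation-based (residual) formulation at the heart of such tools can be illustrated with a toy implicit-Euler step solved by Newton iteration; this is a sketch of the idea, not DAE Tools' API:

```python
def solve_residual_step(residual, x_prev, dt, tol=1e-10, max_iter=50):
    """One implicit-Euler step for a model written in residual form
    F(x, dx/dt) = 0, solved by Newton's method with a numerical
    derivative -- a toy stand-in for what a DAE solver does."""
    x = x_prev  # previous state as the initial Newton guess
    for _ in range(max_iter):
        f = residual(x, (x - x_prev) / dt)
        if abs(f) < tol:
            break
        h = 1e-7  # forward-difference step for dF/dx
        dfdx = (residual(x + h, (x + h - x_prev) / dt) - f) / h
        x -= f / dfdx
    return x
```

For example, a first-order decay written as the residual F = dx/dt + k·x steps exactly to x_prev / (1 + k·dt), matching implicit Euler; in an equation-based tool the user writes only the residual, and the solver does the rest.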

  1. Agile Maturity Model (AMM): A Software Process Improvement framework for Agile Software Development Practices

    OpenAIRE

    Chetankumar Patel; Muthu Ramachandran

    2009-01-01

    Agile software development methodologies have introduced best practices into software development. However, we need to adopt and monitor those practices continuously to maximize their benefits. Our research has focused on adaptability, suitability, and a software maturity model, the Agile Maturity Model (AMM), for agile software development environments. This paper introduces a process of adaptability assessment, suitability assessment, and an improvement framework for assessing and improving agile b...

  2. Development and use of the computer software package for planning the 12 GHz broadcasting-satellite service at RARC '83

    Science.gov (United States)

    Bowen, R. R.; Brown, K. E.; Hothi, H. S.; Miller, E. F.

    1985-01-01

    The main objective of the 1983 Regional Administrative Radio Conference (RARC '83) was to draw up a plan of detailed frequency assignments and orbital positions for the 12 GHz broadcasting-satellite service (BSS) in ITU Region 2 (the Western Hemisphere) and for the associated feeder links (earth-to-space) in the 17 GHz band. It was found that RARC '83 would require new planning methods and procedures, and these requirements made it necessary to develop a new generation of planning software. Attention is given to the development of the computer programs used at the conference, the package of computer programs itself, and the use of the programs.

  3. Evaluation of three state-of-the-art metabolite prediction software packages (Meteor, MetaSite, and StarDrop) through independent and synergistic use.

    Science.gov (United States)

    T'jollyn, H; Boussery, K; Mortishire-Smith, R J; Coe, K; De Boeck, B; Van Bocxlaer, J F; Mannens, G

    2011-11-01

    The aim of this study was to evaluate three different metabolite prediction software packages (Meteor, MetaSite, and StarDrop) with respect to their ability to predict loci of metabolism and suggest relative proportions of metabolites. A chemically diverse test set of 22 compounds, for which in vivo human mass balance studies and metabolic schemes were available, was used as basis for the evaluation. Each software package was provided with structures of the parent compounds, and predicted metabolites were compared with experimentally determined human metabolites. The evaluation consisted of two parts. First, different settings within each software package were investigated and the software was evaluated using those settings determined to give the best prediction. Second, the three different packages were combined using the optimized settings to see whether a synergistic effect concerning the overall metabolism prediction could be established. The performance of the software was scored for both sensitivity and precision, taking into account the capabilities/limitations of the particular software. Varying results were obtained for the individual packages. Meteor showed a general tendency toward overprediction, and this led to a relatively low precision (∼35%) but high sensitivity (∼70%). MetaSite and StarDrop both exhibited a sensitivity and precision of ∼50%. By combining predictions obtained with the different packages, we found that increased precision can be obtained. We conclude that the state-of-the-art individual metabolite prediction software has many advantageous features but needs refinement to obtain acceptable prediction profiles. Synergistic use of different software packages could prove useful.
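    The sensitivity/precision scoring and the synergy experiment described above can be sketched with plain set arithmetic (the metabolite identifiers are hypothetical):

```python
from collections import Counter

def score_prediction(predicted, observed):
    """Sensitivity = fraction of observed metabolites that were predicted;
    precision = fraction of predictions matching an observed metabolite."""
    predicted, observed = set(predicted), set(observed)
    tp = len(predicted & observed)
    sensitivity = tp / len(observed)
    precision = tp / len(predicted) if predicted else 0.0
    return sensitivity, precision

def consensus(*predictions):
    """Keep metabolites predicted by at least two packages -- one simple
    way of combining tools to trade sensitivity for precision."""
    votes = Counter(m for p in predictions for m in set(p))
    return {m for m, v in votes.items() if v >= 2}
```

An over-predicting tool (like Meteor in the study) scores high sensitivity but low precision; requiring agreement between tools prunes the spurious predictions, which is the synergistic effect the authors report.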

  4. Linear latent variable models: the lava-package

    DEFF Research Database (Denmark)

    Holst, Klaus Kähler; Budtz-Jørgensen, Esben

    2013-01-01

    An R package for specifying and estimating linear latent variable models is presented. The philosophy of the implementation is to separate the model specification from the actual data, which leads to a dynamic and easy way of modeling complex hierarchical structures. Several advanced features are implemented, including robust standard errors for clustered correlated data, multigroup analyses, non-linear parameter constraints, inference with incomplete data, maximum likelihood estimation with censored and binary observations, and instrumental variable estimators. In addition an extensive simulation

  5. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images.

    Science.gov (United States)

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane

    2017-08-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and length of the upper airway using the Amira® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys® (3diemme, Cantu, Italy) and OnDemand3D® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements was determined using intraclass correlation coefficients and Bland & Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to be used as the "gold standard". This phantom was subsequently scanned using a NewTom 5G® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira®, 3Diagnosys®, and OnDemand3D®, and compared these measurements with the gold standard. The intra- and inter-observer reliability of the measurements of the upper airway using the different software packages was excellent (intraclass correlation coefficient ≥0.75). There was excellent agreement between all three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway. The length measurements of the upper airway were the most accurate results in all software packages.
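    The Bland & Altman agreement statistics used in the study reduce to a few lines; a generic sketch with illustrative data, not the study's measurements:

```python
import statistics

def bland_altman(a, b):
    """Bland & Altman agreement between two paired measurement series:
    mean bias and 95% limits of agreement (bias +/- 1.96 SD of the
    pairwise differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A systematic underestimation, like the volume bias reported above, shows up as a non-zero mean bias even when the limits of agreement are narrow.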

  6. The arison data acquisition and elaboration software package running on Hewlett-Packard minicomputer

    International Nuclear Information System (INIS)

    Diamantidis, Z.

    1987-01-01

    In this article, a data acquisition and elaboration system is described, consisting of a PCM data acquisition system and a spectrum analyser, together with their data elaboration package, for reactor safety and other general purposes. It handles measurements of temperature, noise fluctuations of temperature and other noise-analysis dynamics, pressure, etc. Acquisition of time series, their conversion into engineering units, and their statistical and frequency analysis are provided.

  7. The R Package bgmm : Mixture Modeling with Uncertain Knowledge

    Directory of Open Access Journals (Sweden)

    Przemysław Biecek

    2012-04-01

    Full Text Available Classical supervised learning enjoys the luxury of accessing the true known labels for the observations in a modeled dataset. Real life, however, poses an abundance of problems where the labels are only partially defined, i.e., are uncertain and given only for a subset of observations. Such partial labels can occur regardless of the knowledge source. For example, an experimental assessment of labels may have limited capacity and be prone to measurement errors. Likewise, expert knowledge is often restricted to a specialized area and is thus unlikely to provide trustworthy labels for all observations in the dataset. Partially supervised mixture modeling is able to process such sparse and imprecise input. Here, we present an R package called bgmm, which implements two partially supervised mixture modeling methods: soft-label and belief-based modeling. For completeness, we have also equipped the package with the functionality of unsupervised, semi-supervised and fully supervised mixture modeling. On real data we present the usage of bgmm for basic model fitting in all modeling variants. The package can also be applied to selecting the best-fitting model from a set of models with different component numbers or constraints on their structures. This functionality is presented on an artificial dataset, which can be simulated in bgmm from a distribution defined by a given model.
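    The essence of belief-based partial supervision is that per-observation beliefs reweight the mixture priors in the E-step; here is a hypothetical one-dimensional sketch of that idea (not bgmm's implementation, which is in R):

```python
import math

def soft_label_responsibilities(x, means, sds, weights, belief=None):
    """E-step responsibilities for one observation of a 1-D Gaussian
    mixture. If `belief` (prior component probabilities for this single
    observation) is given, it reweights the mixture weights -- the core
    idea of belief-based partial supervision."""
    def phi(x, m, s):  # normal density
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    prior = weights if belief is None else [w * b for w, b in zip(weights, belief)]
    unnorm = [p * phi(x, m, s) for p, m, s in zip(prior, means, sds)]
    total = sum(unnorm)
    return [u / total for u in unnorm]
```

An unlabeled observation (belief=None) reduces to the ordinary unsupervised E-step, so labeled and unlabeled data mix naturally in one fit.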

  8. Verification and validation of the SAPHIRE Version 4.0 PRA software package

    International Nuclear Information System (INIS)

    Bolander, T.W.; Calley, M.B.; Capps, E.L.

    1994-02-01

    A verification and validation (V ampersand V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE). SAPHIRE is a set of four computer programs that the Nuclear Regulatory Commission (NRC) developed to perform probabilistic risk assessments (PRAs). These programs allow an analyst to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs included in this set are Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models and Results Database (MAR-D), and Fault Tree/Event Tree/Piping and Instrumentation Diagram (FEP) graphical editor. The V ampersand V steps included a V ampersand V plan to describe the process and criteria by which the V ampersand V would be performed; a software requirements documentation review to determine the correctness, completeness, and traceability of the requirements; a user survey to determine the usefulness of the user documentation, identification and testing of vital and non-vital features, and documentation of the test results

  9. The consequences of a new software package for the quantification of gated-SPECT myocardial perfusion studies

    International Nuclear Information System (INIS)

    Veen, Berlinda J. van der; Dibbets-Schneider, Petra; Stokkel, Marcel P.M.; Scholte, Arthur J.

    2010-01-01

    Semiquantitative analysis of myocardial perfusion scintigraphy (MPS) has reduced inter- and intraobserver variability, and enables researchers to compare parameters in the same patient over time, or between groups of patients. There are several software packages available that are designed to process MPS data and quantify parameters. In this study, the performance of two systems, quantitative gated SPECT (QGS) and 4D-MSPECT, was compared in the processing of clinical patient data and phantom data. The clinical MPS data of 148 consecutive patients were analysed using QGS and 4D-MSPECT to determine the end-diastolic volume, end-systolic volume and left ventricular ejection fraction. Patients were divided into groups based on gender, body mass index, heart size, stressor type and defect type. The AGATE dynamic heart phantom was used to provide reference values for the left ventricular ejection fraction. Although the correlations were excellent (correlation coefficients 0.886 to 0.980) for all parameters, significant differences (p < 0.001) were found between the systems. Bland-Altman plots indicated that 4D-MSPECT provided overall higher values of all parameters than QGS. These differences between the systems were not significant in patients with a small heart (end-diastolic volume <70 ml). Other clinical factors had no direct influence on the relationship. Additionally, the phantom data indicated good linear responses of both systems. The discrepancies between these software packages were clinically relevant, and influenced by heart size. The possibility of such discrepancies should be taken into account when a new quantitative software system is introduced, or when multiple software systems are used in the same institution. (orig.)

  10. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Directory of Open Access Journals (Sweden)

    Wang Kai

    2011-05-01

    Full Text Available Abstract Background Gene-gene interaction analysis in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) have hundreds of cores and have recently been used to implement faster scientific software. However, there are currently no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis of binary traits. Findings Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: (1) the interaction of SNPs within it in parallel, and (2) the interaction between the SNPs of the current fragment and those of other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode. Conclusions GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
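    The fragment-based work partitioning described above can be sketched as a generator of SNP-pair work units; this is a hypothetical illustration of the scheme, not GENIE's code:

```python
from itertools import combinations

def fragment_pairs(n_snps, fragment_size):
    """Partition SNP indices into non-overlapping fragments and yield the
    work units a GENIE-style scheduler would distribute across cores:
    first all within-fragment pairs, then all between-fragment pairs.
    Together these cover every unordered SNP pair exactly once."""
    frags = [list(range(i, min(i + fragment_size, n_snps)))
             for i in range(0, n_snps, fragment_size)]
    for f in frags:
        yield from combinations(f, 2)          # within-fragment pairs
    for fa, fb in combinations(frags, 2):      # between-fragment pairs
        for a in fa:
            for b in fb:
                yield (a, b)
```

Each within- or between-fragment batch is independent of the others, which is what makes the analysis embarrassingly parallel across GPU or CPU cores.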

  11. Process model for building quality software on internet time ...

    African Journals Online (AJOL)

    The competitive nature of the software construction market and the inherently exhilarating nature of software itself have hinged the success of any software development project on four major pillars: time to market, product quality, innovation and documentation. Unfortunately, however, existing software development models ...

  12. Particle Data Management Software for 3D Particle Tracking Velocimetry and Related Applications – The Flowtracks Package

    Directory of Open Access Journals (Sweden)

    Yosef Meller

    2016-06-01

    Full Text Available The Particle Tracking Velocimetry (PTV) community employs several formats of particle information, such as position and velocity as a function of time (i.e. trajectory data), as a result of diverging needs unmet by existing formats, and a number of different, mostly home-grown, codes for handling the data. Flowtracks is a Python package that provides a single code base for accessing the different formats as a database, i.e. storing data and programmatically manipulating them using format-agnostic data structures. Furthermore, it offers an HDF5-based format that is fast and extensible, obviating the need for the other formats. The package may be obtained from https://github.com/OpenPTV/postptv and used as-is by many fluid-dynamics labs, or, with minor extensions adhering to a common interface, by researchers from other fields, such as biology and population tracking.

  13. Performance of Different Analytical Software Packages in Quantification of DNA Methylation by Pyrosequencing.

    Directory of Open Access Journals (Sweden)

    Chiara Grasso

    Full Text Available Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited to many applications that aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software, which allow analysis in AQ or CpG mode, respectively; both are widely employed for DNA methylation analysis. The aim of the study was to assess the performance of these two software packages for DNA methylation analysis at CpG sites. Although CpG mode was specifically designed for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two software packages for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two software packages in quantifying DNA methylation in the promoters of selected genes (GSTP1, MGMT, LINE-1) by testing two case series comprising DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies in the quality assignments that the two software packages give to DNA methylation assays. Compared with the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode: CpG mode warns operators about potentially unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.
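    In bisulfite pyrosequencing, percent methylation at a CpG follows from the relative C and T peak heights; a minimal sketch with an illustrative conversion-control check (the threshold value is an assumption for the sketch, not a Pyro Q-CpG setting):

```python
def cpg_methylation(c_signal, t_signal):
    """Percent methylation at one CpG from bisulfite pyrosequencing peak
    heights: methylated cytosines read as C, unmethylated ones as T."""
    return 100.0 * c_signal / (c_signal + t_signal)

def passes_bisulfite_control(control_fraction, threshold=0.05):
    """Illustrative quality check in the spirit of CpG mode's stricter
    criteria: the signal fraction at a non-CpG cytosine dispensation
    should be near zero if bisulfite conversion was complete."""
    return control_fraction < threshold
```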

  14. Program SPACECAP: software for estimating animal density using spatially explicit capture-recapture models

    Science.gov (United States)

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas

    2012-01-01

    1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this problem, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.

  15. mixtools: An R Package for Analyzing Mixture Models

    Directory of Open Access Journals (Sweden)

    Tatiana Benaglia

    2009-10-01

    Full Text Available The mixtools package for R provides a set of functions for analyzing a variety of finite mixture models. These functions include both traditional methods, such as EM algorithms for univariate and multivariate normal mixtures, and newer methods that reflect some recent research in finite mixture models. In the latter category, mixtools provides algorithms for estimating parameters in a wide range of different mixture-of-regression contexts, in multinomial mixtures such as those arising from discretizing continuous multivariate data, in nonparametric situations where the multivariate component densities are completely unspecified, and in semiparametric situations such as a univariate location mixture of symmetric but otherwise unspecified densities. Many of the algorithms of the mixtools package are EM algorithms or are based on EM-like ideas, so this article includes an overview of EM algorithms for finite mixture models.
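    The EM algorithm for finite mixtures that underlies packages like mixtools can be sketched for the simplest case, a two-component univariate normal mixture. This is a minimal generic illustration in Python, not the mixtools code or API:

    ```python
    import numpy as np

    def em_normal_mixture(x, n_iter=200):
        """EM for a two-component univariate normal mixture.

        Returns (weights, means, sds) after n_iter iterations.
        """
        # Crude initialisation from the data quantiles and overall spread.
        w = np.array([0.5, 0.5])
        mu = np.quantile(x, [0.25, 0.75])
        sd = np.array([x.std(), x.std()])
        for _ in range(n_iter):
            # E-step: posterior membership probabilities (responsibilities).
            dens = np.stack([w[k] / (sd[k] * np.sqrt(2 * np.pi))
                             * np.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2)
                             for k in range(2)])
            resp = dens / dens.sum(axis=0)
            # M-step: responsibility-weighted updates of the parameters.
            nk = resp.sum(axis=1)
            w = nk / len(x)
            mu = (resp * x).sum(axis=1) / nk
            sd = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
        return w, mu, sd

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])
    w, mu, sd = em_normal_mixture(x)   # recovers means near -2 and 3
    ```

    The newer semiparametric and mixture-of-regressions methods in mixtools follow the same E-step/M-step template, with the component densities or regression parameters replacing the normal means and standard deviations.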

  16. Area of ischemia assessed by physicians and software packages from myocardial perfusion scintigrams

    DEFF Research Database (Denmark)

    Edenbrandt, L.; Hoglund, P.; Frantz, S.

    2014-01-01

    Background: The European Society of Cardiology recommends that patients with > 10% area of ischemia should receive revascularization. We investigated inter-observer variability for the extent of ischemic defects reported by different physicians and by different software tools, and if inter...... medicine delineated the extent of the ischemic defects. After at least two weeks, they delineated the defects again, and were this time provided a suggestion of the defect delineation by EXINI Heart(TM) (EXINI). Summed difference scores and ischemic extent values were obtained from four software programs...... Results: The median extent values obtained from the 11 physicians varied between 8% and 34%, and between 9% and 16% for the software programs. For all 25 patients, mean extent obtained from EXINI was 17.0% (+/- standard deviation (SD) 14.6%). Mean extent for physicians was 22.6% (+/- 15.6%) for the first......

  17. Counting radon tracks in Makrofol detectors with the 'image reduction and analysis facility' (IRAF) software package

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, F. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain)]. E-mail: fimerall@ull.es; Gonzalez-Manrique, S. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Karlsson, L. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Hernandez-Armas, J. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Aparicio, A. [Instituto de Astrofisica de Canarias, 38200 La Laguna, Tenerife (Spain); Departamento de Astrofisica, Universidad de La Laguna. Avenida. Astrofisico Francisco Sanchez s/n, 38071 La Laguna, Tenerife (Spain)

    2007-03-15

    Makrofol detectors are commonly used for long-term radon ({sup 222}Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires the counting of the nuclear tracks produced by alpha particles on the detecting material. The 'image reduction and analysis facility' (IRAF) software package is commonly used in astronomical applications. It allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity that exists between counting stars in a dark sky and counting tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. The proposed semi-automatic method and its performance, in comparison to ocular counting, are described in detail here. In addition, the calibration factor for this procedure, 2.97+/-0.07 kBq m{sup -3} h track{sup -1} cm{sup 2}, has been calculated based on the results obtained from exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m{sup -3} and, in two of them, above 400 Bq m{sup -3}. Further studies should be performed at those schools following the European Union recommendations about radon concentrations in

  18. airGRteaching: an R-package designed for teaching hydrology with lumped hydrological models

    Science.gov (United States)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Andréassian, Vazken; Brigode, Pierre

    2017-04-01

    discharges, which are updated immediately (a calibration needs only a couple of seconds or less; a simulation is almost immediate). In addition, time series of internal variables, live visualisation of the evolution of internal variables, and performance statistics are provided. This interface allows for hands-on exercises that can include, for instance, the analysis by students of: - the effects of each parameter and model component on simulated discharge; - the effects of objective functions based on high-flow- or low-flow-focused criteria on simulated discharge; - the seasonality of the model components. References Winston Chang, Joe Cheng, JJ Allaire, Yihui Xie and Jonathan McPherson (2016). shiny: Web Application Framework for R. R package version 0.13.2. https://CRAN.R-project.org/package=shiny Coron L., Thirel G., Perrin C., Delaigue O., Andréassian V., airGR: a suite of lumped hydrological models in an R-package, Environmental Modelling and Software, 2017, submitted. Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en. Olivier Delaigue and Laurent Coron (2016). airGRteaching: Tools to simplify the use of the airGR hydrological package by students. R package version 0.0.1. https://webgr.irstea.fr/airGR/?lang=en R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.
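    The kind of lumped rainfall-runoff model that airGRteaching exposes to students can be illustrated with a deliberately simplified single-store bucket model. This is a hypothetical Python sketch, far simpler than the actual GR models, intended only to show the structure of a lumped model (one storage state updated at each time step):

    ```python
    def bucket_model(precip, pet, capacity=300.0, k=0.3):
        """Toy lumped rainfall-runoff model with one soil-moisture store.

        precip, pet : daily precipitation and potential evapotranspiration (mm)
        capacity    : store capacity (mm); k : linear outflow coefficient
        Returns simulated daily discharge (mm).
        """
        store, q = 0.0, []
        for p, e in zip(precip, pet):
            store += p
            store -= min(e, store)              # actual ET limited by storage
            excess = max(store - capacity, 0.0) # saturation-excess runoff
            store -= excess
            baseflow = k * store                # linear-reservoir drainage
            store -= baseflow
            q.append(excess + baseflow)
        return q

    # Four days of forcing: a wet day, a dry day, a storm, a dry day.
    q = bucket_model([10, 0, 50, 0], [2, 2, 2, 2], capacity=30, k=0.2)
    ```

    Calibrating the two parameters (capacity, k) against observed discharge with an objective function is exactly the exercise the interface automates for the real GR models.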

  19. Development of a software for the curimeter model cdn102

    International Nuclear Information System (INIS)

    Dotres Llera, Armando

    2001-01-01

    The characteristics of the software for the Curimeter Model CD-N102 developed at CEADEN are presented. The software consists of two main parts: a basic software for the electrometer block and an application software for a PC. The basic software is totally independent of the PC and performs all the basic functions of the measurement process. The application software is optional and offers a friendlier interface and additional options to the user. These include the possibility to keep a statistical record of the measurements in a database, to create labels, and to introduce new isotopes and calibrate them. Both programs are explained in more detail.

  20. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  1. SWISTRACK - AN OPEN SOURCE, SOFTWARE PACKAGE APPLICABLE TO TRACKING OF FISH LOCOMOTION AND BEHAVIOUR

    DEFF Research Database (Denmark)

    Steffensen, John Fleng

    2010-01-01

    benefits of the software relate to its open source code. Users competent in C++ can readily modify and create their own tracking protocols suited to their own custom-designed experimental setups. These benefits will be demonstrated with the presentation of supplementary visual information.

  2. STARE, a sonar data post-processing and visualisation software package

    NARCIS (Netherlands)

    Theije, P.A.M. de

    2005-01-01

    This paper presents STARE, a sonar-data post-processing and visualisation tool developed by TNO and written in Matlab 6.5. The software takes into account all available acoustic and non-acoustic data (GPS, radar, source/receiver position, time latencies, etc.). The latter can be used to get optimal

  3. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtain the application functions point count. Our result shows that the proposed metric is computable, consistent in its use of unit, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  4. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data.

    Science.gov (United States)

    Hebart, Martin N; Görgen, Kai; Haynes, John-Dylan

    2014-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns.
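    The cross-validated decoding workflow that toolboxes like TDT automate can be sketched generically: split the patterns by scanner run, train on all runs but one, and score predictions on the held-out run. This is a minimal Python illustration with a nearest-centroid classifier, not TDT's Matlab API:

    ```python
    import numpy as np

    def leave_one_run_out_decode(X, y, runs):
        """Cross-validated decoding accuracy with a nearest-centroid classifier.

        X    : (n_samples, n_voxels) activity patterns
        y    : (n_samples,) condition labels (0/1)
        runs : (n_samples,) run index of each sample; each run is held out once.
        """
        correct = 0
        for r in np.unique(runs):
            train, test = runs != r, runs == r
            # Class centroids estimated from the training runs only.
            c0 = X[train & (y == 0)].mean(axis=0)
            c1 = X[train & (y == 1)].mean(axis=0)
            # Assign each held-out pattern to the nearer centroid.
            d0 = np.linalg.norm(X[test] - c0, axis=1)
            d1 = np.linalg.norm(X[test] - c1, axis=1)
            correct += ((d1 < d0).astype(int) == y[test]).sum()
        return correct / len(y)

    # Synthetic data: 40 patterns, 50 voxels, 4 runs, signal in class 1.
    rng = np.random.default_rng(1)
    n, v = 40, 50
    y = np.tile([0, 1], n // 2)
    runs = np.repeat(np.arange(4), n // 4)
    X = rng.normal(size=(n, v)) + y[:, None] * 1.0
    acc = leave_one_run_out_decode(X, y, runs)
    ```

    A searchlight analysis repeats this procedure on a small sphere of voxels centred on each brain location in turn; TDT's machinery for scheme creation, feature selection and classifier weights wraps around the same core loop.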

  5. The Decoding Toolbox (TDT: A versatile software package for multivariate analyses of functional imaging data

    Directory of Open Access Journals (Sweden)

    Martin Nikolai Hebart

    2015-01-01

    Full Text Available The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT), which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns.

  6. SAHM:VisTrails (Software for Assisted Habitat Modeling for VisTrails): training course

    Science.gov (United States)

    Holcombe, Tracy

    2014-01-01

    VisTrails is an open-source management and scientific workflow system designed to integrate the best of both scientific workflow and scientific visualization systems. Developers can extend the functionality of the VisTrails system by creating custom modules for bundled VisTrails packages. The Invasive Species Science Branch of the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) and the U.S. Department of the Interior’s North Central Climate Science Center have teamed up to develop and implement such a module—the Software for Assisted Habitat Modeling (SAHM). SAHM expedites habitat modeling and helps maintain a record of the various input data, the steps before and after processing, and the modeling options incorporated in the construction of an ecological response model. There are four main advantages to using the SAHM:VisTrails combined package for species distribution modeling: (1) formalization and tractable recording of the entire modeling process; (2) easier collaboration through a common modeling framework; (3) a user-friendly graphical interface to manage file input, model runs, and output; and (4) extensibility to incorporate future and additional modeling routines and tools. In order to meet increased interest in the SAHM:VisTrails package, the FORT offers a training course twice a year. The course includes a combination of lecture, hands-on work, and discussion. Please join us and other ecological modelers to learn the capabilities of the SAHM:VisTrails package.

  7. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.

  8. spBayes: An R Package for Univariate and Multivariate Hierarchical Point-referenced Spatial Models.

    Science.gov (United States)

    Finley, Andrew O; Banerjee, Sudipto; Carlin, Bradley P

    2007-04-01

    Scientists and investigators in such diverse fields as geological and environmental sciences, ecology, forestry, disease mapping, and economics often encounter spatially referenced data collected over a fixed set of locations with coordinates (latitude-longitude, Easting-Northing etc.) in a region of study. Such point-referenced or geostatistical data are often best analyzed with Bayesian hierarchical models. Unfortunately, fitting such models involves computationally intensive Markov chain Monte Carlo (MCMC) methods whose efficiency depends upon the specific problem at hand. This requires extensive coding on the part of the user and the situation is not helped by the lack of available software for such algorithms. Here, we introduce a statistical software package, spBayes, built upon the R statistical computing platform that implements a generalized template encompassing a wide variety of Gaussian spatial process models for univariate as well as multivariate point-referenced data. We discuss the algorithms behind our package and illustrate its use with a synthetic and real data example.
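    The Gaussian spatial process at the core of such point-referenced models can be sketched with an exponential covariance function and simple kriging (prediction by conditioning on observed values). This is a generic Python illustration of the underlying model, not the spBayes API, which additionally handles the Bayesian MCMC machinery:

    ```python
    import numpy as np

    def exp_cov(coords_a, coords_b, sigma2=1.0, phi=2.0):
        """Exponential covariance: sigma2 * exp(-phi * distance)."""
        d = np.linalg.norm(coords_a[:, None, :] - coords_b[None, :, :], axis=2)
        return sigma2 * np.exp(-phi * d)

    def simple_krige(obs_coords, obs_vals, new_coords, nugget=1e-8):
        """Predict a zero-mean Gaussian process at new locations."""
        K = exp_cov(obs_coords, obs_coords) + nugget * np.eye(len(obs_coords))
        k = exp_cov(new_coords, obs_coords)
        return k @ np.linalg.solve(K, obs_vals)

    obs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    vals = np.array([1.0, 2.0, 3.0])
    pred_at_obs = simple_krige(obs, vals, obs)       # reproduces the data
    pred_far = simple_krige(obs, vals,
                            np.array([[10.0, 10.0]]))  # reverts to the mean, 0
    ```

    In spBayes the covariance parameters (sigma2, phi and a nugget) are themselves given priors and sampled by MCMC rather than fixed as here.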

  9. spBayes: An R Package for Univariate and Multivariate Hierarchical Point-referenced Spatial Models

    Directory of Open Access Journals (Sweden)

    Andrew O. Finley

    2007-04-01

    Full Text Available Scientists and investigators in such diverse fields as geological and environmental sciences, ecology, forestry, disease mapping, and economics often encounter spatially referenced data collected over a fixed set of locations with coordinates (latitude–longitude, Easting–Northing etc.) in a region of study. Such point-referenced or geostatistical data are often best analyzed with Bayesian hierarchical models. Unfortunately, fitting such models involves computationally intensive Markov chain Monte Carlo (MCMC) methods whose efficiency depends upon the specific problem at hand. This requires extensive coding on the part of the user and the situation is not helped by the lack of available software for such algorithms. Here, we introduce a statistical software package, spBayes, built upon the R statistical computing platform that implements a generalized template encompassing a wide variety of Gaussian spatial process models for univariate as well as multivariate point-referenced data. We discuss the algorithms behind our package and illustrate its use with a synthetic and real data example.

  10. Selection of software for mechanical engineering undergraduates

    Energy Technology Data Exchange (ETDEWEB)

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S., E-mail: ablicblau@swin.edu.au [Swinburne University of Technology, Faculty of Science Engineering and Technology, PO Box 218 Hawthorn, Victoria, Australia, 3122 (Australia)

    2016-07-12

    A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages for the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  11. Selection of software for mechanical engineering undergraduates

    International Nuclear Information System (INIS)

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S.

    2016-01-01

    A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages for the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  12. FishingCNV: a graphical software package for detecting rare copy number variations in exome-sequencing data.

    Science.gov (United States)

    Shi, Yuhao; Majewski, Jacek

    2013-06-01

    Rare copy number variations (CNVs) are frequent causes of genetic diseases. We developed a graphical software package based on a novel approach that can consistently identify CNVs of all types (homozygous deletions, heterozygous deletions, heterozygous duplications) from exome-sequencing data without the need for a paired control. The algorithm compares coverage depth in a test sample against a background distribution of control samples and uses principal component analysis to remove batch effects. It is user friendly and can be run on a personal computer. The main scripts are implemented in R (2.15), and the GUI is created using Java 1.6. It can be run on all major operating systems. A non-GUI version for pipeline implementation is also available. The program is freely available online: https://sourceforge.net/projects/fishingcnv/. Supplementary data are available at Bioinformatics online.
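    The core idea, comparing a test sample's coverage depth against a background distribution of controls, can be sketched as follows. This is a hypothetical Python illustration of the general approach, not the FishingCNV implementation (which additionally removes batch effects with principal component analysis):

    ```python
    import numpy as np

    def cnv_scores(test_cov, control_cov):
        """Z-scores of a test sample's per-exon coverage against controls.

        test_cov    : (n_exons,) read depth of the test sample
        control_cov : (n_controls, n_exons) read depths of control samples
        Depths are first normalised by each sample's total coverage so that
        library-size differences do not masquerade as CNVs.
        """
        t = test_cov / test_cov.sum()
        c = control_cov / control_cov.sum(axis=1, keepdims=True)
        mu, sd = c.mean(axis=0), c.std(axis=0) + 1e-12
        return (t - mu) / sd

    # Synthetic data: 50 controls, 200 exons, one planted duplication.
    rng = np.random.default_rng(3)
    controls = rng.poisson(100, size=(50, 200)).astype(float)
    test = rng.poisson(100, size=200).astype(float)
    test[17] *= 2.0   # doubled depth mimics a duplication at exon 17
    z = cnv_scores(test, controls)
    ```

    Strongly positive z-scores suggest duplications and strongly negative ones deletions; a real caller would segment consecutive exons and assign confidence levels rather than threshold single exons.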

  13. dglars: An R Package to Estimate Sparse Generalized Linear Models

    Directory of Open Access Journals (Sweden)

    Luigi Augugliaro

    2014-09-01

    Full Text Available dglars is a publicly available R package that implements the method proposed in Augugliaro, Mineo, and Wit (2013), developed to study the sparse structure of a generalized linear model. This method, called dgLARS, is based on a differential geometric extension of the least angle regression method proposed in Efron, Hastie, Johnstone, and Tibshirani (2004). The core of the dglars package consists of two algorithms implemented in Fortran 90 to efficiently compute the solution curve: a predictor-corrector algorithm, proposed in Augugliaro et al. (2013), and a cyclic coordinate descent algorithm, proposed in Augugliaro, Mineo, and Wit (2012). The latter algorithm, as shown here, is significantly faster than the predictor-corrector algorithm. For comparison purposes, we have implemented both algorithms.

  14. “DETECTION ARTIFACTS” SOFTWARE PACKAGE: FUNCTIONAL CAPABILITIES AND PROSPECTS OF USING (ON THE EXAMPLE OF GEOARCHEOLOGICAL RESEARCH

    Directory of Open Access Journals (Sweden)

    Ye. P. Krupochkin

    2017-01-01

    Full Text Available Mathematical and scientific methods are highly significant in modern geoarcheological study. They contribute to the development of new computer technologies and to their implementation in geoarcheological research, in particular in the decoding and photogrammetric processing of space images. The article focuses on the "Detection Artifacts" software package, designed for thematic aerospace image decoding and aimed at automating the search for various archeological sites, both natural and artificially created. The main attention is drawn to the decoding of archeological sites using methods of morphological analysis and indicative decoding. The package is based on two groups of image-processing methods: (1) image enhancement, carried out by means of spatial frequency filtration, and (2) morphometric analysis. The methods of spatial frequency filtration are used to solve two problems: minimizing information noise and enhancing edges. To achieve the best results with these methods, it is necessary to have all the information relevant to the objects being searched for. Searching for archeological sites is not solely a photogrammetric task; rather, the problem is solved within photogrammetry with the application of aerospace and computer methods, a distinction the authors stress in order to avoid terminological ambiguity and confusion when describing the methods and processes. It should be noted that the work with the images must be executed in a strict sequence. First and foremost, photogrammetric processing (atmospheric correction, geometric adjustment, conversion and geo-targeting) should be carried out, and only then can one proceed to decoding the information. When creating the software package a modular structure was applied, which benefited the tasks being solved and corresponded to the conception of the search for archaeological objects
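    The edge-enhancement problem of spatial frequency filtration described above can be sketched with a simple FFT high-pass filter: suppressing low spatial frequencies removes the smooth background and leaves edge-like features. This is a generic Python illustration, not the "Detection Artifacts" code:

    ```python
    import numpy as np

    def highpass_filter(img, cutoff=0.1):
        """Suppress low spatial frequencies to enhance edges.

        img    : 2-D array of pixel intensities
        cutoff : radius (as a fraction of the Nyquist frequency) below
                 which frequency components are zeroed out.
        """
        F = np.fft.fft2(img)
        fy = np.fft.fftfreq(img.shape[0])[:, None]   # cycles per pixel
        fx = np.fft.fftfreq(img.shape[1])[None, :]
        mask = np.sqrt(fx ** 2 + fy ** 2) >= cutoff * 0.5
        return np.real(np.fft.ifft2(F * mask))

    # A flat field with one bright square: the filter removes the flat
    # background and responds most strongly along the square's edges.
    img = np.full((64, 64), 10.0)
    img[20:30, 20:30] = 50.0
    edges = highpass_filter(img)
    ```

    The complementary low-pass (noise-minimization) problem is obtained by inverting the mask; a full pipeline would combine both with the photogrammetric corrections applied first, in the strict sequence the abstract describes.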

  15. Radiological modeling software for underground uranium mines

    International Nuclear Information System (INIS)

    Bjorndal, B.; Moridi, R.

    1999-01-01

    The Canadian Institute for Radiation Safety (CAIRS) has developed computer simulation software for modeling radiological parameters in underground uranium mines. The computer program, called 3d RAD, allows radiation protection professionals and mine ventilation engineers to quickly simulate radon and radon progeny activity concentrations and potential alpha energy concentrations in complex mine networks. The simulation component of 3d RAD, called RSOLVER, is an adaptation of an existing modeling program called VENTRAD, originally developed at Queen's University, Ontario. Based on user-defined radiation source terms and network physical properties, radiological parameters in the network are calculated iteratively by solving the Bateman equations in differential form. The 3d RAD user interface was designed in cooperation with the Canada Centre for Mineral and Energy Technology (CANMET) to improve program functionality and to make 3d RAD compatible with the CANMET ventilation simulation program, 3d CANVENT. The 3d RAD program was tested using physical data collected in Canadian uranium mines. 3d RAD predictions were found to agree well with theoretical calculations and simulation results obtained from other modeling programs such as VENTRAD. Agreement with measured radon and radon progeny levels was also observed. However, the level of agreement was found to depend heavily on the precision of source term data, and on the measurement protocol used to collect radon and radon progeny levels for comparison with the simulation results. The design and development of 3d RAD was carried out under contract with the Saskatchewan government
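    The Bateman equations that RSOLVER solves describe sequential radioactive decay; for a simple parent-daughter pair they can be integrated in closed form. This is a generic Python sketch of the physics, not the 3d RAD code, which additionally couples the decay chain to ventilation transport through the mine network:

    ```python
    import math

    def bateman_two(n1_0, lam1, lam2, t):
        """Bateman solution for a two-member decay chain.

        n1_0 : initial number of parent atoms (daughter starts at zero)
        lam1, lam2 : decay constants of parent and daughter (1/s)
        Returns (N1(t), N2(t)).
        """
        n1 = n1_0 * math.exp(-lam1 * t)
        n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t)
                                            - math.exp(-lam2 * t))
        return n1, n2

    # Radon-222 (half-life 3.82 d) decaying to Po-218 (half-life 3.05 min).
    lam_rn = math.log(2) / (3.82 * 86400)
    lam_po = math.log(2) / (3.05 * 60)
    n1, n2 = bateman_two(1e6, lam_rn, lam_po, 3600.0)  # state after one hour
    ```

    After an hour, long compared with the daughter's half-life, the daughter reaches secular equilibrium: N2 is close to N1 * lam1 / lam2, the regime that makes radon progeny concentrations in a mine track the radon source term.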

  16. CARBayes: An R Package for Bayesian Spatial Modeling with Conditional Autoregressive Priors

    Directory of Open Access Journals (Sweden)

    Duncan Lee

    2013-11-01

    Full Text Available Conditional autoregressive models are commonly used to represent spatial autocorrelation in data relating to a set of non-overlapping areal units, which arise in a wide variety of applications including agriculture, education, epidemiology and image analysis. Such models are typically specified in a hierarchical Bayesian framework, with inference based on Markov chain Monte Carlo (MCMC) simulation. The most widely used software to fit such models is WinBUGS or OpenBUGS, but in this paper we introduce the R package CARBayes. The main advantage of CARBayes compared with the BUGS software is its ease of use, because: (1) the spatial adjacency information is easy to specify as a binary neighbourhood matrix; and (2) given the neighbourhood matrix the models can be implemented by a single function call in R. This paper outlines the general class of Bayesian hierarchical models that can be implemented in the CARBayes software, describes their implementation via MCMC simulation techniques, and illustrates their use with two worked examples in the fields of house price analysis and disease mapping.
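    The binary neighbourhood matrix that such packages take as their spatial input can be built directly. For areal units laid out on a regular grid, rook contiguity (units sharing an edge are neighbours) looks like this; a generic Python sketch, not the CARBayes R code:

    ```python
    import numpy as np

    def rook_adjacency(nrow, ncol):
        """Binary neighbourhood matrix W for an nrow x ncol grid.

        W[i, j] = 1 when areal units i and j share an edge (rook
        contiguity), 0 otherwise; units are numbered row by row.
        """
        n = nrow * ncol
        W = np.zeros((n, n), dtype=int)
        for r in range(nrow):
            for c in range(ncol):
                i = r * ncol + c
                if c + 1 < ncol:              # right neighbour
                    W[i, i + 1] = W[i + 1, i] = 1
                if r + 1 < nrow:              # neighbour below
                    W[i, i + ncol] = W[i + ncol, i] = 1
        return W

    W = rook_adjacency(3, 3)   # 9 units; centre unit has 4 neighbours
    ```

    For irregular areal units (census tracts, postcode areas) the same symmetric 0/1 matrix is derived from shared polygon boundaries; the CAR prior then shrinks each unit's random effect towards the mean of its neighbours as defined by W.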

  17. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business....... This thesis documents the groundwork towards addressing the challenges faced by telemedical technologies today and establishing telemedicine as a means of patient diagnosis and treatment. Furthermore, it serves as an empirical example of designing a software ecosystem....

  18. MedXViewer: an extensible web-enabled software package for medical imaging

    Science.gov (United States)

    Looney, P. T.; Young, K. C.; Mackenzie, Alistair; Halling-Brown, Mark D.

    2014-03-01

    MedXViewer (Medical eXtensible Viewer) is an application designed to allow workstation-independent, PACS-less viewing and interaction with anonymised medical images (e.g. observer studies). The application was initially implemented for use in digital mammography and tomosynthesis but the flexible software design allows it to be easily extended to other imaging modalities. Regions of interest can be identified by a user and any associated information about a mark, an image or a study can be added. The questions and settings can be easily configured depending on the needs of the research, allowing both ROC and FROC studies to be performed. The extensible nature of the design allows for other functionality and hanging protocols to be available for each study. Panning, windowing, zooming and moving through slices are all available, while modality-specific features can be easily enabled, e.g. quadrant zooming in mammographic studies. MedXViewer can integrate with a web-based image database allowing results and images to be stored centrally. The software and images can be downloaded remotely from this centralised data-store. Alternatively, the software can run without a network connection, where the images and results can be encrypted and stored locally on a machine or external drive. Due to the advanced workstation-style functionality, the simple deployment on heterogeneous systems over the internet without a requirement for administrative access and the ability to utilise a centralised database, MedXViewer has been used for running remote paperless observer studies and is capable of providing a training infrastructure and co-ordinating remote collaborative viewing sessions (e.g. cancer reviews, interesting cases).

  19. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the t

  20. Modeling ICF With RAGE, BHR, And The New Laser Package

    Science.gov (United States)

    Cliche, Dylan; Welser-Sherrill, Leslie; Haines, Brian; Mancini, Roberto

    2017-10-01

    Inertial Confinement Fusion (ICF) is one method used to obtain thermonuclear burn through either direct or indirect ablation of a millimeter-scale capsule with several lasers. Although progress has been made in theory, experiment, and diagnostics, the community has yet to reach ignition. One way of investigating this is through high-performance computer simulations of the implosion. RAGE is an advanced 1D, 2D, and 3D radiation adaptive grid Eulerian code used to simulate the hydrodynamics of a system. Because the interface between two fluids of unequal density accelerating into one another is unstable, it is important to include a turbulence model. BHR is a turbulence model that uses Reynolds-averaged Navier-Stokes (RANS) equations to model the mixing that occurs between the shell and the fusion fuel material. Until recently, it was difficult to model direct-drive experiments because RAGE had no laser energy deposition model. A new laser energy deposition model has now been implemented using the same ray-tracing method as the Mazinisin laser package used at the OMEGA laser facility at the Laboratory for Laser Energetics (LLE) in Rochester, New York. Using the new laser package along with BHR for mixing allows us to more accurately simulate ICF implosions and obtain spatially and temporally resolved information (e.g. position, temperature, density, and mix concentrations) that gives insight into what is happening inside the implosion.

  1. Example of software configuration management model

    International Nuclear Information System (INIS)

    Roth, P.

    2006-01-01

    Software configuration management is the mechanism used to track and control software changes and may include the following actions: a tracking system should be established for any changes made to the existing software configuration. Requirements of the configuration management system are the following: - Back up the different software configurations; - Record the details (the date, the subject, the filenames, the supporting documents, the tests, ...) of the changes introduced in the new configuration; - Document all the differences between the different versions. Configuration management allows simultaneous exploitation of one specific version and development of the next version. Minor corrections can be performed in the current exploitation version

  2. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set … Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics … constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation …

  3. Informed-Proteomics: open-source software package for top-down proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher; Zhou, Mowei; Mendoza, Joshua; Fujimoto, Grant M.; Gibbons, Bryson C.; Shaw, Jared B.; Shen, Yufeng; Shukla, Anil K.; Moore, Ronald J.; Liu, Tao; Petyuk, Vladislav A.; Tolić, Nikola; Paša-Tolić, Ljiljana; Smith, Richard D.; Payne, Samuel H.; Kim, Sangtae

    2017-08-07

    Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared with the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets, there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study, we present a new open source software suite for top-down proteomics analysis consisting of an LC-MS feature finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.

  4. Traceability for Model Driven, Software Product Line Engineering

    NARCIS (Netherlands)

    Anquetil, N.; Grammel, B.; Galvao, I.; Noppen, J.A.R.; Shakil Khan, S.; Arboleda, H.; Rashid, A.; Garcia, A.

    Traceability is an important challenge for software organizations. This is true for traditional software development and even more so in new approaches that introduce more variety of artefacts such as Model Driven development or Software Product Lines. In this paper we look at some aspect of the

  5. An improved COCOMO software cost estimation model | Duke ...

    African Journals Online (AJOL)

    In this paper, we discuss the methodologies adopted previously in software cost estimation using the COnstructive COst MOdels (COCOMOs). From our analysis, COCOMOs produce very high software development efforts, which eventually produce high software development costs. Consequently, we propose its extension, ...

  6. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools.

    Science.gov (United States)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier

    2017-11-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients the model accurately predicted (R² = 0.9, standard error (Se) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R² = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
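
    The kind of simplified migration estimate such tools build on can be sketched in a few lines. The sketch below uses the generic short-contact-time Fickian approximation for a polymer film in one-sided contact with a well-mixed food, not the paper's optimised high-throughput model; the diffusion coefficient, film thickness and contact time are hypothetical illustrative values.

```python
import math

def migrated_fraction(D_cm2_s, t_s, L_cm):
    """Fraction of an initially uniform migrant that has left a polymer film
    of thickness L_cm into a well-mixed food phase (one-sided contact),
    using the short-time Fickian solution, capped at total depletion."""
    frac = (2.0 / L_cm) * math.sqrt(D_cm2_s * t_s / math.pi)
    return min(frac, 1.0)

# Hypothetical scenario: D = 1e-12 cm^2/s, 10 days of contact,
# a 100 um (0.01 cm) film.
f = migrated_fraction(1e-12, 10 * 24 * 3600, 0.01)
```

    This approximation ignores partitioning between polymer and food (it only caps migration at full depletion), which is one reason full models such as the one described above perform better.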

  7. DYNSTALL: Subroutine package with a dynamic stall model

    Energy Technology Data Exchange (ETDEWEB)

    Bjoerck, Anders [Aeronautical Research Inst. of Sweden, Bromma (Sweden)

    2001-03-01

    A subroutine package, called DYNSTALL, for the calculation of 2D unsteady airfoil aerodynamics is described. The subroutines are written in FORTRAN. DYNSTALL is basically an implementation of the Beddoes-Leishman dynamic stall model, a semi-empirical model for dynamic stall that also includes models for attached-flow unsteady aerodynamics. It is complete in the sense that it treats attached flow as well as separated flow. Semi-empirical means that the model relies on empirically determined constants; 'semi' because the constants appear in equations with some physical interpretation. The package requires the input of 2D airfoil aerodynamic data via tables as a function of angle of attack. The method is intended for use in an aeroelastic code with the aerodynamics solved by a blade-element method. DYNSTALL was written to work for any 2D angle of attack relative to the airfoil, e.g. flow from the rear of the airfoil.
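
    The flavour of the attached-flow part of such models can be illustrated with a single first-order lag on the quasi-steady lift. This is only a minimal stand-in for the Beddoes-Leishman deficiency functions, not the actual DYNSTALL formulation; the time constant and lift slope below are hypothetical.

```python
import math

def unsteady_cl(alpha_deg_series, dt, tau, cl_alpha=2 * math.pi):
    """First-order lag on the quasi-steady lift coefficient: the lifted
    response trails the instantaneous angle of attack with time constant tau."""
    cl_lagged = []
    state = 0.0
    for a in alpha_deg_series:
        cl_qs = cl_alpha * math.radians(a)    # quasi-steady thin-airfoil lift
        state += (dt / tau) * (cl_qs - state) # explicit first-order lag update
        cl_lagged.append(state)
    return cl_lagged

# Ramp the angle of attack from 0 to 5 degrees over 1 s (illustrative values).
alphas = [5.0 * i / 100 for i in range(101)]
cl = unsteady_cl(alphas, dt=0.01, tau=0.1)
```

    During the ramp the lagged lift stays below its quasi-steady value, the qualitative signature of attached-flow unsteadiness that the full model captures with empirically tuned constants.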

  8. SpcAudace: Spectroscopic processing and analysis package of Audela software

    Science.gov (United States)

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines carry out all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: PDF and PNG plots or annotated time-series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions: from line-profile characteristics to equivalent widths and periodograms. More than 300 documented functions are available and can be used in Tcl scripts for automation. SpcAudace is based on the Audela open source software.

  9. On the possibility of using commercial software packages for thermoluminescence glow curve deconvolution analysis

    International Nuclear Information System (INIS)

    Pagonis, V.; Kitis, G.

    2002-01-01

    This paper explores the possibility of using commercial software for thermoluminescence glow curve deconvolution (GCD) analysis. The program PEAKFIT has been used to perform GCD analysis of complex glow curves of quartz and dosimetric materials. First-order TL peaks were represented successfully using the Weibull distribution function. Second-order and general-order TL peaks were represented accurately using Logistic asymmetric functions with varying symmetry parameters. Analytical expressions were derived for determining the energy E from the parameters of the Logistic asymmetric functions. The accuracy of these analytical expressions for E was tested for a wide variety of kinetic parameters and was found to be comparable to the commonly used expressions in the TL literature. The goodness of fit of the analytical functions used here was tested using the figure of merit and was found to be comparable to the accuracy of recently published GCD expressions for first- and general-order kinetics. (author)
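
    The first-order peaks being fitted here follow the classic Randall-Wilkins kinetics, which can be generated directly for comparison with any fitted functional form. The sketch below integrates the first-order rate equation for a single trap; the kinetic parameters are typical textbook values, not taken from the paper.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def first_order_glow(E, s, beta, n0=1.0, T0=300.0, T1=600.0, dT=0.5):
    """First-order (Randall-Wilkins) TL glow curve by stepwise integration
    of dn/dT = -(s/beta) * n * exp(-E/(k*T)) during linear heating at beta K/s.
    Returns (temperatures, intensities)."""
    n = n0
    Ts, Is = [], []
    T = T0
    while T <= T1:
        p = s * math.exp(-E / (K_B * T))  # escape probability per second
        Is.append(n * p)                  # TL intensity ~ -dn/dt
        n = max(n - n * p * (dT / beta), 0.0)
        Ts.append(T)
        T += dT
    return Ts, Is

# Illustrative kinetic parameters: E = 1.0 eV, s = 1e12 1/s, beta = 1 K/s.
Ts, Is = first_order_glow(E=1.0, s=1e12, beta=1.0)
Tmax = Ts[Is.index(max(Is))]
```

    Peaks generated this way provide synthetic reference glow curves against which any deconvolution function (Weibull, Logistic asymmetric, ...) can be benchmarked with a figure of merit.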

  10. The GEMPAK Barnes interactive objective map analysis scheme. [General Meteorological Software Package

    Science.gov (United States)

    Koch, S. E.; Kocin, P. J.; Desjardins, M.

    1983-01-01

    The analysis scheme and meteorological applications of the GEMPAK data analysis and display software system developed by NASA are described. The program was devised to permit objective, versatile, and practical analysis of satellite meteorological data using a minicomputer and a display system with graphics capability. A data area can be selected within the data file for the globe, and data-sparse regions can be avoided. Distances between observations and the nearest observation points are calculated in order to avoid errors when determining synoptic weather conditions. The Barnes (1973) successive correction method is employed to restore the amplitude of small yet resolvable wavelengths suppressed in an initial filtering pass. The rms deviation is then calculated in relation to available measured data. Examples are provided of treatment of VISSR data from the GOES satellite and a study of the impact of incorrect cloud height data on synoptic weather field analysis.
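
    The two-pass Barnes scheme described above is compact enough to sketch directly: a Gaussian-weighted first pass, then a correction pass with a sharper kernel that restores amplitude at the shorter resolvable wavelengths. This is a generic textbook rendering, not GEMPAK code; the kernel parameters are illustrative.

```python
import math

def barnes(grid_pts, obs, kappa, gamma=0.3):
    """Two-pass Barnes objective analysis. obs is a list of ((x, y), value);
    kappa sets the Gaussian weight length scale, and the second (successive
    correction) pass uses the reduced scale gamma * kappa."""
    pts = [p for p, _ in obs]
    vals = [v for _, v in obs]

    def interp(values, x, y, k):
        num = den = 0.0
        for (ox, oy), v in zip(pts, values):
            w = math.exp(-((x - ox) ** 2 + (y - oy) ** 2) / k)
            num += w * v
            den += w
        return num / den

    # Pass 1: Gaussian-weighted mean of the observations.
    g = [interp(vals, x, y, kappa) for x, y in grid_pts]
    a = [interp(vals, x, y, kappa) for x, y in pts]   # analysis at obs sites
    # Pass 2: add back weighted residuals with the sharper kernel.
    resid = [v - ai for v, ai in zip(vals, a)]
    return [gi + interp(resid, x, y, gamma * kappa)
            for gi, (x, y) in zip(g, grid_pts)]
```

    The correction pass is what pulls the analysis back toward the observations after the smoothing of the first pass.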

  11. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
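
    The core of the second-order SOFI computation is simple enough to show in miniature: the zero-lag second-order auto-cumulant of each pixel's intensity trace is its temporal variance, so fluctuating (blinking) emitters survive while static background is suppressed. This is a bare illustration of the cumulant idea, not the cross-cumulant pipeline of the simulation tool.

```python
from statistics import pvariance

def sofi2(frames):
    """Second-order SOFI 'image' from a stack of frames (each a flat list of
    pixel intensities): the per-pixel temporal variance over the stack."""
    n_px = len(frames[0])
    return [pvariance([f[i] for f in frames]) for i in range(n_px)]

# Toy 1-D image: pixel 0 hosts a blinking emitter, pixel 1 a bright
# but constant background.
frames = [[10, 50], [0, 50], [10, 50], [0, 50]]
img = sofi2(frames)  # blinking pixel: 25, static pixel: 0
```

    Higher SOFI orders use higher cumulants (and cross-correlations between pixels), which is where the resolution gain and the need for careful parameter choices come from.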

  12. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    Science.gov (United States)

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen’s system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system GiNaC with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA’s performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with
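
    The two descriptions can be contrasted on the simplest possible network. For a linear birth-death process the system size expansion is exact at steady state (mean = variance = k/gamma, a Poisson law), so a Gillespie SSA run should reproduce that value, at the cost of the ensemble averaging the abstract mentions. The rates and sample counts below are illustrative; this is not iNA code.

```python
import random

def ssa_birth_death(k, gamma, t_end, seed=1):
    """Gillespie SSA for the birth-death process 0 -> X (rate k),
    X -> 0 (rate gamma * n). Returns the copy number at time t_end."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        a1, a2 = k, gamma * n
        a0 = a1 + a2
        t += rng.expovariate(a0)      # waiting time to the next reaction
        if t > t_end:
            return n
        n += 1 if rng.random() < a1 / a0 else -1

# Steady-state prediction from the rate equation / LNA: mean = k/gamma = 50.
k, gamma = 50.0, 1.0
samples = [ssa_birth_death(k, gamma, t_end=15.0, seed=s) for s in range(300)]
mean = sum(samples) / len(samples)
```

    For nonlinear networks the expansion is only approximate, which is exactly the regime where tools like iNA that automate the Linear Noise Approximation and its corrections become useful.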

  13. A multi-layered software architecture model for building software solutions in an urbanized information system

    Directory of Open Access Journals (Sweden)

    Sana Guetat

    2013-01-01

    Full Text Available The concept of Information Systems urbanization has been proposed since the late 1990s in order to help organizations build agile information systems. Nevertheless, despite the advantages of this concept, it remains too descriptive and presents many weaknesses. In particular, there is a lack of useful architecture models dedicated to defining software solutions compliant with information systems urbanization principles and rules. Moreover, well-known software architecture models do not provide sufficient resources to address the requirements and constraints of urbanized information systems. In this paper, we draw on the “information city” framework to propose a model of software architecture - called the 5+1 Software Architecture Model - which is compliant with information systems urbanization principles and helps organizations build urbanized software solutions. This framework improves on well-established software architecture models and allows the integration of new architectural paradigms. Furthermore, the proposed model contributes to the implementation of information systems urbanization in several ways. On the one hand, this model devotes a specific layer to application integration and software reuse. On the other hand, it contributes to information system agility and scalability owing to its conformity to the separation-of-concerns principle.

  14. Migration modelling as a tool for quality assurance of food packaging.

    Science.gov (United States)

    Brandsch, J; Mercea, P; Rüter, M; Tosa, V; Piringer, O

    2002-01-01

    The current potential for the use of migration modelling for studying polyolefin packaging materials (low- and high-density polyethylene and polypropylene) is summarized and demonstrated with practical examples. For these polymers, an upper limit of migration into foodstuffs can be predicted with a high degree of statistical confidence. The only analytical information needed for modelling in such cases is the initial concentration of the migrant in the polymer matrix. For polyolefins of unknown origin or newly developed materials with new properties, a quick experimental method is described for obtaining the characteristic matrix parameter needed for migration modelling. For easy handling of both the experimental results and the diffusion model, user-friendly software has been developed. An additional aim of the described method is the determination of the migrant's partitioning between polymer and food or food simulant and of the specific contribution of the migrant's molecular structure to the diffusion coefficient. For migration modelling of packaging materials with multilayer structures, a numerical solution of the diffusion equation is described. This procedure has also been applied to modelling migration into solid or highly viscous foodstuffs.

  15. Model-driven and software product line engineering

    CERN Document Server

    Royer, Jean-Claude

    2013-01-01

    Many approaches to creating Software Product Lines have emerged that are based on Model-Driven Engineering. This book introduces both Software Product Lines and Model-Driven Engineering, which have separate success stories in industry, and focuses on the practical combination of them. It describes the challenges and benefits of merging these two software development trends and provides the reader with a novel approach and practical mechanisms to improve software development productivity. The book is aimed at engineers and students who wish to understand and apply software product lines

  16. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and fits software failure data well. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. Numerical experiments investigate the fitting ability of the SRGMs with normal distribution using 16 failure time datasets collected in real software projects
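
    The mean value function of such a model is just a scaled normal CDF: the expected cumulative number of failures by time t. The sketch below evaluates it with the standard error function; the parameter values are hypothetical, and the EM fitting step of the paper is not reproduced here.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mean_failures(t, a, mu, sigma):
    """Mean value function of an SRGM with normally distributed failure
    times: a is the total expected number of faults, mu/sigma the mean
    and spread of the failure-time distribution."""
    return a * normal_cdf((t - mu) / sigma)

# Illustrative parameters: 100 faults, failures centred at t = 30 days
# with a spread of 10 days.
m10, m30, m90 = (mean_failures(t, 100, 30, 10) for t in (10, 30, 90))
```

    The S-shape (slow start, inflection at mu, saturation at a) is what gives normal-based SRGMs their fitting flexibility relative to exponential-type models.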

  17. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  18. SPOTting Model Parameters Using a Ready-Made Python Package.

    Directory of Open Access Journals (Sweden)

    Tobias Houska

    Full Text Available The choice of a specific parameter estimation method is often driven more by its availability than by its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms and 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel, from the workstation to large computation clusters, using the Message Passing Interface (MPI). We tested SPOTPY in five case studies: parameterizing the Rosenbrock, Griewank and Ackley functions; a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function; and a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having one package at hand that includes a number of well-performing parameter search methods, since not every case study can be solved sufficiently well by every algorithm or every objective function.
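
    The simplest member of the algorithm families such a tool offers is plain Monte Carlo sampling, shown below calibrating the Rosenbrock test function from the first case study. This is a generic sketch, not the SPOTPY API; the bounds and iteration count are illustrative.

```python
import random

def rosenbrock(x, y):
    """Rosenbrock banana function; global minimum 0 at (1, 1)."""
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

def random_search(objective, bounds, n_iter=20000, seed=42):
    """Monte Carlo parameter search: draw parameter sets uniformly within
    the bounds and keep the best-scoring one."""
    rng = random.Random(seed)
    best_p, best_f = None, float("inf")
    for _ in range(n_iter):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(*p)
        if f < best_f:
            best_p, best_f = p, f
    return best_p, best_f

best_p, best_f = random_search(rosenbrock, [(-2, 2), (-1, 3)])
```

    Rosenbrock's narrow curved valley is exactly why pure random sampling stalls and why a package bundling better-performing samplers (Latin hypercube, MCMC, SCE-UA and the like) is valuable.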

  19. Thermal modeling of nuclear waste package designs for disposal in tuff

    International Nuclear Information System (INIS)

    Hockman, J.N.; O'Neal, W.C.

    1983-09-01

    Lawrence Livermore National Laboratory is involved in the design and testing of high-level nuclear waste packages. Many of the aspects of waste package design and testing (e.g., corrosion and leaching) depend in part on the temperature history of the emplaced packages. This paper discusses thermal modeling and analysis of various emplaced waste package conceptual designs including the models used, the assumptions and approximations made, and the results obtained. 6 references, 6 figures, 3 tables

  20. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools

    DEFF Research Database (Denmark)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei

    2017-01-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications …

  1. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  2. PERFORMANCE EVALUATION OF 3D MODELING SOFTWARE FOR UAV PHOTOGRAMMETRY

    OpenAIRE

    H. Yanagi; H. Yanagi; H. Chikatsu

    2016-01-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algori...

  3. The production-distribution problem with order acceptance and package delivery: models and algorithm

    Directory of Open Access Journals (Sweden)

    Khalili Majid

    2016-01-01

    Full Text Available Production planning and distribution are among the most important decisions in the supply chain. Classically, in this problem, it is assumed that all orders have to be produced and separately delivered; in practice, however, an order may be rejected if the cost it brings to the supply chain exceeds its revenue. Moreover, orders can be delivered in batches to reduce the related costs. This paper considers the production planning and distribution problem with order acceptance and package delivery so as to maximize profit. At first, a new mathematical model based on mixed integer linear programming is developed. Using commercial optimization software, the model can optimally solve small or even medium-sized instances. For large instances, a solution method based on imperialist competitive algorithms is also proposed. Using numerical experiments, the proposed model and algorithm are evaluated.
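
    The economics of order acceptance plus batched delivery can be seen on a toy instance solved by brute force (standing in for the MILP, which the paper solves with commercial software). All names and numbers below are hypothetical: each order has a revenue and a production cost, accepted orders ship in batches of up to two, and each batch pays one fixed delivery cost.

```python
from itertools import combinations

def best_plan(orders, batch_cost):
    """Enumerate all acceptance subsets of (revenue, production_cost) orders;
    accepted orders are delivered in batches of up to 2, each batch paying
    one fixed delivery cost. Returns the best subset and its profit."""
    def profit(subset):
        margin = sum(orders[i][0] - orders[i][1] for i in subset)
        n_batches = -(-len(subset) // 2)  # ceil(len/2)
        return margin - batch_cost * n_batches

    all_subsets = (tuple(s)
                   for r in range(len(orders) + 1)
                   for s in combinations(range(len(orders)), r))
    best = max(all_subsets, key=profit)
    return best, profit(best)

# Hypothetical instance: (revenue, production cost), delivery cost 4 per batch.
orders = [(10, 3), (6, 5), (8, 2), (5, 9)]
plan, value = best_plan(orders, batch_cost=4)
```

    Note how order 1, profitable in isolation (margin 1), is rejected: adding it forces a second batch whose delivery cost exceeds its margin. That coupling between acceptance and delivery is what the MILP captures at scale.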

  4. Continuous Time Structural Equation Modeling with R Package ctsem

    Directory of Open Access Journals (Sweden)

    Charles C. Driver

    2017-04-01

    Full Text Available We introduce ctsem, an R package for continuous time structural equation modeling of panel (N > 1) and time series (N = 1) data, using full information maximum likelihood. Most dynamic models (e.g., cross-lagged panel models) in the social and behavioural sciences are discrete time models. An assumption of discrete time models is that time intervals between measurements are equal, and that all subjects were assessed at the same intervals. Violations of this assumption are often ignored due to the difficulty of accounting for varying time intervals; therefore, parameter estimates can be biased and the time course of effects becomes ambiguous. By using stochastic differential equations to estimate an underlying continuous process, continuous time models allow for any pattern of measurement occasions. By interfacing to OpenMx, ctsem combines the flexible specification of structural equation models with the enhanced data gathering opportunities and improved estimation of continuous time models. ctsem can estimate relationships over time for multiple latent processes, measured by multiple noisy indicators with varying time intervals between observations. Within and between effects are estimated simultaneously by modeling both observed covariates and unobserved heterogeneity. Exogenous shocks with different shapes, group differences, higher order diffusion effects and oscillating processes can all be simply modeled. We first introduce and define continuous time models, then show how to specify and estimate a range of continuous time models using ctsem.
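
    The key identity behind continuous time modeling can be shown in the univariate case (in the multivariate case the scalar exponential becomes a matrix exponential). For dx = drift * x dt + noise, the discrete-time autoregressive effect over an interval dt is exp(drift * dt), so unequal intervals map consistently onto comparable parameters. This sketch is in Python rather than R, and the drift value is hypothetical.

```python
import math

def discrete_ar(drift, dt):
    """Discrete-time autoregressive effect implied by a univariate
    continuous-time process with the given (negative) drift, over an
    observation interval of length dt: AR(dt) = exp(drift * dt)."""
    return math.exp(drift * dt)

drift = -0.5                        # hypothetical mean-reverting drift
ar_1month = discrete_ar(drift, 1.0) # effect over a 1-unit interval
ar_3month = discrete_ar(drift, 3.0) # effect over a 3-unit interval
```

    A fixed-lag discrete model fitted to mixed 1- and 3-unit intervals would conflate these two values; the continuous-time parameterization makes AR(3) = AR(1)^3 hold by construction.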

  5. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and v...... data framing protocol....

  6. Extracting software static defect models using data mining

    Directory of Open Access Journals (Sweden)

    Ahmed H. Yousef

    2015-03-01

    Full Text Available Large software projects are subject to quality risks from defective modules that cause failures during software execution. Several software repositories contain the source code of large projects composed of many modules, together with data on the software metrics of these modules and the defective state of each module. In this paper, a data mining approach is used to identify the attributes that predict the defective state of software modules. A software solution architecture is proposed to convert the extracted knowledge into data mining models that can be integrated with current software project metrics and bug data in order to enhance the prediction. The results show better prediction capabilities when all the algorithms are combined using weighted votes. When only one individual algorithm is used, the Naïve Bayes algorithm has the best results, followed by the Neural Network and Decision Tree algorithms.
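
    The Naive Bayes approach that performed best can be sketched from scratch on module metrics. The features and data below are hypothetical toy values (lines of code and cyclomatic complexity for four modules), not from the paper's repositories.

```python
import math
from statistics import mean, pvariance

def gnb_fit(X, y):
    """Gaussian Naive Bayes: per-class prior plus per-feature mean/variance
    (a small floor keeps variances positive)."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        model[c] = (
            len(rows) / len(y),
            [(mean(col), pvariance(col) + 1e-9) for col in zip(*rows)],
        )
    return model

def gnb_predict(model, x):
    """Pick the class with the highest log-posterior under the
    feature-independence assumption."""
    def log_post(c):
        prior, stats = model[c]
        ll = math.log(prior)
        for xi, (m, v) in zip(x, stats):
            ll += -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
        return ll
    return max(model, key=log_post)

# Toy metrics per module: [lines of code, cyclomatic complexity]; 1 = defective.
X = [[120, 4], [80, 3], [900, 25], [700, 20]]
y = [0, 0, 1, 1]
model = gnb_fit(X, y)
pred = gnb_predict(model, [850, 22])
```

    A weighted vote, as the paper reports, would combine such per-algorithm predictions (Naive Bayes, neural network, decision tree) with weights reflecting each model's validation accuracy.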

  7. Toward computerized morphometric facilities: a review of 58 software packages for computer-aided three-dimensional reconstruction, quantification, and picture generation from parallel serial sections

    NARCIS (Netherlands)

    Huijsmans, D. P.; Lamers, W. H.; Los, J. A.; Strackee, J.

    1986-01-01

    This review gives an inventory of 58 computer-aided three-dimensional reconstruction applications in the domain of biomedical research. It is devoted to the formulation of a set of recommendations thought to be necessary for improved performance of software packages in this field. These

  8. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe ''computer science'' objects like abstract data types, but in practice software errors often arise because ''real-world'' objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models of real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  9. A Model for Joint Software Reviews

    Science.gov (United States)

    1998-10-01

    or more specified systems. Figure 5: Quality Characteristics [ISO/IEC 9126-1, 1996] Despite the lack of prior study and classification of IR issues...Company. [ISO/IEC 9126-1, 1996] Information Technology - Software quality characteristics and metrics - Part 1: Quality characteristics and sub-characteristics, Standard (No. ISO/IEC 9126-1). [ISO/IEC 12207, 1995] Information Technology Software Life Cycle Processes, Standard (No. ISO/IEC 12207

  10. Safety Analysis Report for Packaging, Y-12 National Security Complex, Model ES-3100 Package with Bulk HEU Contents

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, James [Y-12 National Security Complex, Oak Ridge, TN (United States); Goins, Monty [Y-12 National Security Complex, Oak Ridge, TN (United States); Paul, Pran [Y-12 National Security Complex, Oak Ridge, TN (United States); Wilkinson, Alan [Y-12 National Security Complex, Oak Ridge, TN (United States); Wilson, David [Y-12 National Security Complex, Oak Ridge, TN (United States)

    2015-09-03

    This safety analysis report for packaging (SARP) presents the results of the safety analysis prepared in support of the Consolidated Nuclear Security, LLC (CNS) request for licensing of the Model ES-3100 package with bulk highly enriched uranium (HEU) contents and issuance of a Type B(U) Fissile Material Certificate of Compliance. This SARP, published in the format specified in the Nuclear Regulatory Commission (NRC) Regulatory Guide 7.9 and using information provided in UCID-21218 and NRC Regulatory Guide 7.10, demonstrates that the Y-12 National Security Complex (Y-12) ES-3100 package with bulk HEU contents meets the established NRC regulations for packaging, preparation for shipment, and transportation of radioactive materials given in Title 10, Part 71, of the Code of Federal Regulations (CFR) [10 CFR 71] as well as U.S. Department of Transportation (DOT) regulations for packaging and shipment of hazardous materials given in Title 49 CFR. To protect the health and safety of the public, shipments of radioactive materials are made in packaging that is designed, fabricated, assembled, tested, procured, used, maintained, and repaired in accordance with the provisions cited above. Safety requirements addressed by the regulations that must be met when transporting radioactive materials are containment of radioactive materials, radiation shielding, and assurance of nuclear subcriticality.

  11. SCIATRAN 3.1: A new radiative transfer model and retrieval package

    Science.gov (United States)

    Rozanov, Alexei; Rozanov, Vladimir; Kokhanovsky, Alexander; Burrows, John P.

    The SCIATRAN 3.1 package is the result of further development of the SCIATRAN 2.X software family which, like previous versions, comprises a radiative transfer model and a retrieval block. After the implementation of the vector radiative transfer model in SCIATRAN 3.0, the spectral range covered by the model has been extended into the thermal infrared, ranging to approximately 40 micrometers. Another major improvement concerns the treatment of underlying surface effects. Among others, a sophisticated representation of the water surface with a bidirectional reflectance distribution function (BRDF) has been implemented, accounting for the Fresnel reflection of polarized light and for the effect of foam. A newly developed representation of a snow surface allows radiative transfer calculations to be performed within an unpolluted or soiled snow layer. Furthermore, a new approach has been implemented allowing radiative transfer calculations to be performed for a coupled atmosphere-ocean system. This means that the underlying ocean is no longer considered as a purely reflecting surface. Instead, full radiative transfer calculations are performed within the water, allowing the user to simulate the radiance within both the atmosphere and the ocean. Similar to previous versions, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR-TIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer location within or outside the Earth's atmosphere, including underwater observations. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new features of the radiative transfer model, is given, including remarks on its availability for the scientific community.
Furthermore, some application examples of the radiative transfer model are

  12. Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models

    Science.gov (United States)

    Chu, A.

    2014-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of the ETAS models described in Ogata (1998). To find the maximum-likelihood estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and the temporal and spatial parameters that govern triggering effects by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Although other computer programs exist for similar data modeling purposes, using the EM algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial regions that are very long and narrow cause difficulties in optimization convergence, and flat or multi-modal log-likelihood functions raise similar issues. My program uses a robust method to preset a parameter to overcome this non-convergence issue. In addition to model fitting, the software is equipped with useful tools for examining model fitting results, for example, visualization of the estimated conditional intensity and estimation of the expected number of triggered aftershocks. A simulation generator is also provided, with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has potential to be hosted online. The Java language is used for the software's core computing part, and an optional interface to the statistical package R is provided.
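The core of a temporal ETAS model is its conditional intensity: a constant background rate plus Omori-law aftershock triggering scaled by an exponential magnitude-productivity term. A plain-Python sketch follows; the parameter names are illustrative textbook notation, not this software's actual interface:

```python
import math

def etas_intensity(t, events, mu, K, alpha, c, p, m0):
    """Temporal ETAS conditional intensity at time t:
    lambda(t) = mu + sum over past events (t_i, m_i) of
    K * exp(alpha * (m_i - m0)) * (t - t_i + c)**(-p),
    where mu is the background rate and m0 the magnitude cutoff."""
    rate = mu
    for ti, mi in events:
        if ti < t:
            rate += K * math.exp(alpha * (mi - m0)) * (t - ti + c) ** (-p)
    return rate

# With no past events the intensity is just the background rate:
print(etas_intensity(1.0, [], 0.5, 1.0, 1.0, 0.01, 1.1, 4.0))          # -> 0.5
# One cutoff-magnitude event at t=0 with K=1, alpha=0, c=1, p=1:
print(etas_intensity(1.0, [(0.0, 5.0)], 0.0, 1.0, 0.0, 1.0, 1.0, 5.0))  # -> 0.5
```

The EM fitting step the abstract describes alternates between attributing each event to the background or to a triggering parent and re-estimating these parameters from those attributions.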

  13. Suitability of semi-automated tumor response assessment of liver metastases using a dedicated software package

    International Nuclear Information System (INIS)

    Kalkmann, Janine; Ladd, S.C.; Greiff, A. de; Forsting, M.; Stattaus, J.

    2010-01-01

    Purpose: to evaluate the suitability of semi-automated compared to manual tumor response assessment (TRA) of liver metastases. Materials and methods: in total, 32 patients with colorectal cancer and liver metastases were followed by an average of 2.8 contrast-enhanced CT scans. Two observers (O1, O2) measured the longest diameter (LD) of 269 liver metastases manually and semi-automatically using software installed as thin-client on a PACS workstation (LMS-Liver, MEDIAN Technologies). LD and TRA (''progressive'', ''stable'', ''partial remission'') were performed according to RECIST (Response Evaluation Criteria in Solid Tumors) and analyzed for between-method, interobserver and intraobserver variability. The time needed for evaluation was compared for both methods. Results: all measurements correlated excellently (r ≥ 0.96). Intraobserver (semi-automated), interobserver (manual) and between-method differences (by O1) in LD of 1.4 ± 2.6 mm, 1.9 ± 1.9 mm and 2.1 ± 2.0 mm, respectively, were not significant. Interobserver (semi-automated) and between-method (by O2) differences in LD of 3.0 ± 3.0 mm and 2.6 ± 2.0 mm, respectively, reflected a significant variability (p < 0.01). The interobserver agreement in manual and semi-automated TRA was 91.4%. The intraobserver agreement in semi-automated TRA was 84.5%. Between both methods a TRA agreement of 86.2% was obtained. Semi-automated evaluation (2.7 min) took slightly more time than manual evaluation (2.3 min). Conclusion: semi-automated and manual evaluation of liver metastases yield comparable results in response assessments and require comparable effort. (orig.)

  14. SPOTting model parameters using a ready-made Python package

    Science.gov (United States)

    Houska, Tobias; Kraft, Philipp; Breuer, Lutz

    2015-04-01

    The selection and parameterization of reliable process descriptions in ecological modelling is driven by several uncertainties. The procedure is highly dependent on various criteria, such as the algorithm used, the likelihood function selected and the definition of the prior parameter distributions. A wide variety of tools have been developed in the past decades to optimize parameters. Some of these tools are closed source. Because of this, the choice of a specific parameter estimation method is sometimes driven more by its availability than by its performance. A toolbox with a large set of methods can support users in deciding on the most suitable method. Further, it enables testing and comparing different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open-source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes with a selected set of algorithms for parameter optimization and uncertainty analyses (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood Estimation, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (Bias, (log-) Nash-Sutcliffe model efficiency, Correlation Coefficient, Coefficient of Determination, Covariance, (Decomposed-, Relative-, Root-) Mean Squared Error, Mean Absolute Error, Agreement Index) and prior distributions (Binomial, Chi-Square, Dirichlet, Exponential, Laplace, (log-, multivariate-) Normal, Pareto, Poisson, Cauchy, Uniform, Weibull) to sample from. The model-independent structure makes it suitable for analyzing a wide range of applications. We apply all algorithms of the SPOT package in three different case studies. Firstly, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Secondly, we study the Griewank function, which has a challenging response surface for
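The first case study, the Rosenbrock function, can be illustrated with the simplest of the listed samplers, plain Monte Carlo. This sketch is independent of SPOT's actual API and only demonstrates the idea of sampling a parameter space against an objective function:

```python
import random

def rosenbrock(x, y):
    """Classic banana-shaped test function with its minimum of 0 at (1, 1)."""
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

# Plain Monte Carlo: draw uniform samples over the prior box and keep the best.
random.seed(0)
best, best_f = None, float("inf")
for _ in range(20000):
    x = random.uniform(-2, 2)
    y = random.uniform(-2, 2)
    f = rosenbrock(x, y)
    if f < best_f:
        best, best_f = (x, y), f

print(best, best_f)  # best sample drifts toward the optimum at (1, 1)
```

More sophisticated samplers from the list (LHS, MCMC, SCE-UA) differ mainly in how the next candidate points are proposed, not in this evaluate-and-compare loop.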

  15. Beyond Reactive Planning: Self Adaptive Software and Self Modeling Software in Predictive Deliberation Management

    National Research Council Canada - National Science Library

    Lenahan, Jack; Nash, Michael P; Charles, Phil

    2008-01-01

    .... We present the following hypothesis: predictive deliberation management using self-adapting and self-modeling software will be required to provide mission planning adjustments after the start of a mission...

  16. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through a ‘manual’ management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  17. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    OpenAIRE

    Jump, David

    2014-01-01

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: “Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing...

  18. Use of advanced modeling techniques to optimize thermal packaging designs.

    Science.gov (United States)

    Formato, Richard M; Potami, Raffaele; Ahmed, Iftekhar

    2010-01-01

    Through a detailed case study the authors demonstrate, for the first time, the capability of using advanced modeling techniques to correctly simulate the transient temperature response of a convective flow-based thermal shipper design. The objective of this case study was to demonstrate that simulation could be utilized to design a 2-inch-wall polyurethane (PUR) shipper to hold its product box temperature between 2 and 8 °C over the prescribed 96-h summer profile (product box is the portion of the shipper that is occupied by the payload). Results obtained from numerical simulation are in excellent agreement with empirical chamber data (within ±1 °C at all times), and geometrical locations of simulation maximum and minimum temperature match well with the corresponding chamber temperature measurements. Furthermore, a control simulation test case was run (results taken from identical product box locations) to compare the coupled conduction-convection model with a conduction-only model, which to date has been the state-of-the-art method. For the conduction-only simulation, all fluid elements were replaced with "solid" elements of identical size and assigned thermal properties of air. While results from the coupled thermal/fluid model closely correlated with the empirical data (±1 °C), the conduction-only model was unable to correctly capture the payload temperature trends, showing a sizeable error compared to empirical values (ΔT > 6 °C). A modeling technique capable of correctly capturing the thermal behavior of passively refrigerated shippers can be used to quickly evaluate and optimize new packaging designs. Such a capability provides a means to reduce the cost and required design time of shippers while simultaneously improving their performance. Another advantage comes from using thermal modeling (assuming a validated model is available) to predict the temperature distribution in a shipper that is exposed to ambient temperatures which were not bracketed

  19. The General-Use Nodal Network Solver (GUNNS) Modeling Package for Space Vehicle Flow System Simulation

    Science.gov (United States)

    Harvey, Jason; Moore, Michael

    2013-01-01

    The General-Use Nodal Network Solver (GUNNS) is a modeling software package that combines nodal analysis and the hydraulic-electric analogy to simulate fluid, electrical, and thermal flow systems. GUNNS is developed by L-3 Communications under the TS21 (Training Systems for the 21st Century) project for NASA Johnson Space Center (JSC), primarily for use in space vehicle training simulators at JSC. It has sufficient compactness and fidelity to model the fluid, electrical, and thermal aspects of space vehicles in real-time simulations running on commodity workstations, for vehicle crew and flight controller training. It has a reusable and flexible component and system design, and a Graphical User Interface (GUI), providing capability for rapid GUI-based simulator development, ease of maintenance, and associated cost savings. GUNNS is optimized for NASA's Trick simulation environment, but can be run independently of Trick.
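The nodal analysis that GUNNS builds on amounts to assembling a conductance matrix from the hydraulic-electric analogy and solving for the node potentials. A minimal sketch with a hypothetical two-node network (this is not GUNNS code, just the underlying linear-algebra idea):

```python
def solve_nodal(G, b):
    """Solve G @ v = b by Gauss-Jordan elimination with partial pivoting,
    where G is the node conductance matrix and b the net source flow
    injected at each node; returns the node potentials v."""
    n = len(G)
    A = [row[:] + [b[i]] for i, row in enumerate(G)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(n):
            if r != col:
                f = A[r][col] / A[col][col]
                for k in range(col, n + 1):
                    A[r][k] -= f * A[col][k]
    return [A[i][n] / A[i][i] for i in range(n)]

# Two nodes joined by conductance 1.0, each tied to ground by 1.0,
# with a unit flow injected at node 0 (all values hypothetical):
G = [[2.0, -1.0],
     [-1.0, 2.0]]
print(solve_nodal(G, [1.0, 0.0]))  # -> approximately [0.6667, 0.3333]
```

The same matrix form covers fluid, electrical, and thermal networks; only the physical meaning of the conductances and potentials changes, which is what makes the solver reusable across domains.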

  20. Application of the Finite Elemental Analysis to Modeling Temperature Change of the Vaccine in an Insulated Packaging Container during Transport.

    Science.gov (United States)

    Ge, Changfeng; Cheng, Yujie; Shen, Yan

    2013-01-01

    This study demonstrated an attempt to predict temperatures of a perishable product such as vaccine inside an insulated packaging container during transport through finite element analysis (FEA) modeling. In order to use standard FEA software for the simulation, an equivalent heat conduction coefficient is proposed and calculated to describe the heat transfer of the air trapped inside the insulated packaging container. The three-dimensional insulated packaging container is regarded as a combination of six panels, and the heat flow at each side panel is a one-dimensional diffusion process. A transient thermal analysis was applied to simulate the heat transfer process from the ambient environment to the inside of the container. Field measurements were carried out to collect the temperature during transport, and the collected data were compared to the FEA simulation results. Insulated packaging containers are used to transport temperature-sensitive products such as vaccines and other pharmaceutical products. The container is usually made of an extruded polystyrene foam filled with gel packs. World Health Organization guidelines recommend that all vaccines except oral polio vaccine be distributed in an environment where the temperature ranges between +2 and +8 °C. The primary areas of concern in designing the packaging for vaccine are how much foam thickness and how many gel packs should be used in order to keep the temperature in the desired range, and how to prevent the vaccine from exposure to freezing temperatures. This study uses numerical simulation to predict temperature change within an insulated packaging container in the vaccine cold chain. It is our hope that this simulation will provide the vaccine industries with an alternative engineering tool to validate vaccine packaging and project thermal equilibrium within the insulated packaging container.
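The per-panel one-dimensional diffusion idea described above can be sketched with an explicit finite-difference scheme. All geometry, material, and temperature values below are hypothetical placeholders, not the paper's validated FEA setup:

```python
# 1-D explicit finite difference for transient conduction through one panel:
# dT/dt = alpha * d2T/dx2, outer face held at ambient, inner face insulated.

def step(T, alpha, dx, dt, t_ambient):
    """Advance the temperature profile T by one explicit time step."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit violated"
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
    new[0] = t_ambient        # outer face: ambient (Dirichlet condition)
    new[-1] = new[-2]         # inner face: insulated (zero-gradient Neumann)
    return new

# Hypothetical 5 cm foam panel, initially at 5 degC, 30 degC ambient outside:
T = [5.0] * 21
for _ in range(1000):
    T = step(T, alpha=1e-7, dx=0.0025, dt=10.0, t_ambient=30.0)
print(round(T[-1], 2))  # inner face slowly warms, staying between 5 and 30 degC
```

A full model like the paper's additionally couples six such panels through the trapped air (via the equivalent conduction coefficient) and adds the gel packs' heat capacity, which is what keeps the payload in range for the full transport profile.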

  1. Development of an Environment for Software Reliability Model Selection

    Science.gov (United States)

    1992-09-01

    now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling... multiversion programming, 7. Hardware can be repaired by spare modules, which is not the case for software... Preventive maintenance is very important

  2. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly growing, cancer registry is of great importance as the main core of cancer control programs, and many different software packages have been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documents and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of the validation, an agreement coefficient of 75% was determined in order to apply changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study contains a tool and a method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method of this study was chosen as a criteria-based evaluation method based on the findings. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while trying to fulfill the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  3. Capability Maturity Model (CMM) for Software Process Improvements

    Science.gov (United States)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  4. Modeling the oxygen diffusion of nanocomposite-based food packaging films.

    Science.gov (United States)

    Bhunia, Kanishka; Dhawan, Sumeet; Sablani, Shyam S

    2012-07-01

    Polymer-layered silicate nanocomposites have been shown to improve the gas barrier properties of food packaging polymers. This study developed a computer simulation model using the commercial software COMSOL Multiphysics to analyze changes in oxygen barrier properties, in terms of relative diffusivity, as influenced by configuration and structural parameters that include volume fraction (φ), aspect ratio (α), intercalation width (W), and orientation angle (θ) of nanoparticles. The simulation was performed at different φ (1%, 3%, 5%, and 7%), α (50, 100, 500, and 1000), and W (1, 3, 5, and 7 nm). The θ value was varied from 0° to 85°. Results show that diffusivity decreases with increasing volume fraction, but beyond φ = 5% and α = 500, diffusivity remained almost constant at W values of 1 and 3 nm. Higher relative diffusivity coincided with increasing W and decreasing α for the same volume fraction of nanoparticles. Diffusivity increased as the rotational angle increased, gradually diminishing the influence of the nanoparticles; it increased drastically as θ changed from 15° to 30° (the relative increment in relative diffusivity was almost 3.5 times). Nanoparticles in the exfoliated configuration exhibited better oxygen barrier properties than intercalated ones. The finite element model developed in this study provides insight into oxygen barrier properties for nanocomposites with a wide range of structural parameters. This model can be used to design and manufacture an ideal nanocomposite-based food packaging film with improved gas barrier properties for industrial applications. The model will assist in designing nanocomposite polymeric structures of desired gas barrier properties for food packaging applications. In addition, this study will be helpful in formulating a combination of nanoparticle structural parameters for designing nanocomposite membranes with selective permeability for industrial applications including membrane
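For comparison with such FEA results, the classical Nielsen expression gives a closed-form estimate of relative diffusivity for aligned impermeable platelets. This is a standard analytical baseline, not the paper's finite element model, and it ignores the intercalation width and orientation effects the study examines:

```python
def nielsen_relative_diffusivity(phi, aspect_ratio):
    """Nielsen estimate of relative diffusivity D/D0 for a polymer filled
    with aligned impermeable platelets: (1 - phi) / (1 + alpha*phi/2),
    where phi is the filler volume fraction and alpha the aspect ratio."""
    return (1 - phi) / (1 + aspect_ratio * phi / 2)

# Tabulate the barrier improvement at the study's volume fractions, alpha = 100:
for phi in (0.01, 0.03, 0.05, 0.07):
    print(phi, round(nielsen_relative_diffusivity(phi, 100), 3))
```

As in the FEA results, the analytical estimate falls with increasing volume fraction and aspect ratio, because longer, more numerous platelets lengthen the tortuous path an oxygen molecule must diffuse around.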

  5. Software reliability growth model for safety systems of nuclear reactor

    International Nuclear Information System (INIS)

    Thirugnana Murthy, D.; Murali, N.; Sridevi, T.; Satya Murty, S.A.V.; Velusamy, K.

    2014-01-01

    The demand for complex software systems has increased more rapidly than the ability to design, implement, test, and maintain them, and the reliability of software systems has become a major concern for our modern society. Software failures have impaired several high-visibility programs in the space, telecommunications, defense and health industries. Besides the costs involved, such failures set back the projects. This paper discusses the need for systematic approaches to measuring and assuring software reliability, which consumes a major share of project development resources, and the ways of quantifying reliability and using it for improvement and control of the software development and maintenance process. It covers reliability models with a focus on 'reliability growth'. It includes data collection on reliability; statistical estimation and prediction; and metrics and attributes of product architecture, design, software development, and the operational environment. Besides its use for operational decisions like deployment, it includes guiding software architecture, development, testing, and verification and validation. (author)
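A concrete instance of the reliability growth models the paper surveys is the Goel-Okumoto NHPP model, shown here as a standard textbook example rather than necessarily the authors' chosen model:

```python
import math

def go_mean_failures(t, a, b):
    """Goel-Okumoto mean value function m(t) = a * (1 - exp(-b*t)):
    expected cumulative failures observed by test time t, where a is the
    total expected number of faults and b the per-fault detection rate."""
    return a * (1 - math.exp(-b * t))

def go_intensity(t, a, b):
    """Failure intensity lambda(t) = m'(t) = a*b*exp(-b*t); it decays as
    faults are found and fixed, which is the 'reliability growth'."""
    return a * b * math.exp(-b * t)

# Hypothetical parameters: 100 total latent faults, detection rate 0.05/hour.
print(go_mean_failures(0.0, 100.0, 0.05))   # -> 0.0
print(round(go_mean_failures(24.0, 100.0, 0.05), 1))  # failures seen after a day
```

In practice a and b are estimated from observed failure times (e.g. by maximum likelihood), and the fitted intensity supports the deployment decisions the abstract mentions.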

  6. PtProcess: An R Package for Modelling Marked Point Processes Indexed by Time

    Directory of Open Access Journals (Sweden)

    David Harte

    2010-10-01

    Full Text Available This paper describes the package PtProcess which uses the R statistical language. The package provides a unified approach to fitting and simulating a wide variety of temporal point process or temporal marked point process models. The models are specified by an intensity function which is conditional on the history of the process. The user needs to provide routines for calculating the conditional intensity function. Then the package enables one to carry out maximum likelihood fitting, goodness of fit testing, simulation and comparison of models. The package includes the routines for the conditional intensity functions for a variety of standard point process models. The package is intended to simplify the fitting of point process models indexed by time in much the same way as generalized linear model programs have simplified the fitting of various linear models. The primary examples used in this paper are earthquake sequences but the package is intended to have a much wider applicability.
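The fitting approach described above, maximum likelihood from a user-supplied conditional intensity, rests on the point-process log-likelihood: the sum of log-intensities at the observed events minus the integrated intensity over the observation window. A sketch in Python (PtProcess itself is an R package; the names and the grid-based integration here are illustrative):

```python
import math

def log_likelihood(times, T, cif, n_grid=10000):
    """Point-process log-likelihood on [0, T] for event times `times` and a
    conditional intensity function cif(t, history):
    sum_i log cif(t_i) - integral_0^T cif(t) dt (midpoint-rule approximation)."""
    ll = 0.0
    for i, t in enumerate(times):
        ll += math.log(cif(t, times[:i]))
    dt = T / n_grid
    integral = sum(cif((k + 0.5) * dt, [s for s in times if s < (k + 0.5) * dt])
                   for k in range(n_grid)) * dt
    return ll - integral

# Homogeneous Poisson process (rate 2) as the simplest conditional intensity:
def const(t, history):
    return 2.0

print(round(log_likelihood([0.5, 1.2, 3.3], 5.0, const), 3))  # -> -7.921
```

Swapping in a history-dependent `cif` (e.g. an ETAS-style intensity) changes nothing in this function, which is exactly the model-independence the package's design exploits.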

  7. Software for Mathematical Modeling of Plastic Deformation in FCC Metals

    Science.gov (United States)

    Petelin, A. E.; Eliseev, A. S.

    2017-08-01

    The necessity of software implementation for studying plastic deformation in FCC metals with mathematical modeling methods is investigated. This article describes the implementation features and the possibilities of using the software Dislocation Dynamics of Crystallographic Slip (DDCS). The software has an advanced user interface and is designed for users without extensive experience in IT. Parameter values of the mathematical model, obtained from field experiments and accumulated in a special database, are used in DDCS to carry out computational experiments. Moreover, the software is capable of accumulating the bibliographic information used in research.

  8. Operation manual for EDXRDDA - a software package for Bragg peak analysis of energy dispersive powder X-ray diffraction data

    International Nuclear Information System (INIS)

    Jayaswal, Balhans; Vijaykumar, V.; Momin, S.N.; Sikka, S.K.

    1992-01-01

    EDXRDDA is a software package for the analysis of raw energy dispersive X-ray diffraction data from powder samples. It resolves the spectra into individual peaks by a constrained non-linear least squares method (Hughes and Sexton, 1988). The profile function adopted is the Gaussian/Lorentzian product, with the mixing ratio refinable in the program. The program is implemented on an IBM PC and is highly interactive, with extensive plotting facilities. This report is a user's guide for running the program. In the first step, after inputting the spectrum, the full spectrum is plotted on the screen. The user then chooses a portion of it for peak resolution. Initial guesses for the peak intensities and positions are input with the help of a cursor or a mouse. Up to twenty peaks can be fitted at a time in an interval of 500 channels. For overlapping peaks, various constraints can be applied. Bragg peaks and fluorescence peaks with different half widths can be handled simultaneously. On execution, the program produces a look-up table which contains the refined values of the peak position, half width, peak intensity and integrated intensity of each peak, along with their error estimates. The program is very general and can also be used for curve fitting of data from many other experiments. (author). 2 refs., 7 figs., 2 appendices
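A Gaussian/Lorentzian product profile with a refinable mixing ratio can be written down directly. The parameterization below is one common variant and may differ in detail from the form the report adopts from Hughes and Sexton (1988):

```python
import math

def gl_product(x, center, height, fwhm, eta):
    """Gaussian/Lorentzian product peak profile. eta is the refinable mixing
    ratio in this hypothetical parameterization: eta=0 gives a pure Gaussian,
    eta=1 a pure Lorentzian; either way the full width at half maximum is fwhm."""
    u = (x - center) / (fwhm / 2)
    gauss = math.exp(-math.log(2) * u * u)   # value 0.5 at u = +/-1
    lorentz = 1.0 / (1.0 + u * u)            # value 0.5 at u = +/-1
    return height * gauss ** (1 - eta) * lorentz ** eta

print(gl_product(0.0, 0.0, 10.0, 2.0, 0.3))  # -> 10.0 (peak height at the center)
```

A least-squares fitter then adjusts center, height, fwhm, and eta per peak (with constraints for overlapping peaks) to minimize the misfit to the measured channel counts.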

  9. Modeling Software Evolution using Algebraic Graph Rewriting

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Avgeriou, P.; Zdun, U.; Borne, I.

    We show how evolution requests can be formalized using algebraic graph rewriting. In particular, we present a way to convert UML class diagrams to colored graphs. Since changes in software may affect the relations between the methods of classes, our colored graph representation also employs the

  10. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    , communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite...... to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...

  11. Migration modeling to estimate exposure to chemicals in food packaging for application in high-throughput risk-based screening and Life Cycle Assessment

    DEFF Research Database (Denmark)

    Ernstoff, Alexi; Jolliet, O.; Huang, L.

    2017-01-01

    and risk prioritization and screening. To fulfill the need for a migration model flexibly suitable for such tools, we develop an accurate and rapid (high-throughput) approach. The developed model estimates the fraction of an organic chemical migrating from polymeric packaging into food for user-defined...... instantaneously estimates migration from packaging into food for user-defined scenarios, and has improved performance over common model simplifications. The common practice of setting the package-food partition coefficient = 1 for specific "worst-case" scenarios is insufficient to predict the equilibrium......Specialty software and simplified models are often used to estimate "worst-case" migration of potentially toxic chemicals from packaging into food. Current approaches, however, cannot efficiently and accurately provide estimates of migration for emerging applications, e.g. in Life Cycle Assessment...

  12. Software for medical image based phantom modelling

    International Nuclear Information System (INIS)

    Possani, R.G.; Massicano, F.; Coelho, T.S.; Yoriyaz, H.

    2011-01-01

    The latest treatment planning systems depend strongly on CT images, so the tendency is for dosimetry procedures in nuclear medicine therapy to also be based on images, such as magnetic resonance imaging (MRI) or computed tomography (CT), to extract anatomical and histological information, as well as functional imaging or activity maps such as PET or SPECT. This information, associated with radiation transport simulation software, is used to estimate the internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo code MCNP to perform the simulation of radiation transport. The user therefore does not need to understand the complex process of inputting data into MCNP, as the SCMS is responsible for automatically constructing the anatomical data for the patient, as well as the radioactive source data. The SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. Thus, the new software has a number of improvements, such as an intuitive GUI and a menu for the selection of the energy spectra corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)
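The image-to-input translation described here can be sketched as a voxel-wise mapping from CT numbers (Hounsfield units) to material identifiers that a Monte Carlo input builder would then write out. The HU ranges and material IDs below are rough illustrative placeholders, not the actual SCMS mapping:

```python
# Illustrative HU-to-material bins: (lower bound, upper bound, material id).
# The thresholds are rough textbook values, not those used by SCMS.
HU_BINS = [
    (-1100, -200, 1),  # air / lung
    (-200, 200, 2),    # soft tissue
    (200, 3100, 3),    # bone
]

def material_id(hu):
    """Map one voxel's Hounsfield value to a material identifier."""
    for lo, hi, mat in HU_BINS:
        if lo <= hu < hi:
            return mat
    return 0  # outside the calibrated range

def segment(image):
    """Apply the mapping to a 2D slice (a list of rows of HU values)."""
    return [[material_id(hu) for hu in row] for row in image]

slice_ids = segment([[-800, 30], [40, 900]])
```

A real interface like SCMS would additionally emit the lattice geometry, material cards and source definition in the Monte Carlo code's input syntax.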

  13. Software to Enable Modeling & Simulation as a Service

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a Modeling and Simulation as a Service (M&SaaS) software service infrastructure to enable most modeling and simulation (M&S) activities to be...

  14. Fullrmc, a rigid body Reverse Monte Carlo modeling package enabled with machine learning and artificial intelligence.

    Science.gov (United States)

    Aoun, Bachir

    2016-05-05

    A new Reverse Monte Carlo (RMC) package, "fullrmc", for atomic or rigid-body and molecular, amorphous, or crystalline materials is presented. fullrmc's main purpose is to provide fully modular, fast and flexible software, thoroughly documented, supporting complex molecules, written in a modern programming language (Python, Cython, C and C++ where performance is needed) and complying with modern programming practices. fullrmc's approach to solving an atomic or molecular structure differs from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves endorsed with reinforcement machine learning to groups of atoms. While fullrmc allows running traditional RMC modeling, the uniqueness of this approach resides in its ability to customize the grouping of atoms in any convenient way with no additional programming effort and to apply smart, more physically meaningful moves to the defined groups of atoms. In addition, fullrmc provides a unique way, at almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring, through and beyond disallowed positions and energy barriers, the unrestricted three-dimensional space around a group. © 2016 Wiley Periodicals, Inc.
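The "randomly adjust atom positions until consistency with experimental data" loop that the abstract contrasts against can be sketched in a few lines. This is a greedy toy variant in one dimension (traditional RMC also accepts some worsening moves with a Metropolis-like probability), and every name and observable here is illustrative:

```python
import random

def chi2(model, experiment):
    """Squared misfit between a model observable and experimental data."""
    return sum((m - e) ** 2 for m, e in zip(model, experiment))

def pair_histogram(positions, bins):
    """Crude pair-distance histogram standing in for a real observable
    such as a pair distribution function."""
    hist = [0] * len(bins)
    for i, a in enumerate(positions):
        for b in positions[i + 1:]:
            d = abs(a - b)
            for k, (lo, hi) in enumerate(bins):
                if lo <= d < hi:
                    hist[k] += 1
                    break
    return hist

def rmc(positions, exp_hist, bins, steps=2000, max_move=0.3, seed=1):
    """Greedy RMC: random single-atom moves, kept only if chi2 improves."""
    rng = random.Random(seed)
    pos = list(positions)
    best = chi2(pair_histogram(pos, bins), exp_hist)
    for _ in range(steps):
        i = rng.randrange(len(pos))
        old = pos[i]
        pos[i] += rng.uniform(-max_move, max_move)
        new = chi2(pair_histogram(pos, bins), exp_hist)
        if new <= best:
            best = new
        else:
            pos[i] = old  # reject the move
    return pos, best
```

fullrmc's contribution, per the abstract, is to replace the blind single-atom move in such a loop with learned, physically meaningful moves applied to user-defined groups of atoms.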

  15. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  16. SET-MM – A Software Evaluation Technology Maturity Model

    OpenAIRE

    García-Castro, Raúl

    2011-01-01

    The application of software evaluation technologies in different research fields to verify and validate research is a key factor in the progressive evolution of those fields. Nowadays, however, to have a clear picture of the maturity of the technologies used in evaluations or to know which steps to follow in order to improve the maturity of such technologies is not easy. This paper describes a Software Evaluation Technology Maturity Model that can be used to assess software evaluation tech...

  17. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    Science.gov (United States)

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.

  18. Java-based graphical user interface for MRUI, a software package for quantitation of in vivo/medical magnetic resonance spectroscopy signals.

    Science.gov (United States)

    Naressi, A; Couturier, C; Castang, I; de Beer, R; Graveron-Demilly, D

    2001-07-01

    This article describes a Java-based graphical user interface for the magnetic resonance user interface (MRUI) quantitation package. This package allows MR spectroscopists to easily perform time-domain analysis of in vivo/medical MR spectroscopy data. We have found that the Java programming language is very well suited for developing highly interactive graphical software applications such as the MRUI system. We also have established that MR quantitation algorithms, programmed in the past in other languages, can easily be embedded into the Java-based MRUI by using the Java native interface (JNI).

  19. AGING PERFORMANCE OF MODEL 9975 PACKAGE FLUOROELASTOMER O-RINGS

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, E.; Daugherty, W.; Skidmore, E.; Dunn, K.; Fisher, D.

    2011-05-31

    The influence of temperature and radiation on Viton® GLT and GLT-S fluoroelastomer O-rings is an ongoing research focus at the Savannah River National Laboratory. The O-rings are credited for leaktight containment in the Model 9975 shipping package used for transportation of plutonium-bearing materials. At the Savannah River Site, the Model 9975 packages are being used for interim storage. Primary research efforts have focused on surveillance of O-rings from actual packages, leak testing of seals at bounding aging conditions and the effect of aging temperature on compression stress relaxation behavior, with the goal of service life prediction for long-term storage conditions. Recently, an additional effort to evaluate the effect of aging temperature on the oxidation of the materials has begun. Degradation in the mechanical properties of elastomers is directly related to the oxidation of the polymer. Sensitive measurements of the oxidation rate can be performed in a more timely manner than waiting for a measurable change in mechanical properties, especially at service temperatures. Measuring the oxidation rate therefore provides a means to validate the assumption that the degradation mechanism(s) do not change between the elevated temperatures used for accelerated aging and the lower service temperatures. Monitoring the amount of oxygen uptake by the material over time at various temperatures can provide increased confidence in lifetime predictions. Preliminary oxygen consumption analysis of a Viton GLT-based fluoroelastomer compound (Parker V0835-75) using an Oxzilla II differential oxygen analyzer in the temperature range of 40-120 °C was performed. Early data suggest oxygen consumption rates may level off within the first 100,000 hours (10-12 years) at 40 °C and that sharp changes in the degradation mechanism (stress relaxation) are not expected over the temperature range examined. This is consistent with the known long-term heat aging resistance of
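The logic of extrapolating elevated-temperature aging to service temperature is usually framed as an Arrhenius acceleration factor. The sketch below assumes a single activation energy over the whole range (exactly the assumption the oxygen-consumption measurements described above are meant to validate); the 90 kJ/mol value is purely illustrative, not a measured property of these O-rings:

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def acceleration_factor(Ea, T_aged_C, T_service_C):
    """How much faster degradation proceeds at the accelerated-aging
    temperature than at the service temperature, assuming Arrhenius
    kinetics with a single activation energy Ea (J/mol)."""
    T_aged = T_aged_C + 273.15
    T_service = T_service_C + 273.15
    return math.exp((Ea / R) * (1.0 / T_service - 1.0 / T_aged))

# Illustrative only: Ea = 90 kJ/mol, aging at 120 °C vs service at 40 °C.
af = acceleration_factor(90e3, 120.0, 40.0)
```

Each year at the aging temperature then stands in for roughly `af` years at service temperature, which is why a change in mechanism between the two temperatures would invalidate the extrapolation.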

  20. The R Package threg to Implement Threshold Regression Models

    Directory of Open Access Journals (Sweden)

    Tao Xiao

    2015-08-01

    This new package includes four functions: threg, and the methods hr, predict and plot for threg objects returned by threg. The threg function is the model-fitting function, used to calculate regression coefficient estimates, asymptotic standard errors and p values. The hr method for threg objects is the hazard-ratio calculation function, which provides estimates of hazard ratios at selected time points for specified scenarios (based on given categories or value settings of covariates). The predict method for threg objects is used for prediction. The plot method for threg objects provides plots of the curves of estimated hazard functions, survival functions and probability density functions of the first hitting time; function curves corresponding to different scenarios can be overlaid in the same plot for comparison to give additional research insights.
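Threshold regression of this kind is built on the first hitting time of a Wiener process: a latent health process starts at y0 > 0, drifts at rate mu, and the event occurs when the process first reaches zero, giving an inverse-Gaussian hitting-time distribution. A minimal sketch of the standard density, distribution and hazard functions follows (this is generic textbook material in Python, not code from the threg package, which is an R library):

```python
import math

def _Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fht_pdf(t, y0, mu):
    """Density of the first time a Wiener process started at y0 with
    drift mu (unit variance) hits zero."""
    return y0 / math.sqrt(2.0 * math.pi * t ** 3) * math.exp(
        -((y0 + mu * t) ** 2) / (2.0 * t))

def fht_cdf(t, y0, mu):
    """P(hitting time <= t) for the same process."""
    s = math.sqrt(t)
    return _Phi(-(y0 + mu * t) / s) + math.exp(-2.0 * y0 * mu) * _Phi(
        (mu * t - y0) / s)

def hazard(t, y0, mu):
    """Hazard function, the quantity threg's hr method works with."""
    return fht_pdf(t, y0, mu) / (1.0 - fht_cdf(t, y0, mu))
```

In threshold regression the covariates enter by linking y0 and mu to linear predictors; the package then estimates those regression coefficients.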

  1. Preservation of information in a prebiotic package model.

    Science.gov (United States)

    Silvestre, Daniel A M M; Fontanari, José F

    2007-05-01

    The coexistence between different informational molecules has been the preferred mode to circumvent the limitation posed by imperfect replication on the amount of information stored by each of these molecules. Here we reexamine a classic package model in which distinct information carriers or templates are forced to coexist within vesicles, which in turn can proliferate freely through binary division. The combined dynamics of vesicles and templates is described by a multitype branching process which allows us to write equations for the average number of the different types of vesicles as well as for their extinction probabilities. The threshold phenomenon associated with the extinction of the vesicle population is studied quantitatively using finite-size scaling techniques. We conclude that the resultant coexistence is too frail in the presence of parasites and so confinement of templates in vesicles without an explicit mechanism of cooperation does not resolve the information crisis of prebiotic evolution.
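The extinction probability of such a branching process is, in the single-type case, the smallest fixed point of the offspring probability generating function, found by iterating the PGF from zero. The offspring distribution below is illustrative (the paper's vesicle model is multitype, which requires a vector of PGFs):

```python
def extinction_probability(pgf, tol=1e-12, max_iter=10_000):
    """Smallest root of s = pgf(s) in [0, 1], found by iterating from 0."""
    s = 0.0
    for _ in range(max_iter):
        s_next = pgf(s)
        if abs(s_next - s) < tol:
            return s_next
        s = s_next
    return s

# A vesicle line that dies with probability 0.2 or divides in two with
# probability 0.8: pgf(s) = 0.2 + 0.8 s^2, smallest fixed point 0.25.
q = extinction_probability(lambda s: 0.2 + 0.8 * s ** 2)
```

The threshold phenomenon the abstract studies corresponds to the mean offspring number crossing 1: below it the only fixed point in [0, 1] is 1 and extinction is certain.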

  2. Model for Agile Software Development Performance Monitoring

    OpenAIRE

    Žabkar, Nataša

    2013-01-01

    Agile methodologies have been in use for more than ten years, and during this time they have proved to be efficient, even though empirical research is scarce, especially regarding agile software development performance monitoring. The most popular agile framework, Scrum, uses only one measure of performance: the amount of work remaining for implementation of a User Story from the Product Backlog or for implementation of a Task from the Sprint Backlog. In time the need for additional me...

  3. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process as a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Applications of the model to the inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the application of the model to a software reliability analysis.
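The NHPP framing used by MERF/EARF can be illustrated with the simplest exponential mean-value function (a Goel-Okumoto-style sketch, not the MERF formulation itself; the parameter names are illustrative):

```python
import math

def mean_failures(t, a, b):
    """Expected cumulative failures by time t for an NHPP with
    exponential mean-value function m(t) = a * (1 - exp(-b t)),
    where a is the eventual fault count and b the detection rate."""
    return a * (1.0 - math.exp(-b * t))

def reliability(x, t, a, b):
    """Probability of no failure in (t, t + x] under the NHPP:
    R(x | t) = exp(-(m(t + x) - m(t)))."""
    return math.exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))
```

Because m(t) is concave, the same future window becomes more reliable the longer testing and correction have already run.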

  4. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    Full Text Available The subject of the research is software usability, and the aim is the construction of a mathematical model for estimating and assuring a set level of usability. The methodology of structural analysis, methods of multicriterion optimization and decision-making theory, the method of convolution, and scientific methods of analysis and analogy are used in the research. The result of the executed work is a model for automated evaluation and assurance of software usability that allows one not only to estimate the current level of usability during every iteration of agile development but also to manage the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.

  5. Software engineering with process algebra: Modelling client/server architectures

    NARCIS (Netherlands)

    Diertens, B.

    2009-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. We also described this software development process more formally by presenting the

  6. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0); Part 2

    Energy Technology Data Exchange (ETDEWEB)

    Daveler, S.A.; Wolery, T.J.

    1992-12-17

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25°C only to 0-300°C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive-form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive-form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.
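The "interpolating polynomials fit to temperature grids" step can be sketched generically: fit the unique polynomial through thermodynamic values (e.g. log K) tabulated at grid temperatures, then store only its coefficients or evaluate it directly. The grid and values below are made up for illustration (here chosen to lie on a smooth quadratic), not taken from any EQ3/6 data file:

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the unique interpolating polynomial through the points
    (xs, ys) at x, using the Lagrange form."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Made-up log K values tabulated on a 0-150 °C grid.
grid_T = [0.0, 25.0, 60.0, 100.0, 150.0]
grid_logK = [14.0, 13.5125, 12.872, 12.2, 11.45]
logK_50 = lagrange_eval(grid_T, grid_logK, 50.0)
```

A preprocessor like EQPT would fit such polynomials once and write the coefficients to the binary file, so the modeling codes never re-interpolate the raw grid.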

  7. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Daveler, S.A.; Wolery, T.J.

    1992-01-01

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25 degrees C only to 0-300 degrees C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive-form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive-form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.

  8. The scientific modeling assistant: An advanced software tool for scientific model building

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  9. A Study On Traditional And Evolutionary Software Development Models

    Directory of Open Access Journals (Sweden)

    Kamran Rasheed

    2017-07-01

    Full Text Available Today computing technologies are becoming the backbone of organizations and support individual work; beyond the computing device itself, software must be added. A set of instructions, or a computer program, is known as software. Software is developed through traditional or newer, evolutionary models. Software development has become a key and successful business nowadays; without software, all hardware is useless. The collective steps performed in development are known as the software development life cycle (SDLC). There are adaptive and predictive models for developing software. Predictive models are those already well established, such as the Waterfall, Spiral, Prototype and V-shaped models, while adaptive models include agile Scrum. The methodologies of both the adaptive and the predictive families have their own procedures and steps. Predictive models are static and adaptive models are dynamic: changes cannot easily be made in a predictive process, while adaptive processes have the capability to accommodate change. The purpose of this study is to become familiar with all of these models and to discuss their uses and development steps. This discussion will be helpful in deciding which model to use in which circumstances and what development steps each model includes.

  10. Aluminum Laminates in Beverage Packaging: Models and Experiences

    Directory of Open Access Journals (Sweden)

    Gabriella Bolzon

    2015-08-01

    Full Text Available Aluminum laminates are among the main components of beverage packaging. These layered material systems are coupled to paperboard plies except in the cap opening area, where the human force limit sets a requirement on the material properties to allow openability; there the mechanical characteristics are of particular interest. Experimental investigations have been carried out on this composite and on its components by both traditional and full-field measurement techniques. The interpretation of the collected data has been supported by simulation of the performed tests, considering either a homogenized material model or the individual laminate layers. However, different results may be recovered from similar samples due to physical factors like the material processing route and embedded defects. In turn, the conclusions may vary depending on the model assumptions. This contribution focuses on the physical effects and on the modeling of the large localized deformation induced by material singularities. The topic is discussed in the light of some experimental results.

  11. Thermal expansion model for multiphase electronic packaging materials

    International Nuclear Information System (INIS)

    Allred, B.E.; Warren, W.E.

    1991-01-01

    Control of thermal expansion is often necessary in the design and selection of electronic packages. In some instances, it is desirable to have a coefficient of thermal expansion intermediate between values readily attainable with single or two phase materials. The addition of a third phase in the form of fillers, whiskers, or fibers can be used to attain intermediate expansions. To help design the thermal expansion of multiphase materials for specific applications, a closed form model has been developed that accurately predicts the effective elastic properties of isotropic filled materials and transversely isotropic lamina. Properties of filled matrix materials are used as inputs to the lamina model to obtain the composite elastic properties as a function of the volume fraction of each phase. Hybrid composites with two or more fiber types are easily handled with this model. This paper reports that results for glass, quartz, and Kevlar fibers with beta-eucryptite filled polymer matrices show good agreement with experimental results for X, Y, and Z thermal expansion coefficients
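The closed-form approach described can be illustrated with the classical Turner rule of mixtures, in which each phase's CTE is weighted by its stiffness and volume fraction. This is a generic sketch of that family of models, not the paper's specific formulation, and the property values are illustrative:

```python
def turner_cte(phases):
    """Effective CTE of an isotropic multiphase material by Turner's rule:
    alpha_eff = sum(alpha_i * K_i * v_i) / sum(K_i * v_i),
    where each phase is given as (alpha, bulk_modulus, volume_fraction)."""
    num = sum(alpha * K * v for alpha, K, v in phases)
    den = sum(K * v for alpha, K, v in phases)
    return num / den

# Illustrative three-phase mix (CTE in 1/K, modulus in GPa, fractions sum to 1):
alpha_eff = turner_cte([
    (60e-6, 4.0, 0.5),   # polymer matrix
    (5e-6, 50.0, 0.3),   # glass fiber
    (-6e-6, 80.0, 0.2),  # beta-eucryptite-like negative-CTE filler
])
```

The stiff negative-CTE filler pulls the composite expansion far below the matrix value, which is exactly how intermediate expansions are designed into electronic packages.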

  12. Thermo-mechanical model optimization of HB-LED packaging

    NARCIS (Netherlands)

    Yuan, C.A.; Erinc, M.; Gielen, A.W.J.; Waal, A. van der; Driel, W. van; Zhang, K.

    2011-01-01

    Lighting is an advancing phenomenon both on the technology and on the market level due to the rapid development of the solid state lighting technology. The efforts in improving the efficacy of high brightness LED's (HB-LED) have concentrated on the packaging architecture. Packaging plays a

  13. An evaluation of the psychometric properties of the Purdue Pharmacist Directive Guidance Scale using SPSS and R software packages.

    Science.gov (United States)

    Marr-Lyon, Lisa R; Gupchup, Gireesh V; Anderson, Joe R

    2012-01-01

    The Purdue Pharmacist Directive Guidance (PPDG) Scale was developed to assess patients' perceptions of the level of pharmacist-provided (1) instruction and (2) feedback and goal-setting, two aspects of pharmaceutical care. Calculations of its psychometric properties stemming from SPSS and R were similar, but distinct differences were apparent. Using the SPSS and R software packages, researchers aimed to examine the construct validity of the PPDG using a higher-order factoring procedure; in tandem, McDonald's omega and Cronbach's alpha were calculated as means of reliability analysis. Ninety-nine patients with either type 1 or type 2 diabetes, aged 18 years or older, able to read and write English, and who could provide written informed consent participated in the study. Data were collected in 8 community pharmacies in New Mexico. Using R, (1) a principal axis factor analysis with promax (oblique) rotation was conducted, (2) a Schmid-Leiman transformation was attained, and (3) McDonald's omega and Cronbach's alpha were computed. Using SPSS, subscale findings were validated by conducting a principal axis factor analysis with promax rotation; strict parallels and Cronbach's alpha reliabilities were calculated. McDonald's omega and Cronbach's alpha were robust, with coefficients greater than 0.90; principal axis factor analysis with promax rotation revealed construct similarities, with an overall general factor emerging from R. Subjecting the PPDG to further rigorous psychometric testing revealed stronger quantitative support for the overall general factor of directive guidance and the subscales of instruction and feedback and goal-setting. Copyright © 2012 Elsevier Inc. All rights reserved.
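One of the reliability statistics named here, Cronbach's alpha, is simple enough to sketch directly. The toy scores below are invented and have nothing to do with the PPDG data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item score columns of equal length:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(col) for col in items)
    return (k / (k - 1)) * (1.0 - item_var / pvariance(totals))

# Invented 4-respondent, 3-item example.
alpha = cronbach_alpha([
    [4, 3, 5, 2],
    [4, 2, 5, 3],
    [5, 3, 4, 2],
])
```

McDonald's omega, the abstract's other coefficient, instead weights items by their factor loadings, which is why the two can diverge when items are not equally related to the underlying factor.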

  14. Model analysis of fresh fuel cask with ANSYS 10.0 software

    International Nuclear Information System (INIS)

    Seyed Aboulfazl Azimfar; Arash Kazemi

    2009-01-01

    The fresh fuel for BNPP-1 is due to be transported in special containers, which must be designed to withstand stresses and impacts in order to protect the fuel from any possible damage. A static analysis calculates the effects of steady loading conditions on a structure, while ignoring inertia and damping effects, such as those caused by time-varying loads. A static analysis can, however, include steady inertia loads (such as gravity and rotational velocity), and time-varying loads that can be approximated as static equivalent loads. In this paper the computer model of the PCS was developed to estimate the safety of the package in structural static analysis, as well as the structural strength of one single package or several combined packages to be transported by automobile, rail and air. The safety factor, stresses and strains were calculated with the ANSYS software and compared with Russian standards. (Author)

  15. FALCON: a software package for analysis of nestedness in bipartite networks [v1; ref status: indexed, http://f1000r.es/3z8]

    Directory of Open Access Journals (Sweden)

    Stephen J. Beckett

    2014-08-01

    Full Text Available Nestedness is a statistical measure used to interpret bipartite interaction data in several ecological and evolutionary contexts, e.g. biogeography (species-site relationships) and species interactions (plant-pollinator and host-parasite networks). Multiple methods have been used to evaluate nestedness, which differ in how the metrics for nestedness are determined. Furthermore, several different null models have been used to calculate the statistical significance of nestedness scores. The profusion of measures and null models, many of which give conflicting results, is problematic for comparison of nestedness across different studies. We developed the FALCON software package to allow easy and efficient comparison of nestedness scores and statistical significances for a given input network, using a selection of the more popular measures and null models from the current literature. FALCON currently includes six measures and five null models for nestedness in binary networks, and two measures and four null models for nestedness in weighted networks. The FALCON software is designed to be efficient and easy to use. FALCON code is offered in three languages (R, MATLAB, Octave) and is designed to be modular and extensible, enabling users to easily expand its functionality by adding further measures and null models. FALCON provides a robust methodology for comparing the strength and significance of nestedness in a given bipartite network using multiple measures and null models. It includes an “adaptive ensemble” method to reduce undersampling of the null distribution when calculating statistical significance. It can work with binary or weighted input networks. FALCON is a response to the proliferation of different nestedness measures and associated null models in the literature. It allows easy and efficient calculation of nestedness scores and statistical significances using different methods, enabling comparison of results from
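One widely used nestedness metric, NODF, is compact enough to sketch. This is a generic implementation of the published definition, assuming rows and columns are already ordered by decreasing fill as is conventional; it is not FALCON code (FALCON is offered in R, MATLAB and Octave):

```python
def nodf(matrix):
    """NODF nestedness (0-100) of a binary presence-absence matrix.
    A row or column pair contributes only when the upper/left line is
    strictly fuller, scoring the percentage of the sparser line's 1s
    that the fuller line shares."""
    def pair_scores(lines):
        scores = []
        for i in range(len(lines)):
            for j in range(i + 1, len(lines)):
                fill_i, fill_j = sum(lines[i]), sum(lines[j])
                if fill_j == 0 or fill_i <= fill_j:
                    scores.append(0.0)
                else:
                    shared = sum(a and b for a, b in zip(lines[i], lines[j]))
                    scores.append(100.0 * shared / fill_j)
        return scores

    rows = [list(r) for r in matrix]
    cols = [list(c) for c in zip(*matrix)]
    all_scores = pair_scores(rows) + pair_scores(cols)
    return sum(all_scores) / len(all_scores)
```

Significance testing, the part FALCON standardizes, then compares such a score against the same metric computed on networks drawn from a null model.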

  16. Transformation of UML Behavioral Diagrams to Support Software Model Checking

    Directory of Open Access Journals (Sweden)

    Luciana Brasil Rebelo dos Santos

    2014-04-01

    Full Text Available Unified Modeling Language (UML) is currently accepted as the standard for modeling (object-oriented) software, and its use is increasing in the aerospace industry. Verification and validation of complex software developed according to UML is not trivial due to the complexity of the software itself and the several different UML models/diagrams that can be used to model the behavior and structure of the software. This paper presents an approach to transform up to three different UML behavioral diagrams (sequence, behavioral state machines, and activity) into a single transition system to support model checking of software developed in accordance with UML. In our approach, properties are formalized based on use case descriptions. The transformation targets the NuSMV model checker, but we see the possibility of using other model checkers, such as SPIN. The main contribution of our work is the transformation of a non-formal language (UML) to a formal language (the language of the NuSMV model checker), towards a greater adoption in practice of formal methods in software development.
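At its core, the transition-system target of such a transformation supports exhaustive state exploration. A toy safety check over an explicit transition relation gives the flavor of what a model checker automates (this is an illustrative sketch, not the paper's UML-to-NuSMV translation; real checkers like NuSMV work symbolically on much larger state spaces):

```python
from collections import deque

def reachable_states(transitions, init):
    """Breadth-first exploration of an explicit transition system,
    given as a dict mapping each state to its successor states."""
    seen = {init}
    frontier = deque([init])
    while frontier:
        state = frontier.popleft()
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

def check_invariant(transitions, init, invariant):
    """Exhaustively verify a safety property: the invariant must hold
    in every reachable state."""
    return all(invariant(s) for s in reachable_states(transitions, init))

# Toy machine written down by hand from a small state diagram.
machine = {"idle": ["busy"], "busy": ["idle", "error"], "error": []}
safe = check_invariant(machine, "idle", lambda s: s != "crashed")
unsafe = check_invariant(machine, "idle", lambda s: s != "error")
```

The paper's transformation produces such a transition system automatically from the three UML diagrams, with the properties to check derived from use case descriptions.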

  17. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    Science.gov (United States)

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. © Society
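The "128 possible combinations of positivity/negativity" follow directly from the 2^7 Boolean outcomes per cell for 7 parameters. Enumerating and counting them is straightforward; the marker names and thresholds below are purely illustrative, not the panel used in the study:

```python
from collections import Counter
from itertools import product

MARKERS = ["M1", "M2", "M3", "M4", "M5", "M6", "M7"]  # 7 functional readouts

# Every possible +/- phenotype for 7 markers: 2**7 = 128 combinations.
ALL_COMBOS = list(product((False, True), repeat=len(MARKERS)))

def phenotype(cell, thresholds):
    """Classify one cell's fluorescence values into a +/- combination."""
    return tuple(cell[m] > thresholds[m] for m in MARKERS)

def polyfunctionality_table(cells, thresholds):
    """Count cells per phenotype, the per-cell tabulation that tools
    like Pestle/SPICE then summarize and visualize."""
    return Counter(phenotype(cell, thresholds) for cell in cells)
```

The analytical difficulty the abstract describes is not this tabulation but gating, dimensionality reduction and readable presentation of the resulting 128-way table, which is where the compared software packages differ.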

  18. ITS Version 3.0: Powerful, user-friendly software for radiation modelling

    International Nuclear Information System (INIS)

    Kensek, R.P.; Halbleib, J.A.; Valdez, G.D.

    1993-01-01

    ITS (the Integrated Tiger Series) is a powerful, but user-friendly, software package permitting state-of-the-art modelling of electron and/or photon radiation effects. The programs provide Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields. The ITS system combines operational simplicity and physical accuracy in order to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems
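    At its core, Monte Carlo photon transport of the kind ITS performs samples free-path lengths from an exponential distribution. A minimal, hedged sketch of uncollided transmission through a slab (the attenuation coefficient and thickness are invented values, and ITS's actual physics, such as coupled electron transport and scattering, is omitted):

```python
import math
import random

random.seed(42)

mu = 0.5     # total attenuation coefficient, 1/cm (illustrative value)
d = 2.0      # slab thickness, cm (illustrative value)
n = 100_000  # photon histories

# A photon's free path follows an exponential law: s = -ln(U)/mu.
# Count photons whose first interaction lies beyond the slab,
# i.e. uncollided transmission (no scattering is modeled here).
transmitted = sum(
    1 for _ in range(n) if -math.log(1.0 - random.random()) / mu > d
)

mc_estimate = transmitted / n
analytic = math.exp(-mu * d)  # Beer-Lambert attenuation: exp(-mu*d)
print(mc_estimate, analytic)
```

    With 100,000 histories the Monte Carlo estimate agrees with the analytic attenuation to a few tenths of a percent, which is the basic convergence behavior a full transport code builds on.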

  19. Study of the nonlinear imperfect software debugging model

    International Nuclear Information System (INIS)

    Wang, Jinyong; Wu, Zhibo

    2016-01-01

    In recent years there has been a dramatic proliferation of research on imperfect software debugging phenomena. Software debugging is a complex process and is affected by a variety of factors, including the environment, resources, personnel skills, and personnel psychologies. Therefore, the simple assumption that debugging is perfect is inconsistent with the actual software debugging process, wherein a new fault can be introduced when removing a fault. Furthermore, the fault introduction process is nonlinear, and the cumulative number of nonlinearly introduced faults increases over time. Thus, this paper proposes a nonlinear, NHPP imperfect software debugging model in consideration of the fact that fault introduction is a nonlinear process. The fitting and predictive power of the NHPP-based proposed model are validated through related experiments. Experimental results show that this model displays better fitting and predicting performance than the traditional NHPP-based perfect and imperfect software debugging models. S-confidence bounds are set to analyze the performance of the proposed model. This study also examines and discusses optimal software release-time policy comprehensively. In addition, this research on the nonlinear process of fault introduction is significant given the recent surge of studies on software-intensive products, such as cloud computing and big data. - Highlights: • Fault introduction is a nonlinear changing process during the debugging phase. • The assumption that the process of fault introduction is nonlinear is credible. • Our proposed model can better fit and accurately predict software failure behavior. • Research on fault introduction case is significant to software-intensive products.
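    For context, the classic perfect-debugging NHPP baseline that such models extend is the Goel-Okumoto mean value function m(t) = a(1 - e^{-bt}). The sketch below compares it with a toy imperfect-debugging variant in which the fault content grows over time; the linear growth term is our own illustrative assumption, not the paper's nonlinear model:

```python
import math

def m_perfect(t, a=100.0, b=0.1):
    """Goel-Okumoto NHPP mean value function: expected faults detected by time t,
    assuming debugging never introduces new faults."""
    return a * (1.0 - math.exp(-b * t))

def m_imperfect(t, a=100.0, b=0.1, alpha=0.02):
    """Toy imperfect-debugging variant: the total fault content grows as
    a*(1 + alpha*t) because debugging introduces new faults.
    This linear growth is purely illustrative, not the paper's model."""
    return a * (1.0 + alpha * t) * (1.0 - math.exp(-b * t))

for t in (0, 10, 50, 100):
    print(t, round(m_perfect(t), 1), round(m_imperfect(t), 1))
```

    The imperfect curve keeps rising above the perfect one, which is the qualitative behavior an imperfect-debugging SRGM must capture.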

  20. A suite of R packages for web-enabled modeling and analysis of surface waters

    Science.gov (United States)

    Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.

    2014-12-01

    Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.

  1. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a "living document" that will be modified over the course of the execution of this work

  2. A Reference Model for Mobile Social Software for Learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2007-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model for mobile social software for learning. International Journal of Continuing Engineering Education and Life-Long Learning, 18(1), 118-138.

  3. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  4. Model-Driven Software Evolution : A Research Agenda

    NARCIS (Netherlands)

    Van Deursen, A.; Visser, E.; Warmer, J.

    2007-01-01

    Software systems need to evolve, and systems built using model-driven approaches are no exception. What complicates model-driven engineering is that it requires multiple dimensions of evolution. In regular evolution, the modeling language is used to make the changes. In meta-model evolution, changes

  5. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  6. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr
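    The book's central idea, behavior captured as explicit states and transitions rather than scattered conditionals, can be sketched as a table-driven machine (the states and events below are invented for illustration):

```python
# Table-driven finite state machine: behavior lives in a transition table,
# not in nested if/else logic. States and events are illustrative only.
TRANSITIONS = {
    ("idle", "start"):    "running",
    ("running", "pause"): "paused",
    ("paused", "start"):  "running",
    ("running", "stop"):  "idle",
    ("paused", "stop"):   "idle",
}

def step(state, event):
    """Return the next state; events undefined for a state leave it unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["start", "pause", "start", "stop"]:
    state = step(state, event)
print(state)  # idle
```

    Because the table is data, the same machine can be reviewed against a specification, drawn as a diagram, or exhaustively tested, which is the executable-specification benefit the book argues for.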

  7. A CASE STUDY OF MODELING A TORUS IN DIFFERENT MODELING SOFTWARES

    OpenAIRE

    VELJKOVIĆ Milica; KRASIĆ Sonja; PEJIĆ Petar; TOŠIĆ Zlata

    2017-01-01

    Modeling complex geometric shapes requires the use of appropriate software. This study analyzes the process of modeling with two different computer programs, AutoCAD and Rhinoceros. The aim is to demonstrate the similarities and differences between these programs when used for modeling a torus, a doubly curved geometric surface. The two modeling processes are compared in order to investigate the potential of these programs in the modeling of an architectural structure comprising a s...

  8. FDSTools: A software package for analysis of massively parallel sequencing data with the ability to recognise and correct STR stutter and other PCR or sequencing noise.

    Science.gov (United States)

    Hoogenboom, Jerry; van der Gaag, Kristiaan J; de Leeuw, Rick H; Sijen, Titia; de Knijff, Peter; Laros, Jeroen F J

    2017-03-01

    Massively parallel sequencing (MPS) is on the advent of a broad scale application in forensic research and casework. The improved capabilities to analyse evidentiary traces representing unbalanced mixtures is often mentioned as one of the major advantages of this technique. However, most of the available software packages that analyse forensic short tandem repeat (STR) sequencing data are not well suited for high throughput analysis of such mixed traces. The largest challenge is the presence of stutter artefacts in STR amplifications, which are not readily discerned from minor contributions. FDSTools is an open-source software solution developed for this purpose. The level of stutter formation is influenced by various aspects of the sequence, such as the length of the longest uninterrupted stretch occurring in an STR. When MPS is used, STRs are evaluated as sequence variants that each have particular stutter characteristics which can be precisely determined. FDSTools uses a database of reference samples to determine stutter and other systemic PCR or sequencing artefacts for each individual allele. In addition, stutter models are created for each repeating element in order to predict stutter artefacts for alleles that are not included in the reference set. This information is subsequently used to recognise and compensate for the noise in a sequence profile. The result is a better representation of the true composition of a sample. Using Promega Powerseq™ Auto System data from 450 reference samples and 31 two-person mixtures, we show that the FDSTools correction module decreases stutter ratios above 20% to below 3%. Consequently, much lower levels of contributions in the mixed traces are detected. FDSTools contains modules to visualise the data in an interactive format allowing users to filter data with their own preferred thresholds. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
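    The core correction idea, subtracting the stutter expected from a parent allele, can be sketched as follows. This is a deliberate simplification of our own: FDSTools fits per-allele stutter characteristics from a reference database rather than using a single fixed ratio, and the counts below are invented:

```python
# Simplified n-1 stutter correction for one STR marker (illustrative only).
# Read counts are keyed by allele repeat number; values are made up.
observed = {12: 95, 13: 1000, 14: 20}
minus_one_ratio = 0.09  # assumed expected n-1 stutter ratio (hypothetical)

corrected = {}
for allele, reads in observed.items():
    # n-1 stutter products of allele k+1 appear at position k.
    parent = observed.get(allele + 1, 0)
    corrected[allele] = max(0.0, reads - minus_one_ratio * parent)

print(corrected)  # allele 12 drops from 95 to 5: mostly stutter from allele 13
```

    After subtraction, what remains at allele 12 is small enough to be judged on its own merits, which is how stutter correction exposes genuine minor contributions in a mixture.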

  9. A New Software Quality Model for Evaluating COTS Components

    OpenAIRE

    Adnan Rawashdeh; Bassem Matalkah

    2006-01-01

    Studies show that COTS-based (commercial off-the-shelf) systems built in recent years exceed 40% of all developed software systems. Therefore, a model that ensures the quality characteristics of such systems becomes a necessity. Among the most critical processes in COTS-based systems are the evaluation and selection of the COTS components. There are several existing quality models used to evaluate software systems in general; however, none of them is dedicated to COTS-based s...

  10. Staying in the Light: Evaluating Sustainability Models for Brokering Software

    Science.gov (United States)

    Powers, L. A.; Benedict, K. K.; Best, M.; Fyfe, S.; Jacobs, C. A.; Michener, W. K.; Pearlman, J.; Turner, A.; Nativi, S.

    2015-12-01

    The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and obstacles, and policy and legal considerations. The issue of sustainability is not unique to brokering software and these models may be relevant to many applications. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis while recognizing that all software is part of an evolutionary process and has a lifespan.

  11. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

    Full Text Available In response to today's growing challenges in the software industry, a wide spectrum of new software development approaches has emerged. One prominent direction is the currently most promising software development paradigm, Model Driven Development (MDD). Despite much skepticism and many problems, the MDD paradigm is being used and improved to realize its inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal (heavyweight) and agile (lightweight). But when it comes to MDD and a development process for MDD, currently known methodologies are very poor or, better said, offer no description of an MDD process. As a result of this research, in this paper the author examines the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  12. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    Science.gov (United States)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motivation, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  13. PERFORMANCE EVALUATION OF 3D MODELING SOFTWARE FOR UAV PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    H. Yanagi

    2016-06-01

    Full Text Available UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motivation, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  14. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
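    One of the figures of merit listed above can be illustrated from first principles: NETD relates sensor noise to how fast scene radiance changes with temperature. A hedged sketch using Planck's law (the noise-equivalent radiance value is invented, and this is a conceptual illustration, not ATTIRE's actual formulation):

```python
import math

H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s
K = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W * m^-2 * sr^-1 * m^-1 (Planck's law)."""
    num = 2.0 * H * C**2 / wavelength_m**5
    return num / (math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0)

wl = 10e-6  # 10 um, thermal infrared band
T = 300.0   # scene temperature, K

# Radiance contrast dL/dT by central difference (change in radiance per kelvin).
dT = 0.01
dL_dT = (planck_radiance(wl, T + dT) - planck_radiance(wl, T - dT)) / (2 * dT)

# NETD = noise-equivalent radiance / radiance contrast. The NER below is an
# assumed, illustrative sensor noise level, not a value from ATTIRE.
ner = 1.0e3  # W * m^-2 * sr^-1 * m^-1 (hypothetical)
netd_k = ner / dL_dT
print(planck_radiance(wl, T) * 1e-6, netd_k)  # radiance per um, NETD in kelvin
```

    The same pattern, analyze each subsystem's noise contribution and divide by the scene's radiance contrast, is how sensor-level NETD figures are assembled from subsystem parameters.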

  15. Biochemical Network Stochastic Simulator (BioNetS): software for stochastic modeling of biochemical networks

    Directory of Open Access Journals (Sweden)

    Elston Timothy C

    2004-03-01

    Full Text Available Abstract Background Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. Results We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS also can be run as a stand alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. Conclusions We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
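    The Gillespie algorithm mentioned above draws the waiting time to the next reaction from an exponential distribution with rate equal to the total propensity, then picks which reaction fires in proportion to its propensity. A minimal birth-death sketch for a single species (the rate constants are illustrative, and BioNetS features such as hybrid continuous/discrete models are omitted):

```python
import math
import random

random.seed(1)

k_make = 10.0  # production rate, molecules per unit time (illustrative)
k_deg = 0.1    # per-molecule degradation rate (illustrative)

t, n, t_end = 0.0, 0, 200.0
while t < t_end:
    a1 = k_make      # propensity of production
    a2 = k_deg * n   # propensity of degradation
    a0 = a1 + a2
    # Waiting time to the next reaction is exponential with rate a0.
    t += -math.log(1.0 - random.random()) / a0
    # Choose which reaction fires, weighted by propensity.
    if random.random() * a0 < a1:
        n += 1
    else:
        n -= 1

print(n)  # fluctuates around the deterministic steady state k_make/k_deg = 100
```

    Repeating this over many trajectories yields the fluctuation statistics that deterministic rate equations cannot capture, which is exactly the regime BioNetS targets.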

  16. Logistic random effects regression models: a comparison of statistical packages for binary and ordinal outcomes.

    Science.gov (United States)

    Li, Baoyue; Lingsma, Hester F; Steyerberg, Ewout W; Lesaffre, Emmanuel

    2011-05-23

    Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized as well as ordinal, with center and/or trial as random effects, and as covariates age, motor score, pupil reactivity or trial. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), R package MCMCglmm and SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using basically two logistic random effects models with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study and when based on a relatively large number of level-1 (patient level) data compared to the number of level-2 (hospital level) data. However, when based on a relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. 
There are also differences in the availability of additional tools for model
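    The core model compared across these packages is, for the dichotomized outcome, a random-intercept logistic regression, which can be written as:

```latex
\operatorname{logit} P(y_{ij} = 1 \mid u_j) = \mathbf{x}_{ij}^{\top} \boldsymbol{\beta} + u_j,
\qquad u_j \sim N(0, \sigma_u^2)
```

    Here $y_{ij}$ is the outcome of patient $i$ in center $j$, $\mathbf{x}_{ij}$ collects the covariates (age, motor score, pupil reactivity, trial), and $u_j$ is the random center effect; the two-random-effects variant adds an analogous trial term, and the ordinal analysis replaces the logit with a proportional odds specification.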

  17. Logistic random effects regression models: a comparison of statistical packages for binary and ordinal outcomes

    Directory of Open Access Journals (Sweden)

    Steyerberg Ewout W

    2011-05-01

    Full Text Available Abstract Background Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. Methods We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized as well as ordinal, with center and/or trial as random effects, and as covariates age, motor score, pupil reactivity or trial. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), R package MCMCglmm and SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using basically two logistic random effects models with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. Results The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study and when based on a relatively large number of level-1 (patient level) data compared to the number of level-2 (hospital level) data. However, when based on a relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. 
There are also differences in

  18. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling, to the data collection and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture, derived from the requirements, which captures both functional and nonfunctional requirements and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information such as results of developers' testing, historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified
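    Propagating failure-mode probabilities to the system level through fault trees reduces, for independent basic events, to simple gate algebra; a minimal sketch (the probability values are invented, and FASRE's Bayesian quantification is not modeled here):

```python
# Fault-tree gate algebra for independent basic events (illustrative values).
def or_gate(probs):
    """P(at least one event occurs) = 1 - product of survival probabilities."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur) = product of the event probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Failure modes of one function feed an OR gate up toward the system level.
function_failure = or_gate([0.1, 0.2])  # 1 - 0.9 * 0.8 = 0.28
print(function_failure)
```

    Nesting such gates mirrors how per-failure-mode probabilities roll up to function-level and then system-level assessments in an architectural model.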

  19. Aspect-Oriented Model-Driven Software Product Line Engineering

    Science.gov (United States)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed of aspect-oriented composition techniques on model level. Model transformations support the transition from problem to solution space models. Aspect-oriented techniques enable the explicit expression and modularization of variability on model, template, and code level. The presented concepts are illustrated with a case study of a home automation system.

  20. Business Model Exploration for Software Defined Networks

    NARCIS (Netherlands)

    Xu, Yudi; Jansen, Slinger; España, Sergio; Zhang, Dong; Gao, Xuesong

    2017-01-01

    Business modeling is becoming a foundational process in the information technology industry. Many ICT companies are constructing their business models to stay competitive on the cutting edge of the technology world. However, when comes to new technologies or emerging markets, it remains difficult

  1. Capabilities and accuracy of energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-11-01

    Full Text Available Energy modelling can be used in a number of different ways to fulfill different needs, including certification within building regulations or green building rating tools. Energy modelling can also be used in order to try and predict what the energy...

  2. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injuries to people, equipment, or environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  3. The STAMP Software for State Space Models

    Directory of Open Access Journals (Sweden)

    Roy Mendelssohn

    2011-05-01

    Full Text Available This paper reviews the use of STAMP (Structural Time Series Analyser, Modeler and Predictor) for modeling time series data using state-space methods with unobserved components. STAMP is a commercial, GUI-based program that runs on Windows, Linux and Macintosh computers as part of the larger OxMetrics System. STAMP can estimate a wide variety of both univariate and multivariate state-space models, provides a wide array of diagnostics, and has a batch mode capability. The use of STAMP is illustrated for the Nile river data which is analyzed throughout this issue, as well as by modeling a variety of oceanographic and climate related data sets. The analyses of the oceanographic and climate data illustrate the breadth of models available in STAMP, and that state-space methods produce results that provide new insights into important scientific problems.
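    The simplest unobserved-components model STAMP estimates, and the standard illustration for the Nile river data, is the local level model:

```latex
\begin{aligned}
y_t &= \mu_t + \varepsilon_t, & \varepsilon_t &\sim N(0, \sigma_\varepsilon^2),\\
\mu_{t+1} &= \mu_t + \eta_t, & \eta_t &\sim N(0, \sigma_\eta^2)
\end{aligned}
```

    The observed series $y_t$ is decomposed into a random-walk level $\mu_t$ plus observation noise; richer STAMP specifications add slope, seasonal, and cycle components to the state.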

  4. Building a Flexible Software Factory Using Partial Domain Specific Models

    NARCIS (Netherlands)

    Warmer, J.B.; Kleppe, A.G.

    2006-01-01

    This paper describes some experiences in building a software factory by defining multiple small domain specific languages (DSLs) and having multiple small models per DSL. This is in sharp contrast to traditional approaches using monolithic models, e.g. written in UML. In our approach, models behave

  5. Introduction to Financial Projection Models. Business Management Instructional Software.

    Science.gov (United States)

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  6. gamboostLSS: An R Package for Model Building and Variable Selection in the GAMLSS Framework

    OpenAIRE

    Hofner, Benjamin; Mayr, Andreas; Schmid, Matthias

    2014-01-01

    Generalized additive models for location, scale and shape are a flexible class of regression models that allow multiple parameters of a distribution function, such as the mean and the standard deviation, to be modeled simultaneously. With the R package gamboostLSS, we provide a boosting method to fit these models. Variable selection and model choice are naturally available within this regularized regression framework. To introduce and illustrate the R package gamboostLSS and its infrastructure, we...
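    In the GAMLSS class that gamboostLSS fits by boosting, each distribution parameter receives its own additive predictor through a parameter-specific link function:

```latex
g_k(\theta_k) = \eta_k = \beta_{0k} + \sum_{j=1}^{p} f_{jk}(x_j), \qquad k = 1, \dots, K
```

    Here the $\theta_k$ are distribution parameters such as the mean and standard deviation, $g_k$ is the link for parameter $k$, and the $f_{jk}$ are (possibly smooth) covariate effects; boosting performs variable selection on the $f_{jk}$ for each predictor separately.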

  7. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2014-01-01

    Highly recommended by JASA, Technometrics, and other journals, the first edition of this bestseller showed how to easily perform complex linear mixed model (LMM) analyses via a variety of software programs. Linear Mixed Models: A Practical Guide Using Statistical Software, Second Edition continues to lead readers step by step through the process of fitting LMMs. This second edition covers additional topics on the application of LMMs that are valuable for data analysts in all fields. It also updates the case studies using the latest versions of the software procedures and provides up-to-date information on the options and features of the software procedures available for fitting LMMs in SAS, SPSS, Stata, R/S-plus, and HLM.New to the Second Edition A new chapter on models with crossed random effects that uses a case study to illustrate software procedures capable of fitting these models Power analysis methods for longitudinal and clustered study designs, including software options for power analyses and suggest...

  8. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Several software reliability growth models (SRGMs) have been developed to track and measure the growth of reliability. When the software system is large and many faults are detected during the testing phase, the change in the number of faults detected and removed in each debugging becomes small compared with the initial fault content at the beginning of the testing phase. In such a situation, the software fault detection process can be modeled as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of stochastic differential equations, performs better than the existing NHPP-based models.
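    An SDE of the general form described above can be simulated with the Euler-Maruyama scheme. The sketch below is not Kapur et al.'s exact formulation; it is a generic fault-detection SDE, dN = b(t)(a - N) dt + sigma (a - N) dW, with a logistic detection-rate function b(t) and illustrative parameters.

```python
import numpy as np

def simulate_sde_srgm(a=500.0, b=0.12, c=10.0, sigma=0.03,
                      t_end=60.0, dt=0.01, seed=0):
    """Euler-Maruyama simulation of a fault-detection SDE:
       dN = b(t) * (a - N) dt + sigma * (a - N) dW,
    with a logistic detection-rate function b(t)."""
    rng = np.random.default_rng(seed)
    steps = int(t_end / dt)
    t = np.linspace(0.0, t_end, steps + 1)
    n = np.zeros(steps + 1)
    for i in range(steps):
        bt = b / (1.0 + c * np.exp(-b * t[i]))      # logistic detection rate
        dw = rng.normal(0.0, np.sqrt(dt))           # Wiener increment
        n[i + 1] = n[i] + bt * (a - n[i]) * dt + sigma * (a - n[i]) * dw
        n[i + 1] = min(max(n[i + 1], 0.0), a)       # keep within [0, a]
    return t, n

t, n = simulate_sde_srgm()
print(round(n[-1], 1))   # cumulative detected faults approach a = 500
```

The drift term drives the cumulative fault count toward the total fault content a, while the diffusion term adds the sample-to-sample irregularity that motivates the stochastic formulation.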

  9. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps to improve human work by reducing error rates, risk factors of the working environment and workplace injuries, and by eliminating emerging occupational diseases. These tools belong to the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the established problems.

  10. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    International Nuclear Information System (INIS)

    Ilander, T.; Kansanaho, A.; Toivonen, H.

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes independent positioning data easily available. The present task on the Finnish Support Programme was launched to create software to merge information about sampling and positioning. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit, utilizing maps that can be purchased or produced by the user. In addition, the system can easily be enlarged to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.)
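    The positioning side of such a system ultimately reduces to reading NMEA 0183 sentences from the GPS receiver. A minimal sketch of extracting a fix from a GGA sentence follows; the sentence itself is synthetic, for illustration only.

```python
def parse_gga(sentence):
    """Parse latitude/longitude from an NMEA 0183 GGA sentence
    (the position-fix sentence emitted by most GPS receivers)."""
    fields = sentence.split(',')
    if not fields[0].endswith('GGA'):
        raise ValueError('not a GGA sentence')

    def to_degrees(value, hemisphere):
        # NMEA encodes angles as ddmm.mmmm (lat) / dddmm.mmmm (lon)
        dot = value.index('.')
        degrees = float(value[:dot - 2])
        minutes = float(value[dot - 2:])
        result = degrees + minutes / 60.0
        return -result if hemisphere in ('S', 'W') else result

    lat = to_degrees(fields[2], fields[3])
    lon = to_degrees(fields[4], fields[5])
    return lat, lon

# Synthetic fix near Helsinki, for illustration
lat, lon = parse_gga('$GPGGA,120044,6010.123,N,02458.456,E,1,08,0.9,25.0,M,,,,*47')
print(round(lat, 4), round(lon, 4))   # → 60.1687 24.9743
```

Each parsed fix would then be stored alongside the sample record and plotted on the desktop map.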

  11. An evaluation of the documented requirements of the SSP UIL and a review of commercial software packages for the development and testing of UIL prototypes

    Science.gov (United States)

    Gill, Esther Naomi

    1986-01-01

    A review was conducted of software packages currently on the market that might be integrated with the interface language and aid in reaching the objectives of customization, standardization, transparency, reliability, maintainability, language substitution, expandability, portability, and flexibility. Recommendations are given for the best choices in hardware and software acquisition for in-house testing of these possible integrations. Recommended acquisitions include tools to aid expert-system development and/or novice program development, artificial-intelligence voice technology, touch screen, joystick or mouse input, and networking. Other recommendations concerned using the language Ada for the user interface language shell, because of its high level of standardization, its structure, its ability to accept and execute programs written in other programming languages, and its DOD ownership and control, and keeping the user interface language simple so that many users will find the commercialization of space within their realm of possibility, which is, after all, the purpose of the Space Station.

  12. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts...... of the system. In model-based software development, where design models are used to develop a software system, outcomes of many design decisions have big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent......, or by ignoring the causes. This substitutes manual reviews to some extent. The concepts, implemented in a tool, have been validated with design patterns, refactorings, and domain level tests that comprise a replay of a real project. This proves the applicability of the solution to realistic examples...

  13. Model 9975 Life Extension Package 1 - Final Report

    International Nuclear Information System (INIS)

    Daugherty, W.

    2011-01-01

    Life extension package LE1 (9975-03382) was instrumented and subjected to a temperature/humidity environment that bounds KAMS package storage conditions for 92 weeks. During this time, the maximum fiberboard temperature was ∼180 °F, established by a combination of internal heat (12 watts) and external heat (∼142 °F). The relative humidity external to the package was maintained at 80% RH. This package was removed from test in November 2010 after several degraded conditions were observed during a periodic examination. These conditions included degraded fiberboard (easily broken, bottom layer stuck to the drum), corrosion of the drum, and separation of the air shield from the upper fiberboard assembly. Several tests and parameters were used to characterize the package components. Results from these tests generally indicate agreement between this full-scale shipping package and small-scale laboratory tests on fiberboard and O-ring samples. These areas of agreement include the rate of fiberboard weight loss, change in fiberboard thermal conductivity, fiberboard compression strength, and O-ring compression set. In addition, this package provides an example of the extent to which moisture within the fiberboard can redistribute in the presence of a temperature gradient such as might be created by a 12 watt internal heat load. Much of the moisture near the fiberboard ID surface migrated towards the OD surface, but there was not a significant axial moisture gradient during most of the test duration. Only during the last inspection period (i.e., after 92 weeks of exposure during the second phase) did enough moisture migrate to the bottom fiberboard layers to cause saturation. A side effect of moisture migration is the leaching of soluble compounds from the fiberboard. In particular, the corrosion observed on the drum appears related primarily to the leaching and concentration of chlorides. In most locations, this attack appears to be general corrosion, with shallow

  14. Presenting results of software model checker via debugging interface

    OpenAIRE

    Kohan, Tomáš

    2012-01-01

    Title: Presenting results of software model checker via debugging interface. Author: Tomáš Kohan. Department: Department of Software Engineering. Supervisor of the master thesis: RNDr. Ondřej Šerý, Ph.D., Department of Distributed and Dependable Systems. Abstract: This thesis is devoted to the design and implementation of a new debugging interface for the Java PathFinder application. The Eclipse development environment was selected as a suitable interface container. The created interface should vis...

  15. Analysis of Ecodesign Implementation and Solutions for Packaging Waste System by Using System Dynamics Modeling

    Science.gov (United States)

    Berzina, Alise; Dace, Elina; Bazbauers, Gatis

    2010-01-01

    This paper discusses the findings of a research project that explored the packaging waste management system in Latvia. The paper focuses on identifying how policy mechanisms can promote ecodesign implementation and material efficiency improvement, and thereby reduce the rate of packaging waste accumulation in landfill. The method used for analyzing the packaging waste management policies is system dynamics modeling. The main conclusion is that the existing legislative instruments can be used to create an effective policy for ecodesign implementation, but substantially higher tax rates on packaging materials and waste disposal than the existing ones have to be applied.
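    The qualitative mechanism can be illustrated with a minimal stock-and-flow sketch in the system dynamics style: a tax raises ecodesign adoption, which reduces the waste flow into the landfill stock. All parameters below are invented for illustration, not calibrated to the Latvian system.

```python
def landfill_accumulation(tax_rate, years=20.0, dt=0.25):
    """Minimal stock-and-flow sketch: a packaging-material tax raises the
    ecodesign adoption share, which lowers the waste flow to landfill.
    All parameter values are illustrative assumptions."""
    packaging_flow = 100.0      # kt/year of packaging placed on the market
    adoption = 0.05             # initial ecodesign adoption share
    landfill = 0.0              # landfill stock, kt
    for _ in range(int(years / dt)):
        # adoption rises toward a tax-dependent saturation level
        target = min(0.9, 0.1 + 0.02 * tax_rate)
        adoption += 0.3 * (target - adoption) * dt
        to_landfill = packaging_flow * (1.0 - 0.5 * adoption)  # recycling offset
        landfill += to_landfill * dt
    return landfill

low_tax = landfill_accumulation(tax_rate=5.0)
high_tax = landfill_accumulation(tax_rate=40.0)
print(low_tax > high_tax)   # higher tax -> less waste accumulated in landfill
```

Even in this toy version, the feedback structure reproduces the paper's conclusion that only substantially higher tax rates shift accumulation noticeably.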

  16. The International Atomic Energy Agency software package for the analysis of scintigraphic renal dynamic studies: a tool for the clinician, teacher, and researcher.

    Science.gov (United States)

    Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio

    2011-01-01

    Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals cost-free access to the most recent developments in the field. The software package is a step towards harmonization and standardization, and its embedded functionalities render it a suitable tool for education, research, and receiving distant experts' opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals being introduced to dynamic kidney studies through selected teaching case studies. The software facilitates a better understanding by letting the user practically explore different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection, automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the post-micturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and numerical outputs. The software package is undergoing quality assurance procedures to verify its accuracy and inter-user reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide. Copyright © 2011 Elsevier Inc. All rights reserved.
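    The integral method for differential renal function mentioned above can be sketched in a few lines: the background-subtracted uptake of each kidney is summed over an early time window and expressed as a percentage of the total. The curves and the window below are illustrative assumptions, not the package's actual defaults.

```python
import numpy as np

def differential_renal_function(left, right, t, t_start=60.0, t_end=120.0):
    """Differential renal function (DRF) by the integral method: the ratio
    of background-subtracted kidney uptake summed over an early interval
    (here 60-120 s) before significant tracer outflow."""
    mask = (t >= t_start) & (t <= t_end)
    left_area = np.sum(left[mask])       # uniform sampling: sums suffice
    right_area = np.sum(right[mask])
    total = left_area + right_area
    return 100.0 * left_area / total, 100.0 * right_area / total

# Synthetic background-subtracted uptake curves (counts/s); illustrative only
t = np.arange(0.0, 300.0, 5.0)
left = 40.0 * (1.0 - np.exp(-t / 80.0))    # normally functioning kidney
right = 24.0 * (1.0 - np.exp(-t / 80.0))   # impaired kidney, lower uptake
drf_left, drf_right = differential_renal_function(left, right, t)
print(round(drf_left, 1), round(drf_right, 1))   # → 62.5 37.5
```

The Rutland-Patlak alternative instead regresses kidney counts (normalized by blood-pool counts) against the integrated blood-pool curve and takes the slope as the uptake measure.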

  17. PC-CIMACT. A near real time materials accountancy software package for use on an IBM or compatible PC

    International Nuclear Information System (INIS)

    Williams, D.E.; Gale, R.

    1990-03-01

    This report describes the 'PC-CIMACT' Near Real Time Materials Accountancy computer package. It has been derived from 'CIMACT', which is in daily use at the UKAEA's Dounreay Nuclear Power Establishment. The scope of the package is presented, together with the statistical analyses it encompasses. Several of the analyses are illustrated by the treatment of data from a simulated reprocessing campaign. A user guide providing detailed instructions is also included. (author)

  18. Python package for model STructure ANalysis (pySTAN)

    Science.gov (United States)

    Van Hoey, Stijn; van der Kwast, Johannes; Nopens, Ingmar; Seuntjens, Piet

    2013-04-01

    methods on a fair basis. We developed and present pySTAN (python framework for STructure ANalysis), a python package containing a set of functions for model structure evaluation to support the analysis of (hydrological) model structures. A selected set of algorithms for optimization, uncertainty and sensitivity analysis is currently available, together with a set of evaluation (objective) functions and input distributions to sample from. The methods are implemented in a model-independent way, and the python language provides the wrapper functions needed to administer external model codes. Different objective functions can be considered simultaneously, including both statistical metrics and more hydrology-specific metrics. By using reStructuredText (the sphinx documentation generator) and Python documentation strings (docstrings), the generation of manual pages is semi-automated, and a specific environment is available to enhance both the readability and transparency of the code. This enables a larger group of users to apply and compare these methods and to extend the functionalities.

  19. Regression Models for Description of Roasted Ground Coffee Powder Color Change during Secondary Shelf-Life as Related to Storage Conditions and Packaging Material

    Directory of Open Access Journals (Sweden)

    Maja Benković

    2018-02-01

    Besides sensory attributes, color is a parameter affecting consumers' perception of a powdered coffee product or brew. The aim of this study was to develop and compare non-linear and linear regression models for the description of experimentally determined color changes during 6 months of storage in two different packaging materials. Model parameters were estimated using two software packages, Eureqa Formulize (Nutonian, Inc., Boston, MA, USA) and Statistica 10.0 (StatSoft, Palo Alto, CA, USA), and compared based on their R² goodness of fit. Both the non-linear and linear models used in this study pointed to a significant influence of intrinsic (sample moisture content) and external (relative humidity (RH) and temperature) factors on ground roasted coffee color change. The non-linear model was the most suitable for describing color changes during storage. Based on the lower moisture sorption of the sample packed in the triplex bag, triplex packaging is proposed as the more suitable material.
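    The model-comparison step can be sketched as follows: fit a linear and a non-linear (first-order) storage model to the same color-change series and compare R². The data below are synthetic, generated from an assumed saturating curve, not the study's measurements.

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic total color change (dE) over 6 months of storage; illustrative only
months = np.arange(0.0, 6.5, 0.5)
rng = np.random.default_rng(3)
dE = 12.0 * (1.0 - np.exp(-0.6 * months)) + rng.normal(0.0, 0.3, len(months))

# Linear model dE = a + b*t, by ordinary least squares
b_lin, a_lin = np.polyfit(months, dE, 1)
lin_pred = a_lin + b_lin * months

# Non-linear first-order model dE = A*(1 - exp(-k*t)),
# fitted by a grid search over k with A solved by least squares
def fit_amplitude(k):
    basis = (1.0 - np.exp(-k * months))[:, None]
    return np.linalg.lstsq(basis, dE, rcond=None)[0][0]

ks = np.linspace(0.05, 2.0, 400)
k_best = max(ks, key=lambda k: r_squared(
    dE, fit_amplitude(k) * (1.0 - np.exp(-k * months))))
nl_pred = fit_amplitude(k_best) * (1.0 - np.exp(-k_best * months))

print(r_squared(dE, lin_pred) < r_squared(dE, nl_pred))  # non-linear fits better
```

A saturating color change naturally favors the first-order form over a straight line, which mirrors the study's conclusion.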

  20. An Integrated Package of Neuromusculoskeletal Modeling Tools in Simulink (TM)

    National Research Council Canada - National Science Library

    Davoodi, R

    2001-01-01

    .... Blocks representing the skeletal linkage, sensors, muscles, and neural controllers are developed using separate software tools and integrated in the powerful simulation environment of Simulink (Mathworks Inc., USA...

  1. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    In computer science, the metamodelling approach is becoming increasingly popular for the development of software systems. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. This analysis reveals invariant physical structures, which we propose to use as the basic abstractions of the physical metamodel: a system of geometrical objects that allows a spatial structure of physical models to be built and a distribution of physical properties to be set. Different mathematical methods can then be applied to such a geometry of distributed physical properties. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  2. The use of the SIWARE model package in water management scenario studies, Egypt

    NARCIS (Netherlands)

    Abdel Gawad, S.T.; Abdel Khalek, M.A.; Smit, M.F.R.; Abdel Dayem, M.S.

    1995-01-01

    The SIWARE model package, which stands for Simulation of Water Management in the Arab Republic of Egypt, has been developed as a decision-support system for the Irrigation and Planning Sectors of the Egyptian Ministry of Public Works and Water Resources. The package was implemented within the

  3. spate: An R Package for Spatio-Temporal Modeling with a Stochastic Advection-Diffusion Process

    Directory of Open Access Journals (Sweden)

    Fabio Sigrist

    2015-02-01

    This package aims at providing tools for simulating and modeling spatio-temporal processes using an SPDE-based approach. The package contains functions for obtaining parametrizations, such as propagator or innovation covariance matrices, of the spatio-temporal model. This allows for building customized hierarchical Bayesian models that use the SPDE-based model at the process stage. The functions of the package then provide the computationally efficient algorithms needed for doing inference with the hierarchical model. Furthermore, an adaptive Markov chain Monte Carlo (MCMC) algorithm implemented in the package can be used for inference without any additional modeling; this function is flexible and allows for application-specific customizing. The MCMC algorithm supports data that follow a Gaussian or a censored distribution with point mass at zero. Spatio-temporal covariates can be included in the model through a regression term.

  4. Bayesian Model Averaging Employing Fixed and Flexible Priors: The BMS Package for R

    Directory of Open Access Journals (Sweden)

    Stefan Zeugner

    2015-11-01

    This article describes the BMS (Bayesian model sampling) package for R, which implements Bayesian model averaging for linear regression models. The package excels in allowing for a variety of prior structures, among them the "binomial-beta" prior on the model space and the so-called "hyper-g" specifications for Zellner's g prior. The BMS package also allows the user to specify her own model priors and offers the possibility of subjective inference by setting "prior inclusion probabilities" according to the researcher's beliefs. Graphical analysis of results is provided by numerous built-in plot functions for posterior densities and predictive densities, and by graphical illustrations for comparing results under different prior settings. Finally, the package provides full enumeration of the model space for small-scale problems, as well as two efficient MCMC (Markov chain Monte Carlo) samplers that sort through the model space when the number of potential covariates is large.
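    The full-enumeration mode can be illustrated with a small pure-Python sketch (not BMS itself) using Zellner's g prior with g = n and a uniform model prior; the marginal-likelihood expression follows the standard g-prior form, and the data are simulated.

```python
import itertools
import numpy as np

def bma_enumerate(X, y, g=None):
    """Bayesian model averaging for linear regression by full enumeration,
    using Zellner's g prior (g = n, the unit-information choice, by default)
    and a uniform prior over the model space."""
    n, p = X.shape
    g = float(n if g is None else g)
    yc = y - y.mean()
    models, log_ml = [], []
    for k in range(p + 1):
        for subset in itertools.combinations(range(p), k):
            if subset:
                Xs = X[:, subset] - X[:, subset].mean(axis=0)
                beta, *_ = np.linalg.lstsq(Xs, yc, rcond=None)
                r2 = 1.0 - np.sum((yc - Xs @ beta) ** 2) / np.sum(yc ** 2)
            else:
                r2 = 0.0
            # log Bayes factor of the model against the null model
            lml = 0.5 * (n - 1 - k) * np.log(1.0 + g) \
                - 0.5 * (n - 1) * np.log(1.0 + g * (1.0 - r2))
            models.append(subset)
            log_ml.append(lml)
    w = np.exp(np.array(log_ml) - max(log_ml))
    w /= w.sum()                                  # posterior model probabilities
    incl = np.array([sum(w[i] for i, m in enumerate(models) if j in m)
                     for j in range(p)])          # posterior inclusion probabilities
    return models, w, incl

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=120)  # only x0 and x2 matter
models, w, incl = bma_enumerate(X, y)
print(np.round(incl, 2))  # high inclusion for x0 and x2, low for x1 and x3
```

With 2^p = 16 candidate models, enumeration is trivial; the MCMC samplers mentioned in the abstract take over when p makes enumeration infeasible.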

  5. Utilizing Visual Effects Software for Efficient and Flexible Isostatic Adjustment Modelling

    Science.gov (United States)

    Meldgaard, A.; Nielsen, L.; Iaffaldano, G.

    2017-12-01

    The isostatic adjustment signal generated by transient ice sheet loading is an important indicator of past ice sheet extent and the rheological constitution of the interior of the Earth. Finite element modelling has proved to be a very useful tool in these studies. We present a simple numerical model for 3D viscoelastic Earth deformation and a new approach to the design of such models utilizing visual effects software designed for the film and game industry. The software package Houdini offers an assortment of optimized tools and libraries which greatly facilitate the creation of efficient numerical algorithms. In particular, we make use of Houdini's procedural workflow, the SIMD programming language VEX, Houdini's sparse matrix creation and inversion libraries, an inbuilt tetrahedralizer for grid creation, and the user interface, which facilitates effortless manipulation of 3D geometry. We mitigate many of the time-consuming steps associated with authoring efficient algorithms from scratch while still keeping the flexibility that may be lost with dedicated commercial finite element programs. We test the efficiency of the algorithm by comparing simulation times with off-the-shelf solutions from the Abaqus software package. The algorithm is tailored for the study of local isostatic adjustment patterns in close vicinity to present ice sheet margins. In particular, we wish to examine possible causes for the considerable spatial differences in uplift magnitude which are apparent from field observations in these areas. Such features, with spatial scales of tens of kilometres, are not resolvable with current global isostatic adjustment models, and may require the inclusion of local topographic features. We use the presented algorithm to study a near-field area where field observations are abundant, namely Disko Bay in West Greenland, with the intention of constraining Earth parameters and ice thickness. In addition, we assess how local

  6. The laws of software process a new model for the production and management of software

    CERN Document Server

    Armour, Phillip G

    2003-01-01

    Contents: The Nature of Software and The Laws of Software Process; A Brief History of Knowledge; The Characteristics of Knowledge Storage Media; The Nature of Software Development; The Laws of Software Process and the Five Orders of Ignorance; The Laws of Software Process; The First Law of Software Process; The Corollary to the First Law of Software Process; The Reflexive Creation of Systems and Processes; The Lemma of Eternal Lateness; The Second Law of Software Process; The Rule of Process Bifurcation; The Dual Hypotheses of Knowledge Discovery; Armour's Observation on Software Process; The Third Law of Software Process (also kn

  7. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and to Generation IV reactor development studies.

  8. Coevolution of variability models and related software artifacts

    DEFF Research Database (Denmark)

    Passos, Leonardo; Teixeira, Leopoldo; Dinztner, Nicolas

    2015-01-01

    models coevolve with other artifact types, we study a large and complex real-world variant-rich software system: the Linux kernel. Specifically, we extract variability-coevolution patterns capturing changes in the variability model of the Linux kernel with subsequent changes in Makefiles and C source...

  9. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  10. Invention software support by integrating function and mathematical modeling

    NARCIS (Netherlands)

    Chechurin, L.S.; Wits, Wessel Willems; Bakker, H.M.

    2015-01-01

    New idea generation is imperative for successful product innovation and technology development. This paper presents the development of a novel type of invention support software. The support tool integrates both function modeling and mathematical modeling, thereby enabling quantitative analyses on a

  11. Application of Process Modeling in a Software- Engineering Course

    Directory of Open Access Journals (Sweden)

    Gabriel Alberto García Mireles

    2001-11-01

    Coordination in a software development project is a critical issue in delivering a successful software product within the constraints of time, functionality and budget agreed upon with the customer. One strategy for approaching this problem is the use of process modeling to document, evaluate, and redesign the software development process. The appraisal, from a process perspective, of the projects done in the Engineering and Methodology course of a program given at the Ensenada Center of Scientific Research and Higher Education (CICESE) facilitated the identification of strengths and weaknesses in the development process used. This paper presents the evaluation of the practical portion of the course, the improvements made, and the preliminary results of using the process approach in the analysis phase of a software-development project.

  12. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. gamboostLSS: An R Package for Model Building and Variable Selection in the GAMLSS Framework

    Directory of Open Access Journals (Sweden)

    Benjamin Hofner

    2016-10-01

    Generalized additive models for location, scale and shape are a flexible class of regression models that allow multiple parameters of a distribution function, such as the mean and the standard deviation, to be modeled simultaneously. With the R package gamboostLSS, we provide a boosting method to fit these models. Variable selection and model choice are naturally available within this regularized regression framework. To introduce and illustrate the R package gamboostLSS and its infrastructure, we use a data set on stunted growth in India. In addition to the specification and application of the model itself, we present a variety of convenience functions, including methods for tuning parameter selection, prediction and visualization of results. The package gamboostLSS is available from the Comprehensive R Archive Network (CRAN) at https://CRAN.R-project.org/package=gamboostLSS.

  14. Applicability of Simplified Simulation Models for Perforation-Mediated Modified Atmosphere Packaging of Fresh Produce

    Directory of Open Access Journals (Sweden)

    Min-Ji Kwon

    2013-01-01

    Comprehensive mass balances, expressed as differential equations for gas diffusion and hydraulic convection through the package perforation, gas permeation through the polymeric film, and produce respiration, have commonly been used to predict the atmosphere of perforated fresh-produce packages. However, the predictions often suffer from instability. To circumvent this problem, several researchers have previously developed and investigated a simplified diffusion model that omits convective gas transfer, as well as empirical models based on experimental mass transfer data. This study investigated the potential and limitations of the simplified diffusion model and two empirical models for predicting the atmosphere in perforated produce packages. The simplified diffusion model satisfactorily estimated the atmosphere inside the perforated packages of fresh produce under the aerobic conditions examined. Published empirical models of the mass transfer coefficients of the perforation appear valid only for the measured conditions and should therefore be used carefully for that specific purpose.
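    The simplified (diffusion-only) balance can be sketched as a single ordinary differential equation for package O2, with Fickian exchange through the perforation against produce respiration. The parameter values below are illustrative assumptions, not taken from the paper, and film permeation is neglected for brevity.

```python
import numpy as np

def perforated_package_o2(hours=48.0, dt=0.01):
    """Simplified diffusion-only model (convective transfer omitted) of O2
    inside a perforated produce package: Fickian diffusion through the
    perforation balanced against respiratory O2 consumption.
    All parameter values are illustrative assumptions."""
    V = 2.0e-3                    # free package volume, m^3
    D = 2.0e-5                    # O2 diffusivity in air, m^2/s
    A = np.pi * (0.2e-3) ** 2     # perforation area (0.2 mm radius), m^2
    L = 40.0e-6                   # perforation (film) thickness, m
    R = 1.0e-7                    # respiratory O2 consumption, mol/s
    c_out = 8.6                   # ambient O2 concentration, mol/m^3
    c = c_out                     # package starts at ambient air
    for _ in range(int(hours / dt)):
        flux_in = D * A / L * (c_out - c)      # mol/s through the perforation
        c += (flux_in - R) / V * dt * 3600.0   # dt given in hours
    return c / c_out * 20.9                    # O2 as % of one atmosphere

o2 = perforated_package_o2()
print(round(o2, 1))   # settles below the ambient 20.9 % O2
```

The package O2 relaxes toward a steady state where diffusive inflow equals respiration, which is the aerobic regime under which the abstract reports the simplified model to be satisfactory.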

  15. GlycReSoft: a software package for automated recognition of glycans from LC/MS data.

    Directory of Open Access Journals (Sweden)

    Evan Maxwell

    Glycosylation modifies the physicochemical properties and protein-binding functions of glycoconjugates. These modifications are biosynthesized in the endoplasmic reticulum and Golgi apparatus by a series of enzymatic transformations that are under complex control. As a result, mature glycans on a given site are heterogeneous mixtures of glycoforms. This gives rise to a spectrum of adhesive properties that strongly influences interactions with binding partners and the resultant biological effects. To understand the roles glycosylation plays in normal and disease processes, efficient structural analysis tools are necessary. In the field of glycomics, liquid chromatography/mass spectrometry (LC/MS) is used to profile the glycans present in a given sample. This technology enables comparison of glycan compositions and abundances among different biological samples, i.e., normal versus disease, normal versus mutant, etc. Manual analysis of glycan profiling LC/MS data is extremely time-consuming, and efficient software tools are needed to eliminate this bottleneck. In this work, we have developed a tool to computationally model LC/MS data to enable efficient profiling of glycans. Using LC/MS data deconvoluted by Decon2LS/DeconTools, we built a list of unique neutral masses corresponding to candidate glycan compositions, summarized over their various charge states, adducts and range of elution times. Our work aims to provide confident identification of true compounds in complex data sets that are not amenable to manual interpretation, a capability that is an essential part of glycomics workflows. We demonstrate this tool, GlycReSoft, using an LC/MS data set on tissue-derived heparan sulfate oligosaccharides. The software, code and a test data set are publicly archived under an open-source license.
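    The neutral-mass grouping step described above can be sketched as follows. Deconvolution (here assumed already done, e.g. by Decon2LS) yields (m/z, charge) pairs; these are converted to neutral masses and matched to candidate compositions within a ppm tolerance. The candidate list, tolerance and observations below are hypothetical, and a negative-ion [M - zH]^z- convention is assumed.

```python
PROTON = 1.007276  # proton mass, Da

def neutral_mass(mz, charge):
    """Neutral monoisotopic mass from an observed m/z and charge state,
    assuming negative-mode [M - zH]^z- ions."""
    return mz * charge + charge * PROTON

def match_compositions(observed, candidates, ppm_tol=10.0):
    """Group deconvoluted (m/z, z) observations into neutral masses and
    match them to candidate glycan compositions within a ppm tolerance."""
    matches = {}
    for mz, z in observed:
        mass = neutral_mass(mz, z)
        for name, ref in candidates.items():
            if abs(mass - ref) / ref * 1e6 <= ppm_tol:
                matches.setdefault(name, []).append((mz, z))
    return matches

# Hypothetical candidate neutral masses (Da) and observations across charge states
candidates = {'HexNAc2Hex5': 1234.4333, 'HexNAc2Hex6': 1396.4861}
observed = [(616.2093, 2),   # same compound observed at z = 2 ...
            (410.4704, 3),   # ... and at z = 3
            (697.2358, 2)]
print(sorted(match_compositions(observed, candidates)))
# → ['HexNAc2Hex5', 'HexNAc2Hex6']
```

In the real tool this grouping is additionally collapsed over adducts and elution-time ranges before abundances are compared across samples.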

  16. Aspects of system modelling in Hardware/Software partitioning

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1996-01-01

This paper addresses fundamental aspects of system modelling and partitioning algorithms in the area of Hardware/Software Codesign. Three basic system models for partitioning are presented and the consequences of partitioning according to each of these are analyzed. The analysis shows the importance of making a clear distinction between the model used for partitioning and the model used for evaluation. It also illustrates the importance of having a realistic hardware model such that hardware sharing can be taken into account. Finally, the importance of integrating scheduling and allocation ...

  17. Software Piracy Detection Model Using Ant Colony Optimization Algorithm

    Science.gov (United States)

    Astiqah Omar, Nor; Zakuan, Zeti Zuryani Mohd; Saian, Rizauddin

    2017-06-01

The Internet enables information to be accessed anytime and anywhere. This creates an environment in which information can be easily copied. Easy access to the Internet is one of the factors contributing to piracy in Malaysia as well as the rest of the world. According to the BSA Global Software Survey (Compliance Gap) conducted in 2013, 43 percent of the software installed on PCs around the world was not properly licensed, and the commercial value of the unlicensed installations worldwide was reported to be 62.7 billion. Piracy can happen anywhere, including universities. Malaysia, like other countries, faces piracy committed by university students. Piracy in universities concerns the theft of intellectual property: it can take the form of software piracy, music piracy, movie piracy and piracy of intellectual materials such as books, articles and journals. This puts the property of intellectual property owners in jeopardy. This study developed a classification model for detecting software piracy. The model was built using a swarm intelligence algorithm, the Ant Colony Optimization algorithm, with training data collected in a study conducted at Universiti Teknologi MARA (Perlis). Experimental results show that the model's detection accuracy is better than that of the J48 algorithm.
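As a rough illustration of how an Ant Colony Optimization classifier works, the following Ant-Miner-style sketch uses invented toy data and a single-term rule language (this is not the paper's model or its survey data):

```python
import random

# Toy ACO rule discovery: ants pick a rule term with probability
# proportional to pheromone, rule quality reinforces pheromone,
# and the best rule found is kept. Data and attributes are invented.

random.seed(1)
# hypothetical survey records: attributes -> label (1 = pirates software)
data = [({"uses_p2p": 1, "knows_law": 0}, 1),
        ({"uses_p2p": 1, "knows_law": 1}, 1),
        ({"uses_p2p": 0, "knows_law": 1}, 0),
        ({"uses_p2p": 0, "knows_law": 0}, 0)]
terms = [("uses_p2p", 1), ("uses_p2p", 0), ("knows_law", 1), ("knows_law", 0)]
pheromone = {t: 1.0 for t in terms}

def quality(term):
    """Accuracy of the rule: IF attr == val THEN label 1 ELSE label 0."""
    attr, val = term
    return sum(((1 if rec[attr] == val else 0) == label)
               for rec, label in data) / len(data)

best = None
for _ in range(20):                        # each iteration is one "ant"
    r, acc, choice = random.uniform(0, sum(pheromone.values())), 0.0, terms[-1]
    for t in terms:                        # roulette-wheel selection
        acc += pheromone[t]
        if r <= acc:
            choice = t
            break
    q = quality(choice)
    pheromone[choice] *= 1.0 + q           # reinforce good terms
    if best is None or q > best[1]:
        best = (choice, q)

print(best)  # best rule term found and its training accuracy
```

Real Ant-Miner builds multi-term rules and prunes them; the sketch keeps only the pheromone-guided construction loop.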

  18. Modelling and simulating the transitory regimes in NPP using the MMS package programs

    International Nuclear Information System (INIS)

    Prisecaru, I.; Dupleac, Daniel; Constantinescu, Adrian Cornel

    2003-01-01

This paper briefly presents the work of the modelling and simulation group at the Nuclear Power Plant Department of the Faculty of Power Plant Engineering, 'Politehnica' University of Bucharest, in using the Modular Modeling System (MMS) package programs for the simulation of NPP transitory regimes. Nuclear power plants are large, non-linear systems with numerous interactions among their components. In the analysis of such complex systems, dynamic simulation is recognized as a powerful method of keeping track of the myriad of interactions. The MMS is a simulation tool with built-in models for plant components, using a modular approach to dynamic simulation. The MMS software modules were developed to correspond to plant components that are familiar to power plant engineers. The interface specifications of the modules were defined so that the modules can be interconnected analogously to components in the actual plant. For some components, several modules of differing complexity are available. These alternative modules allow the user to choose the module appropriate to the application, i.e. a detailed model or a more economical model with less detail. The modular nature of the MMS allows the user to tailor the simulation to the complexity of the application and to develop independent subsystems that can be integrated into a larger simulation. The MMS module library contains modules for components of fossil and nuclear power plants. Each module is a mathematical model of a type of plant component formulated from first principles. The MMS uses a simulation language that provides features to simplify the development of simulations. Features important to the development of the MMS are the macro capability, automatic sorting of modeling equations, and integration algorithms. The macro capability is used to express the modeling equations for an MMS module, since modules may be used more than once in the same simulation ...

  19. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  20. Salvus: A flexible open-source package for waveform modelling and inversion from laboratory to global scales

    Science.gov (United States)

    Afanasiev, M.; Boehm, C.; van Driel, M.; Krischer, L.; May, D.; Rietmann, M.; Fichtner, A.

    2016-12-01

Recent years have been witness to the application of waveform inversion to new and exciting domains, ranging from non-destructive testing to global seismology. Often, each new application brings with it novel wave propagation physics, spatial and temporal discretizations, and models of variable complexity. Adapting existing software to these novel applications often requires a significant investment of time, and acts as a barrier to progress. To combat these problems we introduce Salvus, a software package designed to solve large-scale full-waveform inverse problems, with a focus on both flexibility and performance. Based on a high-order finite (spectral) element discretization, we have built Salvus to work on unstructured quad/hex meshes in both 2 and 3 dimensions, with support for P1-P3 bases on triangles and tetrahedra. A diverse (and expanding) collection of wave propagation physics is supported (e.g. coupled solid-fluid). With a focus on the inverse problem, functionality is provided to ease integration with internal and external optimization libraries. Additionally, a Python-based meshing package is included to simplify the generation and manipulation of regional- to global-scale Earth models (quad/hex), with interfaces available to external mesh generators for complex engineering-scale applications (quad/hex/tri/tet). Finally, to ensure that the code remains accurate and maintainable, we build upon software libraries such as PETSc and Eigen, and follow modern software design and testing protocols. Salvus bridges the gap between research and production codes with a design based on C++ mixins and Python wrappers that separates the physical equations from the numerical core. This allows domain scientists to add new equations using a high-level interface, without having to worry about optimized implementation details. Our goal in this presentation is to introduce the code, show several examples across the scales, and discuss some of the extensible design points.
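The mixin design mentioned above can be illustrated with a toy sketch (class names and the equation are invented; this is not the Salvus API): the physics mixin supplies only the right-hand side, while the numerical core owns time stepping.

```python
# Toy sketch of separating physics from the numerical core via mixins.
# Class names and the toy equation are invented, not Salvus code.

class NumericalCore:
    """Time stepping shared by every physics."""
    def step(self, u, dt):
        return u + dt * self.rhs(u)   # forward Euler as a stand-in

class DecayPhysics:
    """Physics mixin: supplies only the right-hand side du/dt = -c*u."""
    c = 2.0
    def rhs(self, u):
        return -self.c * u

class DecaySolver(DecayPhysics, NumericalCore):
    """New equations are added by swapping the physics mixin only."""
    pass

solver = DecaySolver()
print(solver.step(1.0, 0.1))  # -> 0.8
```

The point of the design is that `NumericalCore` never changes when a new `rhs` is introduced, which is the separation the abstract describes.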

  1. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    Science.gov (United States)

    Atkinson, Colin

The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods, such as the Booch method and OMT, supported a number of different diagram types (e.g. structural, behavioral, operational), and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages, such as the UML and SysML, are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.

  2. Seamless Method- and Model-based Software and Systems Engineering

    Science.gov (United States)

    Broy, Manfred

Today, engineering software-intensive systems is still more or less handicraft or at most at the level of manufacturing. Many steps are done ad hoc and not in a fully systematic way. Applied methods, if any, are not scientifically justified or backed by empirical data, and as a result carrying out large software projects is still an adventure. However, there is no reason why the development of software-intensive systems cannot be done in the future with the same precision and scientific rigor as in established engineering disciplines. To do that, however, a number of scientific and engineering challenges have to be mastered. The first aims at a deep understanding of the essentials of carrying out such projects, which includes appropriate models and effective management methods. What is needed is a portfolio of models and methods, coming together with comprehensive tool support, as well as deep insights into the obstacles of developing software-intensive systems, and a portfolio of established and proven techniques and methods with clear profiles and rules that indicate when each method is ready for application. In the following we argue that there is scientific evidence and enough research so far to be confident that solid engineering of software-intensive systems can be achieved in the future. However, quite a number of scientific research problems still have to be solved.

  3. A Brunswik lens model of consumer health judgments of packaged foods

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund

    2014-01-01

Consumer health judgments of packaged food were compared with an objective healthfulness criterion using a Brunswik lens model. Consumer judgments were obtained from a representative consumer sample (N = 1329) who evaluated the healthfulness of 198 packaged food products. The objective healthfulness criterion was calculated for each product according to its specific nutrition values using a validated nutrition profile. The lens model included explicit cues such as nutrition values, nutrition and health claims, food category, and brand, and implicit cues such as packaging design and category ...

  4. A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet

    NARCIS (Netherlands)

    Mazzia, F.; Cash, J.R.; Soetaert, K.

    2012-01-01

In this paper we present the R package deTestSet, which includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition, it includes six new codes to solve initial value ...
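As a flavor of why stiff initial value problems need the dedicated implicit codes a test set like this exercises, consider a generic textbook stiff problem (not one of the package's test problems) integrated with backward Euler, which remains stable at step sizes where forward Euler would blow up:

```python
import math

# Stiff model problem y' = -1000*(y - cos(t)), y(0) = 0, integrated with
# backward Euler. For this linear problem the implicit update can be
# solved in closed form, so no Newton iteration is needed.

def backward_euler(t_end, n_steps):
    h, t, y = t_end / n_steps, 0.0, 0.0
    for _ in range(n_steps):
        t += h
        # implicit update: y_new = y + h * (-1000 * (y_new - cos(t)))
        y = (y + 1000.0 * h * math.cos(t)) / (1.0 + 1000.0 * h)
    return y

# The solution is attracted to ~cos(t) almost immediately; with h = 0.01
# forward Euler would have amplification factor |1 - 10| = 9 and diverge.
print(backward_euler(1.0, 100))   # close to cos(1.0) ≈ 0.5403
```

This is the behavior that separates stiff solvers (BDF, Radau-type methods, as in deTestSet's codes) from explicit ones.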

  5. truncSP: An R Package for Estimation of Semi-Parametric Truncated Linear Regression Models

    Directory of Open Access Journals (Sweden)

    Maria Karlsson

    2014-05-01

Full Text Available Problems with truncated data occur in many areas, complicating estimation and inference. Regarding linear regression models, the ordinary least squares estimator is inconsistent and biased for these types of data and is therefore unsuitable for use. Alternative estimators, designed for the estimation of truncated regression models, have been developed. This paper presents the R package truncSP. The package contains functions for the estimation of semi-parametric truncated linear regression models using three different estimators: the symmetrically trimmed least squares, quadratic mode, and left truncated estimators, all of which have been shown to have good asymptotic and finite sample properties. The package also provides functions for the analysis of the estimated models. Data from the environmental sciences are used to illustrate the functions in the package.
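The OLS bias under truncation that motivates the package can be demonstrated with a small simulation (toy data; this does not use truncSP's estimators):

```python
import random

# Simulate y = 1 + 2x + e, then truncate to y > 0: OLS on the truncated
# sample underestimates the slope, which is the bias truncSP addresses.

random.seed(42)

def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

full_x, full_y = [], []
for _ in range(20000):
    x = random.uniform(-2, 2)
    y = 1.0 + 2.0 * x + random.gauss(0, 1)   # true slope = 2
    full_x.append(x)
    full_y.append(y)

trunc = [(x, y) for x, y in zip(full_x, full_y) if y > 0]  # truncation
tx, ty = zip(*trunc)

print(ols_slope(full_x, full_y))       # ~2.0 on the full sample
print(ols_slope(list(tx), list(ty)))   # noticeably below 2 after truncation
```

Estimators such as symmetrically trimmed least squares are designed to remove exactly this attenuation.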

  6. Implementation of a Unified Constitutive Model into the ABAQUS Finite Element Package

    National Research Council Canada - National Science Library

    Wescott, R

    1999-01-01

    Unified constitutive models have previously been developed at AMRL and implemented into the PAFEC and ABAQUS Finite Element packages to predict the stress-strain response of structures that undergo...

  7. A Business Maturity Model of Software Product Line Engineering

    OpenAIRE

    Ahmed, Faheem; Capretz, Luiz Fernando

    2015-01-01

In the recent past, software product line engineering has become one of the most promising practices in the software industry, with the potential to substantially increase software development productivity. The software product line engineering approach spans the dimensions of business, architecture, software engineering process, and organization. The increasing popularity of software product line engineering in the software industry necessitates a process maturity evaluation methodology. According ...

  8. Advanced quality prediction model for software architectural knowledge sharing

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris; Tang, Antony; Xu, Lai

    In the field of software architecture, a paradigm shift is occurring from describing the outcome of architecting process to describing the Architectural Knowledge (AK) created and used during architecting. Many AK models have been defined to represent domain concepts and their relationships, and

  9. Advances in Games Technology: Software, Models, and Intelligence

    Science.gov (United States)

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  10. Software-engineering-based model for mitigating Repetitive Strain ...

    African Journals Online (AJOL)

The incorporation of Information and Communication Technology (ICT) in virtually all facets of human endeavour has fostered the use of computers. This has induced Repetitive Strain Injury (RSI) in continuous and persistent computer users. This paper proposes a software engineering model capable of enforcing RSI-mitigating breaks ...

  11. Software Design Modelling with Functional Petri Nets | Bakpo ...

    African Journals Online (AJOL)

In this paper, an equivalent functional Petri Net (FPN) model is developed for each of the three constructs of structured programs, and an FPN software prototype is proposed for the conventional programming construct, the if-then-else statement. The motivating idea is essentially to show that FPNs could be used as an alternative ...

  12. An Evaluation of ADLs on Modeling Patterns for Software Architecture

    NARCIS (Netherlands)

    Waqas Kamal, Ahmad; Avgeriou, Paris

    2007-01-01

    Architecture patterns provide solutions to recurring design problems at the architecture level. In order to model patterns during software architecture design, one may use a number of existing Architecture Description Languages (ADLs), including the UML, a generic language but also a de facto

  13. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    Science.gov (United States)

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
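To illustrate what "response-adaptive" means, here is a sketch of a classical rule for binary outcomes, the randomized play-the-winner urn (a much simpler setting than the censored time-to-event designs RARtool targets, and not RARtool's code):

```python
import random

# Randomized play-the-winner: assignment is drawn from an urn; a success
# adds a ball of the same arm, a failure adds a ball of the other arm,
# so allocation drifts toward the better-performing treatment.

random.seed(7)

def rpw_trial(n_patients, p_success=(0.7, 0.3)):
    urn = [0, 1]                       # one ball per treatment arm
    assigned = [0, 0]
    for _ in range(n_patients):
        arm = random.choice(urn)
        assigned[arm] += 1
        success = random.random() < p_success[arm]
        urn.append(arm if success else 1 - arm)
    return assigned

counts = rpw_trial(1000)
print(counts)  # allocation drifts toward the better arm (arm 0)
```

Optimal-allocation designs like those RARtool computes target specific allocation ratios rather than this simple urn dynamic, but the adaptive principle is the same.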

  14. GW-SEM: A Statistical Package to Conduct Genome-Wide Structural Equation Modeling.

    Science.gov (United States)

    Verhulst, Brad; Maes, Hermine H; Neale, Michael C

    2017-05-01

Improving the accuracy of phenotyping through the use of advanced psychometric tools will increase the power to find significant associations with genetic variants and expand the range of possible hypotheses that can be tested on a genome-wide scale. Multivariate methods, such as structural equation modeling (SEM), are valuable in the phenotypic analysis of psychiatric and substance use phenotypes, but these methods have not been integrated into standard genome-wide association analyses because fitting a SEM at each single nucleotide polymorphism (SNP) along the genome was hitherto considered to be too computationally demanding. By developing a method that can efficiently fit SEMs, it is possible to expand the set of models that can be tested. This is particularly necessary in psychiatric and behavioral genetics, where the statistical methods are often handicapped by phenotypes with large components of stochastic variance. Due to the enormous amount of data that genome-wide scans produce, the statistical methods used to analyze the data are relatively elementary, do not directly correspond with the rich theoretical development, and lack the potential to test more complex hypotheses about the measurement of, and interaction between, comorbid traits. In this paper, we present a method to test the association of a SNP with multiple phenotypes or a latent construct on a genome-wide basis using a diagonally weighted least squares (DWLS) estimator for four common SEMs: a one-factor model, a one-factor residuals model, a two-factor model, and a latent growth model. We demonstrate that the DWLS parameters and p-values strongly correspond with the more traditional full information maximum likelihood parameters and p-values. We also present the timing of simulations and power analyses and a comparison with an existing multivariate GWAS software package.

  15. A CASE STUDY OF MODELING A TORUS IN DIFFERENT MODELING SOFTWARES

    Directory of Open Access Journals (Sweden)

    VELJKOVIĆ Milica

    2017-05-01

Full Text Available Modeling complex geometric shapes requires the use of appropriate software. This study analyzes the process of modeling with two different programs, AutoCAD and Rhinoceros. The aim is to demonstrate the similarities and differences between these programs when used to model a torus, a doubly curved geometric surface. The two modeling processes are compared in order to investigate the potential of these programs for modeling an architectural structure comprising a shell of the torus. After a detailed comparative analysis, the essential characteristics and shortcomings of the programs are highlighted and used to recommend the more appropriate one.
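Both programs model the same underlying surface; its standard parametrization (with R the distance from the axis to the tube center and r the tube radius) can be sampled directly, independent of any CAD package:

```python
import math

# Standard torus parametrization: u sweeps around the central axis,
# v sweeps around the tube cross-section.

def torus_point(R, r, u, v):
    """u, v in [0, 2*pi). Returns the (x, y, z) surface point."""
    x = (R + r * math.cos(v)) * math.cos(u)
    y = (R + r * math.cos(v)) * math.sin(u)
    z = r * math.sin(v)
    return (x, y, z)

# Outer equator point: u = 0, v = 0 lies at distance R + r from the axis.
print(torus_point(3.0, 1.0, 0.0, 0.0))  # -> (4.0, 0.0, 0.0)
```

Sampling this map over a (u, v) grid produces the quad mesh that a CAD program effectively builds when it revolves a circle about an axis.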

  16. Modeling the geographical studies with GeoGebra-software

    Directory of Open Access Journals (Sweden)

    Ionica Soare

    2010-01-01

Full Text Available Mathematical modeling in geography is one of the most important strategies for establishing the evolution and prediction of geographical phenomena. Models must have a simplified structure: they must be selective, structured, and suggestive, reflecting essential components while approximating reality. Models can be static or dynamic, and may be developed in a theoretical, symbolic, conceptual or mental way, or modeled mathematically. The present paper focuses on a virtual model that uses the GeoGebra software, free and available at www.geogebra.org, to establish new methods of geographical analysis in a dynamic, didactic way.

  17. Modeling and managing risk early in software development

    Science.gov (United States)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  18. Effects of deceptive packaging and product involvement on purchase intention: an elaboration likelihood model perspective.

    Science.gov (United States)

    Lammers, H B

    2000-04-01

From an Elaboration Likelihood Model perspective, it was hypothesized that postexposure awareness of deceptive packaging claims would have a greater negative effect on purchase-intention scores for consumers with low rather than high involvement with a product (n = 40). Undergraduates classified as either highly or lowly involved with M&Ms (ns = 20 and 20) examined either a deceptive or non-deceptive package design for M&Ms candy, were subsequently informed of the deception employed in the packaging, and finally rated their intention to purchase. As anticipated, deceived subjects who were low in involvement rated intention to purchase lower than their highly involved peers. Overall, the results attest to the robustness of the model and suggest that it has implications beyond advertising effects and into packaging effects.

  19. A software for parameter estimation in dynamic models

    Directory of Open Access Journals (Sweden)

    M. Yuceer

    2008-12-01

Full Text Available A common problem in dynamic systems is to determine the parameters of an equation used to represent experimental data. The goal is to determine the values of model parameters that provide the best fit to measured data, generally based on some type of least squares or maximum likelihood criterion. In the most general case, this requires the solution of a nonlinear and frequently non-convex optimization problem. Some of the available software lacks generality, while other packages do not provide ease of use. A user-interactive parameter estimation tool was needed for identifying kinetic parameters. In this work we developed an integration-based optimization approach to solve such problems. For easy implementation of the technique, a parameter estimation software package (PARES) has been developed in the MATLAB environment. When tested with extensive example problems from the literature, the suggested approach provided good agreement between predicted and observed data with relatively little computing time and few iterations.
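The integration-based estimation idea can be sketched minimally (toy model, synthetic data and a crude grid search standing in for a real optimizer; this is not the PARES code): simulate the model ODE for candidate parameter values and keep the value minimizing the squared residuals against the measurements.

```python
import math

# Fit the decay constant k in dy/dt = -k*y to "measured" data by
# integrating the model (Euler) for each candidate k and comparing
# the simulated trajectory with the data in a least-squares sense.

def simulate(k, y0, times, dt=0.001):
    ys, t, y = [], 0.0, y0
    for target in times:
        while t < target:
            y += dt * (-k * y)     # Euler step of the model ODE
            t += dt
        ys.append(y)
    return ys

times = [0.5, 1.0, 1.5, 2.0]
data = [math.exp(-0.8 * t) for t in times]   # synthetic data, true k = 0.8

best_k, best_sse = None, float("inf")
for i in range(1, 201):                      # grid search over k in (0, 2]
    k = i * 0.01
    sse = sum((m - d) ** 2 for m, d in zip(simulate(k, 1.0, times), data))
    if sse < best_sse:
        best_k, best_sse = k, sse

print(best_k)  # recovers a value close to the true k = 0.8
```

A production tool would replace the grid search with a nonlinear least-squares routine and the Euler integrator with a stiff-capable solver, but the integrate-then-compare structure is the same.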

  20. Evolutionary neural network modeling for software cumulative failure time prediction

    International Nuclear Information System (INIS)

    Tian Liang; Noore, Afzel

    2005-01-01

An evolutionary neural network modeling approach for software cumulative failure time prediction, based on a multiple-delayed-input single-output architecture, is proposed. A genetic algorithm is used to globally optimize the number of delayed input neurons and the number of neurons in the hidden layer of the neural network architecture. A modification of the Levenberg-Marquardt algorithm with Bayesian regularization is used to improve the ability to predict software cumulative failure time. The performance of the proposed approach has been compared using real-time control and flight dynamics application data sets. Numerical results show that both the goodness-of-fit and the next-step predictability of the proposed approach are more accurate in predicting software cumulative failure time than existing approaches.