WorldWideScience

Sample records for modeling software package

  1. dMODELS: A software package for modeling volcanic deformation

    Science.gov (United States)

    Battaglia, Maurizio

    2017-04-01

    dMODELS is a software package that includes the most common source models used to interpret deformation measurements near active volcanic centers. The emphasis is on estimating the parameters of analytical models of deformation by inverting data from the Global Positioning System (GPS), Interferometric Synthetic Aperture Radar (InSAR), tiltmeters and strainmeters. Source models include: (a) pressurized spherical, ellipsoidal and sill-like magma chambers in an elastic, homogeneous, flat half-space; (b) pressurized spherical magma chambers with topography corrections; and (c) the solutions for a dislocation (fracture) in an elastic, homogeneous, flat half-space. All of the equations have been extended to include deformation and strain within the Earth's crust (as opposed to only at the Earth's surface) and verified against finite element models. Although actual volcanic sources are not embedded cavities of simple shape, we assume that these models may reproduce the stress field created by the actual magma intrusion or hydrothermal fluid injection. The dMODELS software employs a nonlinear inversion algorithm to determine the best-fit parameters for the deformation source by searching for the minimum of the cost function χν² (chi-square per degree of freedom). The nonlinear inversion algorithm is a combination of local optimization (interior-point method) and random search. This approach is more efficient for hyper-parameter optimization than trials on a grid. The software has been developed using MATLAB, but compiled versions that can be run using the free MATLAB Compiler Runtime (MCR) module are available for Windows 64-bit operating systems. The MATLAB scripts and compiled files are open source and intended for teaching and research. The software package includes both functions for forward modeling and scripts for data inversion. A software demonstration will be available during the meeting. You are welcome to contact the author at mbattaglia@usgs.gov for
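
    The two-stage estimation strategy described above (random search to locate a promising region, followed by local optimization of the reduced chi-square) can be sketched in a few lines. The simplified point-source formula, parameter names and bounds below are illustrative assumptions, not dMODELS code, and SciPy's Nelder-Mead polisher stands in for the interior-point step:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative "Mogi-type" point source: vertical displacement at radial
# distance r from a source of strength C at depth d (flat half-space).
def uz_model(r, C, d):
    return C * d / (r**2 + d**2) ** 1.5

rng = np.random.default_rng(0)
r = np.linspace(100.0, 5000.0, 40)            # station distances (m)
true_C, true_d = 1.0e6, 1500.0                # synthetic "true" parameters
sigma = 1.0e-3                                # assumed data error (m)
data = uz_model(r, true_C, true_d) + rng.normal(0.0, sigma, r.size)

def chi2_nu(p):
    """Chi-square per degree of freedom, the cost function to minimize."""
    resid = (data - uz_model(r, p[0], p[1])) / sigma
    return float(np.sum(resid**2)) / (r.size - 2)

# Stage 1: random search over broad bounds to find a promising start.
trials = rng.uniform([1.0e5, 500.0], [1.0e7, 5000.0], size=(2000, 2))
start = min(trials, key=chi2_nu)

# Stage 2: local refinement (Nelder-Mead here; dMODELS uses interior-point).
fit = minimize(chi2_nu, start, method="Nelder-Mead")
C_hat, d_hat = fit.x
```

    With clean synthetic data the recovered depth d_hat should land close to the 1500 m used to generate it, with a reduced chi-square near one.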

  2. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    Full Text Available In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested on a given configuration of the aircraft and airbrake, and the results were compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce the unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  3. ATLAS software packaging

    CERN Document Server

    Rybkin, G

    2012-01-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, the PackDist package, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages - platform dependent (one per platform available), source code excluding header files, other platform independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis pro...

  4. scoringRules - A software package for probabilistic model evaluation

    Science.gov (United States)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they make it possible to compare alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that are available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. Thereby, the scoringRules package provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
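
    As a concrete illustration of the two distribution classes mentioned above, here is a minimal Python sketch (not the R package's API; function names are ours) of the closed-form CRPS for a normal forecast and its sample-based counterpart for forecasts given only as simulation draws:

```python
import math
import numpy as np

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a normal forecast N(mu, sigma^2) against outcome y."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

def crps_sample(draws, y):
    """CRPS estimated from simulation draws (e.g. an MCMC sample):
    E|X - y| - 0.5 * E|X - X'|."""
    x = np.asarray(draws, dtype=float)
    term1 = np.abs(x - y).mean()
    term2 = np.abs(x[:, None] - x[None, :]).mean()   # all pairwise distances
    return term1 - 0.5 * term2

# The two estimates agree when the draws come from the parametric forecast:
draws = np.random.default_rng(1).normal(0.0, 1.0, 2000)
analytic = crps_normal(0.0, 1.0, 0.3)
empirical = crps_sample(draws, 0.3)
```

    Lower CRPS is better, and outcomes far in the tail of the forecast are penalized accordingly.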

  5. ATLAS software packaging

    Science.gov (United States)

    Rybkin, Grigory

    2012-12-01

    Software packaging is indispensable part of build and prerequisite for deployment processes. Full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present tools, package PackDist, developed and used to package all this software except for TDAQ project. PackDist is based on and driven by CMT, ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is CMT project. Each CMT project is packaged as several packages—platform dependent (one per platform available), source code excluding header files, other platform independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources from the detector and trigger computing farms, collaboration laboratories computing centres, grid sites, to physicist laptops, and CERN VMFS and covers the use cases of running all applications as well as of software development.

  6. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    Science.gov (United States)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.

  7. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions based on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object-oriented, modular design.
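
    The Kaczmarz row-action iteration that the block-iterative scheme builds on is easy to state. The sketch below (plain NumPy with names of our choosing, not Ettention's actual API) solves a small consistent system Ax = b by cyclically projecting the current estimate onto the hyperplane defined by each row:

```python
import numpy as np

def kaczmarz(A, b, sweeps=50, relax=1.0):
    """Cyclic Kaczmarz iteration for A x = b: for each row a_i, project the
    estimate onto {x : a_i . x = b_i}, scaled by a relaxation factor."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += relax * (b[i] - a @ x) / (a @ a) * a
    return x

# A consistent 3x2 toy system standing in for the projection operator.
A = np.array([[2.0, 1.0], [1.0, 3.0], [1.0, -1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true
x = kaczmarz(A, b)
```

    For consistent systems the iteration converges to a solution; tomographic codes apply the same update with rows grouped into blocks for parallel hardware.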

  8. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. DBSolve Optimum: a software package for kinetic modeling which allows dynamic visualization of simulation results

    Directory of Open Access Journals (Sweden)

    Gizzatkulov Nail M

    2010-08-01

    Full Text Available Abstract Background Systems biology research and applications require the creation, validation and extensive use of mathematical models, and the visualization of simulation results by end-users. Our goal is to develop a novel method for the visualization of simulation results and to implement it in a simulation software package equipped with sophisticated mathematical and computational techniques for model development, verification and parameter fitting. Results We present the mathematical simulation workbench DBSolve Optimum, a significantly improved and extended successor of the well-known simulation software DBSolve5. The concept of "dynamic visualization" of simulation results has been developed and implemented in DBSolve Optimum. In the framework of this concept, graphical objects representing metabolite concentrations and reactions change their volume and shape in accordance with the simulation results. This technique is applied to visualize both the kinetic response of the model and the dependence of its steady state on a parameter. The use of dynamic visualization is illustrated with a kinetic model of the Krebs cycle. Conclusion DBSolve Optimum is a user-friendly simulation software package that simplifies the construction, verification, analysis and visualization of kinetic models. The dynamic visualization tool implemented in the software allows the user to animate simulation results and, thereby, present them in a more comprehensible way. DBSolve Optimum and the built-in dynamic visualization module are free for both academic and commercial use. They can be downloaded directly from http://www.insysbio.ru.
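
    At its core, the kind of kinetic model such a workbench simulates reduces to integrating mass-action ODEs for the metabolite concentrations. A minimal stand-alone sketch (SciPy, a hypothetical two-step pathway; this has nothing to do with DBSolve's own model format):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy mass-action pathway A -> B -> C with rate constants k1, k2.
k1, k2 = 1.0, 0.5

def rhs(t, y):
    A, B, C = y
    return [-k1 * A,            # A consumed
            k1 * A - k2 * B,    # B produced from A, consumed to C
            k2 * B]             # C accumulates

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0], rtol=1e-8, atol=1e-10)
A_end, B_end, C_end = sol.y[:, -1]
```

    A dynamic-visualization layer would simply redraw each species' graphical object at every output time step, scaled by its current concentration.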

  10. A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.

    Science.gov (United States)

    Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.

    The SCIATRAN 3.0 package is the result of further development of the SCIATRAN 2.x software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. A major improvement over previous software versions was achieved by adding a vector mode to the radiative transfer model. Thus, the well-established Discrete Ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. Similar to the scalar version, the simulations can be performed for any viewing geometry typical of atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new vector radiative transfer model, will be given, including remarks on its availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high-quality results.

  11. Software package requirements and procurement

    OpenAIRE

    1996-01-01

    This paper outlines the problems of specifying requirements and deploying these requirements in the procurement of software packages. Despite the fact that software construction de novo is the exception rather than the rule, little or no support for the task of formulating requirements to support assessment and selection among existing software packages has been developed. We analyse the problems arising in this process and review related work. We outline the key components of a programme of ...

  12. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Science.gov (United States)

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available to scientists and students who are undertaking their own research, typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  13. Implementing a Simulation Study Using Multiple Software Packages for Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Sunbok Lee

    2015-07-01

    Full Text Available A Monte Carlo simulation study is an essential tool for evaluating the behavior of various quantitative methods, including structural equation modeling (SEM), under various conditions. Typically, a large number of replications are recommended for a Monte Carlo simulation study, and therefore automating the study is important for obtaining the desired number of replications. This article is intended to provide concrete examples of automating a Monte Carlo simulation study using some standard software packages for SEM: Mplus, LISREL, SAS PROC CALIS, and the R package lavaan. Also, the equivalence between multilevel SEM and hierarchical linear modeling (HLM) is discussed, and relevant examples are provided. It is hoped that the code in this article can provide some building blocks for researchers to write their own code to automate simulation procedures.
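
    The generate-fit-collect replication loop that such automation scripts wrap around looks the same in any language. A stripped-down Python sketch, with an OLS slope standing in for the Mplus/LISREL/lavaan model fit so the loop stays self-contained:

```python
import numpy as np

# Monte Carlo replication pattern: generate data from a known model,
# fit the model, collect the estimate; repeat and summarize.
rng = np.random.default_rng(42)
true_beta, n_obs, n_reps = 0.5, 200, 500
estimates = np.empty(n_reps)

for rep in range(n_reps):
    x = rng.normal(size=n_obs)                 # generate data
    y = true_beta * x + rng.normal(size=n_obs)
    beta_hat = (x @ y) / (x @ x)               # "fit": OLS slope estimate
    estimates[rep] = beta_hat                  # collect

bias = estimates.mean() - true_beta            # summarize across replications
```

    In a real SEM study, the "fit" line would instead write a data file, invoke the external package (or call lavaan via a subprocess/R bridge), and parse the parameter estimates back out.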

  14. An Ada Linear-Algebra Software Package Modeled After HAL/S

    Science.gov (United States)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
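
    For reference, the Hamilton product at the core of such quaternion routines is shown below as a plain Python sketch (the package itself provides this through Ada generics; the function name here is ours):

```python
def quat_mul(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)
```

    The product is associative but not commutative, e.g. i*j = k while j*i = -k, which is why attitude-update code must keep operand order straight.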

  16. TASC Graphics Software Package.

    Science.gov (United States)

    1982-12-01

    [Garbled OCR of the report cover; recoverable details: TASC Graphics Software Package (TGSP, (U)), The Analytic Sciences Corp., Reading, MA; M. R. Tang; Dec 82; report TR-1946-6; AFGL technical report, unclassified.] Extensions were made to allow TGSP to use color graphics. 2.1 INTERACTIVE TGSP: NCAR was designed to be a general plot package for use with many different plotting devices. It is designed to accept high-level commands, generate an intermediate set of commands called metacode, and to then use device

  17. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Science.gov (United States)

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
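
    The energetic backbone of such models is simple: each native contact pair gets a pairwise well whose minimum sits at the native distance. A sketch of the commonly used 10-12 contact potential (illustrative Python; SMOG 2 itself emits GROMACS/NAMD input files rather than evaluating energies):

```python
def sbm_contact(r, sigma, eps=1.0):
    """10-12 native-contact well used in many structure-based models:
    minimum of depth -eps at the native pair distance r = sigma."""
    s = sigma / r
    return eps * (5.0 * s**12 - 6.0 * s**10)
```

    Because sigma is read off the known structure pair by pair, the global energy minimum of the model coincides with the native conformation, which is the defining property of a structure-based Hamiltonian.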

  18. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Directory of Open Access Journals (Sweden)

    Jeffrey K Noel

    2016-03-01

    Full Text Available Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.

  19. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    Science.gov (United States)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil producing companies are concerned with increasing the resolution of seismic data for complex oil-and-gas bearing deposits connected with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature these can be salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large matrices (hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of the TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks, such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, and in some inverse tasks, such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.

  20. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker III, Charles L.; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-02-23

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model, as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project no standard model existed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  1. Development of the CCP-200 mathematical model for Syzran CHPP using the Thermolib software package

    Science.gov (United States)

    Usov, S. V.; Kudinov, A. A.

    2016-04-01

    A simplified cycle diagram of the CCP-200 power generating unit of Syzran CHPP is presented, containing two PG6111FA gas turbines with generators, two KUP-110/15-8.0/0.7-540/200 heat-recovery boilers, and one Siemens SST-600 steam turbine (one-cylinder, with two variable heat extraction units of 60/75 MW in heat-extraction and condensing modes, respectively) with S-GEN5-100 generators. Results of experimental guarantee tests of the CCP-200 steam-gas unit are given, along with a brief description of the Thermolib toolbox for the MATLAB Simulink environment and the basic equations Thermolib uses to model thermo-technical processes. Mathematical models of the gas-turbine plant, heat-recovery steam generator, steam turbine and the integrated plant for the CCP-200 power generating unit of Syzran CHPP were developed with the help of MATLAB Simulink and Thermolib. Simulations at different ambient temperatures were used to obtain the characteristics of the developed mathematical model. A graphical comparison of selected characteristics of the CCP-200 simulation model (gas temperature behind the gas turbine, gas turbine and combined cycle plant capacity, high- and low-pressure steam consumption, and feed water consumption for the high- and low-pressure economizers) with the actual characteristics of the steam-gas unit obtained in experimental (field) guarantee tests at different ambient temperatures is shown. The characteristics of the CCP-200 simulation model developed with Thermolib, at the chosen degree of complexity, adequately correspond to the actual characteristics of the steam-gas unit obtained in the guarantee tests; this allows the developed mathematical model to be considered adequate and acceptable for further work.

  2. UCVM: An Open Source Software Package for Querying and Visualizing 3D Velocity Models

    Science.gov (United States)

    Gill, D.; Small, P.; Maechling, P. J.; Jordan, T. H.; Shaw, J. H.; Plesch, A.; Chen, P.; Lee, E. J.; Taborda, R.; Olsen, K. B.; Callaghan, S.

    2015-12-01

    Three-dimensional (3D) seismic velocity models provide foundational data for ground motion simulations that calculate the propagation of earthquake waves through the Earth. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) package for both Linux and OS X. This unique framework provides a cohesive way of querying and visualizing 3D models. UCVM v14.3.0 supports many Southern California velocity models, including CVM-S4, CVM-H 11.9.1, and CVM-S4.26. The last model was derived from 26 full-3D tomographic iterations on CVM-S4. Recently, UCVM has been used to deliver a prototype of a new 3D model of central California (CCA), also based on full-3D tomographic inversions. UCVM was used to provide initial plots of this model and will be used to deliver CCA to users when the model is publicly released. Visualizing models is also possible with UCVM. Integrated within the platform are plotting utilities that can generate 2D cross-sections, horizontal slices, and basin depth maps. UCVM can also export models in NetCDF format for easy import into IDV and ParaView. UCVM has also been prototyped to export models that are compatible with IRIS' new Earth Model Collaboration (EMC) visualization utility. This capability allows user-specified horizontal slices and cross-sections to be plotted in the same 3D Earth space. UCVM was designed to help a wide variety of researchers. It is currently being used to generate velocity meshes for many SCEC wave propagation codes, including AWP-ODC-SGT and Hercules. It is also used to provide the initial input to SCEC's CyberShake platform. For those interested in specific data points, the software framework makes it easy to extract P and S wave propagation speeds and other material properties from 3D velocity models by providing a common interface through which researchers can query earth models for a given location and depth. Also included in the last release was the ability to add small
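The querying workflow this record describes (extract P- and S-wave speeds and other material properties for a given location and depth) can be illustrated with a toy layered model. The function, model values, and interface below are hypothetical, only mimicking the kind of common query interface UCVM provides; the real package is a compiled framework with its own API.

```python
# Hypothetical sketch of a velocity-model query interface; values are
# illustrative and NOT taken from CVM-S4 or any real model.

def query_layered_model(layers, depth_m):
    """Return (vp, vs, density) for the first layer whose bottom depth
    is at or below depth_m. `layers` is a list of tuples
    (bottom_depth_m, vp_m_per_s, vs_m_per_s, density_kg_per_m3)."""
    for bottom, vp, vs, rho in layers:
        if depth_m <= bottom:
            return vp, vs, rho
    raise ValueError("depth below deepest layer in model")

# Toy two-layer crustal model.
toy_model = [
    (5000.0, 5500.0, 3200.0, 2600.0),   # upper crust, down to 5 km
    (35000.0, 6800.0, 3900.0, 2900.0),  # lower crust, down to 35 km
]

# Query material properties at 10 km depth.
vp, vs, rho = query_layered_model(toy_model, 10000.0)
```

A real query would additionally take latitude and longitude and interpolate within a 3D mesh; the depth-only lookup here just shows the shape of the interface.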

  3. Display system software for the integration of an ADAGE 3000 programmable display generator into the solid modeling package C.A.D. software

    Science.gov (United States)

    Montoya, R. J.; Lane, H. H., Jr.

    1986-01-01

    A software system that integrates an ADAGE 3000 Programmable Display Generator into a C.A.D. software package known as the Solid Modeling Program is described. The Solid Modeling Program (SMP) is an interactive program used to model complex solid objects through the composition of primitive geometric entities. In addition, SMP provides extensive facilities for model editing and display. The ADAGE 3000 Programmable Display Generator (PDG) is a color, raster scan, programmable display generator with a 32-bit bit-slice, bipolar microprocessor (BPS). The modularity of the system architecture and the width and speed of the system bus allow for additional co-processors in the system. These co-processors combine to provide efficient operations on, and rendering of, graphics entities. The resulting software system takes advantage of the graphics capabilities of the PDG in the operation of SMP by distributing its processing modules between the host and the PDG. Initially, the target host computer was a PRIME 850, which was later replaced by a VAX-11/785. Two versions of the software system were developed, a Phase I and a Phase II. In Phase I, the ADAGE 3000 is used as a frame buffer. In Phase II, SMP was functionally partitioned and some of its functions were implemented in the ADAGE 3000 by means of ADAGE's SOLID 3000 software package.

  4. The last developments of the airGR R-package, an open source software for rainfall-runoff modelling

    Science.gov (United States)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Perrin, Charles; Andréassian, Vazken

    2017-04-01

    … and usability of this tool. References: Coron, L., Thirel, G., Perrin, C., Delaigue, O. and Andréassian, V. (2017). airGR: a suite of lumped hydrological models in an R package. Environmental Modelling and Software, submitted. Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en. R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.

  5. An Overview on Wavelet Software Packages

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Wavelet analysis provides very powerful problem-solving tools for analyzing, encoding, compressing, reconstructing, and modeling signals and images. The amount of wavelet-related software has been constantly multiplying, and many wavelet analysis tools are widely available. This overview presents a substantial survey of currently available packages. It will be of great benefit to engineers and researchers using the toolkits or developing new software, and beginners learning wavelets can also get considerable help from the review. Browsing some of the Internet sites listed in the references of this paper may turn up still more wavelet resources.
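As a minimal example of the kind of analysis these packages implement, here is a single level of the Haar discrete wavelet transform, the simplest orthogonal wavelet; this is a textbook sketch, not code from any of the surveyed tools.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: returns the
    (approximation, detail) coefficient lists for an even-length signal.
    Sums and differences of adjacent samples are scaled by 1/sqrt(2)
    so the transform is orthonormal."""
    s = 1.0 / math.sqrt(2.0)
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) * s for a, b in pairs]   # low-pass: local averages
    detail = [(a - b) * s for a, b in pairs]   # high-pass: local differences
    return approx, detail

a, d = haar_dwt([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
```

Applying the same step recursively to the approximation coefficients yields the full multi-level decomposition used in compression and denoising.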

  6. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and test of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  8. Software Package "Nesvetay-3D" for modeling three-dimensional flows of monatomic rarefied gas

    Directory of Open Access Journals (Sweden)

    V. A. Titarev

    2014-01-01

    Analysis of three-dimensional rarefied gas flows in microdevices (micropipes, micropumps, etc.) and over re-entry vehicles requires the development of methods of computational modelling. One such method is the direct numerical solution of the Boltzmann kinetic equation for the velocity distribution function with either an exact or an approximate (model) collision integral. At present, for flows of monatomic rarefied gas, the Shakhov model kinetic equation, also called the S-model, has gained widespread use. The equation can be regarded as a model equation of the incomplete third-order approximation. Despite its relative simplicity, the S-model is still a complicated integro-differential equation of high dimension, and its numerical solution requires high-accuracy parallel methods. The present work is a review of recent results concerning the development and application of the three-dimensional computer package Nesvetay-3D, intended for modelling rarefied gas flows. The package solves the Boltzmann kinetic equation with the BGK (Krook) and Shakhov model collision integrals using the discrete velocity approach. Calculations are carried out in non-dimensional variables. A finite integration domain and a mesh are introduced in the molecular velocity space, and the kinetic equation is re-written as a system of kinetic equations, one for each discrete velocity. The system is solved using an implicit finite-volume method of Godunov type, and the steady-state solution is computed by a time-marching method. High order of spatial accuracy is achieved by a piecewise-linear representation of the distribution function in each spatial cell; in general, the coefficients of this approximation are found using the least-squares method. Arbitrary unstructured meshes in the physical space can be used in calculations, which allows considering flows over objects of general geometrical shape. Conservative property of the method with respect to the model collision
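The model-equation approach described above can be sketched in miniature: one explicit time step of the spatially homogeneous BGK (Krook) relaxation on a discrete velocity grid, with the local Maxwellian rebuilt from the conserved moments at each step. This is a 1D illustrative toy under simplifying assumptions (unit gas constant, uniform grid), not the Nesvetay-3D implementation.

```python
import math

def bgk_step(f, v, dt, tau):
    """One explicit step of the homogeneous BGK relaxation
    df/dt = (f_eq - f)/tau on a uniform discrete velocity grid v.
    The local Maxwellian f_eq shares the density, bulk velocity and
    temperature of f, so the step conserves mass, momentum and energy
    up to quadrature error."""
    dv = v[1] - v[0]
    # Conserved moments (1D, unit gas constant).
    rho = sum(f) * dv
    u = sum(fi * vi for fi, vi in zip(f, v)) * dv / rho
    T = sum(fi * (vi - u) ** 2 for fi, vi in zip(f, v)) * dv / rho
    # Local Maxwellian with the same moments.
    feq = [rho / math.sqrt(2.0 * math.pi * T)
           * math.exp(-(vi - u) ** 2 / (2.0 * T)) for vi in v]
    return [fi + dt / tau * (fe - fi) for fi, fe in zip(f, feq)]

# Toy bimodal initial distribution relaxing toward equilibrium.
v = [-10.0 + 0.05 * i for i in range(401)]
f0 = [math.exp(-(vi - 2.0) ** 2 / 2.0)
      + math.exp(-(vi + 2.0) ** 2 / 2.0) for vi in v]
f1 = bgk_step(f0, v, dt=0.1, tau=1.0)
```

The package described in the record adds spatial transport, implicit Godunov-type finite-volume discretization, and the Shakhov correction to the Maxwellian, none of which appear in this sketch.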

  9. Software packages for food engineering needs

    OpenAIRE

    Abakarov, Alik

    2011-01-01

    The graphic user interface (GUI) software packages "ANNEKs" and "OPT-PROx" are developed to meet food engineering needs. "OPT-PROx" (OPTimal PROfile) is software developed to carry out thermal food processing optimization based on variable retort temperature processing and a global optimization technique. "ANNEKs" (Artificial Neural Network Enzyme Kinetics) is software designed for determining the kinetics of enzyme hydrolysis of protein at different initial reaction parameters based on the...

  10. FITSH -- a software package for image processing

    CERN Document Server

    Pál, András

    2011-01-01

    In this paper we describe the main features of the software package named FITSH, intended to provide a standalone environment for the analysis of data acquired by imaging astronomical detectors. The package provides utilities both for the full pipeline of subsequent related data processing steps (including image calibration, astrometry, source identification, photometry, differential analysis, low-level arithmetic operations, multiple image combinations, spatial transformations and interpolations, etc.) and for aiding the interpretation of the (mainly photometric and/or astrometric) results. The package also features a consistent implementation of photometry based on image subtraction, point spread function fitting and aperture photometry, and provides easy-to-use interfaces for comparisons and for picking the most suitable method for a particular problem. This set of utilities is built on top of the commonly used UNIX/POSIX shells (hence the name of the package), therefore both frequently us...

  11. Software Package for Bio-Signal Analysis

    Science.gov (United States)

    2002-10-15

    We have developed a Matlab-based software package for bio-signal analysis. The software is based on a modular design and can thus be easily adapted to fit analysis of various kinds of time-variant or event-related bio-signals. Currently, analysis programs for event-related potentials (ERP), heart rate variability (HRV), galvanic skin responses (GSR) and quantitative EEG (qEEG) are implemented. A tool for time-varying spectral analysis of bio
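A representative time-domain HRV quantity such a package would compute is RMSSD, the root mean square of successive differences between R-R intervals. The sketch below is a generic illustration of that standard measure, not code from the package itself.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD), a standard
    time-domain heart-rate-variability measure, computed over a list
    of consecutive R-R intervals in milliseconds."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two R-R intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

For example, intervals of 800, 810 and 790 ms give successive differences of +10 and -20 ms, so RMSSD is sqrt((100 + 400) / 2) ≈ 15.8 ms.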

  12. The ASTROID Simulator Software Package: Realistic Modelling of High-Precision High-Cadence Space-Based Imaging

    CERN Document Server

    Marcos-Arenal, P; De Ridder, J; Huygen, R; Aerts, C

    2014-01-01

    The preparation of a space mission that carries out any kind of imaging to detect high-precision low-amplitude variability of its targets requires a robust model for the expected performance of its instruments. This model cannot be derived from a simple addition of noise properties due to the complex interaction between the various noise sources. Since it is not feasible to build and test a prototype of the imaging device on-ground, realistic numerical simulations in the form of an end-to-end simulator can be used to model the noise propagation in the observations. These simulations allow studying not only the performance of the instrument, its noise source response and its data quality, but also the instrument design verification for different types of configurations, the observing strategy and the scientific feasibility of an observing proposal. In this way, a complete description and assessment of the objectives to expect from the mission can be derived. We present a high-precision simulation software packag...

  13. Software Package STATISTICA and Educational Process

    Directory of Open Access Journals (Sweden)

    Demidova Liliya

    2016-01-01

    The paper describes the main aspects of applying the software package STATISTICA in the educational process. Data mining technologies that can be useful for student research are considered, and the main tools of these technologies are discussed.

  14. RPC Stereo Processor (RSP) - a Software Package for Digital Surface Model and Orthophoto Generation from Satellite Stereo Imagery

    Science.gov (United States)

    Qin, R.

    2016-06-01

    Large-scale Digital Surface Models (DSMs) are very useful for many geoscience and urban applications. Recently developed dense image matching methods have popularized the use of image-based very high resolution DSMs. Many commercial/public tools that implement matching methods are available for perspective images, but handy tools for satellite stereo images are rare. In this paper, a software package, the RPC (rational polynomial coefficient) Stereo Processor (RSP), is introduced for this purpose. RSP implements a full pipeline of DSM and orthophoto generation based on RPC-modelled satellite imagery (level 1+), including level 2 rectification, geo-referencing, point cloud generation, pan-sharpening, DSM resampling and ortho-rectification. A modified hierarchical semi-global matching method is used as the current matching strategy. Due to its high memory efficiency and optimized implementation, RSP can be used on a normal PC to produce large-format DSMs and orthophotos. This tool was developed for internal use and may be acquired by researchers for academic and non-commercial purposes to promote 3D remote sensing applications.
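The RPC camera model at the core of such a processor maps ground coordinates to image coordinates as a ratio of polynomials in normalized latitude, longitude and height. The sketch below evaluates such a ratio over a truncated monomial basis; real RPC00B models use 20-term cubic polynomials, and the coefficient values in the demo are purely illustrative.

```python
def rpc_project(lat, lon, h, num_coeff, den_coeff, offsets, scales):
    """Evaluate a (truncated) rational polynomial camera model: one
    normalized image coordinate = P_num(P, L, H) / P_den(P, L, H),
    where P, L, H are offset-and-scale normalized ground coordinates.
    Real RPC00B models use a 20-term cubic basis; this sketch pairs the
    coefficient lists with just the first seven monomials."""
    lat_off, lon_off, h_off = offsets
    lat_s, lon_s, h_s = scales
    P = (lat - lat_off) / lat_s
    L = (lon - lon_off) / lon_s
    H = (h - h_off) / h_s
    basis = [1.0, L, P, H, L * P, L * H, P * H]  # truncated monomial basis
    num = sum(c * b for c, b in zip(num_coeff, basis))
    den = sum(c * b for c, b in zip(den_coeff, basis))
    return num / den

# Demo with trivial coefficients: numerator picks out L, denominator is 1,
# so the projected coordinate equals the normalized longitude.
row_demo = rpc_project(0.5, 2.0, 100.0,
                       num_coeff=[0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0],
                       den_coeff=[1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
                       offsets=(0.0, 0.0, 0.0), scales=(1.0, 1.0, 1.0))
```

In a full pipeline this forward projection is evaluated twice per image (once for the row polynomials, once for the column polynomials) and inverted iteratively during triangulation.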

  15. Tracked Vehicle Dynamics Modeling and Simulation Methodology, with Control, using RecurDyn Software Package

    Science.gov (United States)

    2011-09-01

    track segment, with pins connecting each track segment. The modeler must align each segment properly with the track pins with the sprocket teeth and...representative track segment is copied and linked together using a simplified algorithm which assumes each track segment is identical, with force/torque pairs...simulation, RecurDyn feeds CoLink the desired inputs (error term, speed, direction, etc.), CoLink performs the programmed operation (generates torque

  16. Package Coupling Measurement in Object-Oriented Software

    Institute of Scientific and Technical Information of China (English)

    Varun Gupta; Jitender Kumar Chhabra

    2009-01-01

    The grouping of correlated classes into a package helps in the better organization of modern object-oriented software. The quality of such packages needs to be measured so as to estimate their utility. In this paper, new package coupling metrics are proposed which also take into consideration the hierarchical structure of packages and the direction of connections among package elements. The proposed measures have been validated theoretically as well as empirically using 18 packages taken from two open-source software systems. The results obtained from this study show a strong correlation between package coupling and understandability of the package, which suggests that the proposed metrics could further be used to represent other external software quality factors.
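The distinction the record draws between connections inside a package and connections across packages can be sketched as a simple count over class-level dependencies. The proposed metrics additionally weight package hierarchy and connection direction, which this toy deliberately omits.

```python
def coupling_counts(deps, package_of):
    """Given class-level dependencies as (source, target) pairs and a
    class-to-package map, count intra-package connections (related to
    cohesion) and inter-package connections (coupling)."""
    intra = inter = 0
    for src, dst in deps:
        if package_of[src] == package_of[dst]:
            intra += 1
        else:
            inter += 1
    return intra, inter

# Toy system: classes A and B live in package p1, class C in p2.
pkg = {"A": "p1", "B": "p1", "C": "p2"}
intra, inter = coupling_counts([("A", "B"), ("A", "C"), ("B", "C")], pkg)
```

A ratio such as intra / (intra + inter) then gives a crude balance indicator of the kind the paper's metrics refine.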

  17. An Integrated Software Package to Enable Predictive Simulation Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang; Palmer, Bruce J.; Sharma, Poorva; Huang, Zhenyu

    2016-08-11

    The power grid is increasing in complexity due to the deployment of smart grid technologies, which vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that combines HPC applications and a web-based visualization tool on top of a middleware framework that supports data communication between the different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in live mode. Test results validate the effectiveness and usability of the integrated software package.

  18. ASPT software source code: ASPT signal excision software package

    Science.gov (United States)

    Parliament, Hugh

    1992-08-01

    The source code for the ASPT Signal Excision Software Package which is part of the Adaptive Signal Processing Testbed (ASPT) is presented. The source code covers the programs 'excision', 'ab.out', 'd0.out', 'bd1.out', 'develop', 'despread', 'sorting', and 'convert'. These programs are concerned with collecting data, filtering out interference from a spread spectrum signal, analyzing the results, and developing and testing new filtering algorithms.

  19. Information technologies and software packages for education of specialists in materials science [In Russian]

    NARCIS (Netherlands)

    V. Krzhizhanovskaya; S. Ryaboshuk

    2009-01-01

    This paper presents methodological materials, interactive textbooks and software packages developed and extensively used for the education of specialists in materials science. These virtual laboratories for education and research are equipped with tutorials and a software environment for modeling complex

  20. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Energy Technology Data Exchange (ETDEWEB)

    Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Yarba, Julia [Fermilab; Kelsey, Michael [SLAC; Wright, Dennis H. [SLAC

    2016-11-10

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  2. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    Science.gov (United States)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  3. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.

    2011-01-01

    Enhancing, modifying or adapting software to new requirements increases its internal complexity, and software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort; however, refactoring becomes a quite challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach provides computer-aided support for identifying ill-structured packages and offers suggestions to help the software designer balance intra-package cohesion against inter-package coupling. A comparative study is conducted applying three different clustering techniques to different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared the A-KNN technique with the other clustering techniques (viz. single linkage, complete linkage and the weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.

  4. International Inventory of Software Packages in the Information Field.

    Science.gov (United States)

    Keren, Carl, Ed.; Sered, Irina, Ed.

    Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…

  5. A user's guide to the GoldSim/BLT-MS integrated software package:a low-level radioactive waste disposal performance assessment model.

    Energy Technology Data Exchange (ETDEWEB)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-03-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy national laboratory, has over 30 years' experience in the assessment of radioactive waste disposal and, at the time of this publication, is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. To surmount these difficulties, Sandia has developed a system for probabilistic safety assessment modeling that combines commercially available software codes with existing legacy codes, facilitating technology transfer and making the most of limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC), and codes developed and maintained by the United States Department of Energy, are generally available to foreign countries after import/export control and copyright requirements are addressed. From a programmatic view, it is easier to utilize existing codes than to develop new ones. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software that meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference user's guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low

  6. Software interface and data acquisition package for the LakeShore cryogenics vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    O'Dell, B.H.

    1995-11-01

    A software package was developed to replace the software provided by LakeShore for their model 7300 vibrating sample magnetometer (VSM). Several problems with the original software's functionality caused this group to seek a new software package. The new software provides many features that were unsupported in the LakeShore software, including a more functional step mode, a point-averaging mode, vector moment measurements, and calibration for field offset. The developed software interfaces with the VSM through a menu-driven graphical user interface and bypasses the VSM's on-board processor, leaving control of the VSM up to the software. The source code for this software is readily available to anyone. By having the source, experimentalists have full control of data acquisition and can add routines specific to their experiment.

  7. Adaptive Asynchronous Simulated and Metrical Software Package in Aircraft

    Directory of Open Access Journals (Sweden)

    Mr. Vladimir A. Tikhomirov

    2016-09-01

    The article is devoted to the peculiarities and problems of developing applied measuring software packages for aircraft avionics testing and measurement at the stage of mass production.

  8. The Guelph PIXE software package IV

    Science.gov (United States)

    Campbell, J. L.; Boyd, N. I.; Grassi, N.; Bonnick, P.; Maxwell, J. A.

    2010-10-01

    Following the introduction of GUPIXWIN in 2005, a number of upgrades have been made in the interests of extending the applicability of the program. Extension of the proton upper energy limit to 5 MeV facilitates the simultaneous use of PIXE with other ion beam analysis techniques. Also, the increased penetration depth enables the complete PIXE analysis of paintings. A second database change is effected in which recently recommended values of L-subshell fluorescence and Coster-Kronig yields are adopted. A Monte Carlo code has been incorporated in the GUPIX package to provide detector efficiency values that are more accurate than those of the previous approximate analytical formula. Silicon escape peak modeling is extended to the back face of silicon drift detectors. An improved description of the attenuation in dura-coated beryllium detector windows is devised. Film thickness determination is enhanced. A new batch mode facility is designed to handle two-detector PIXE, with one detector measuring major elements and the other simultaneously measuring trace elements.

  9. SEDA: A software package for the Statistical Earthquake Data Analysis.

    Science.gov (United States)

    Lombardi, A M

    2017-03-14

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, a growing movement among scientists that is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and forecast calculation. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
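The ETAS model at the heart of SEDA combines a background rate with Omori-law aftershock contributions from past events. A minimal sketch of its conditional intensity, using the standard parameterization rather than SEDA's actual routines, is:

```python
import math

def etas_intensity(t, events, mu, K, alpha, c, p, m0):
    """ETAS conditional intensity at time t: a constant background rate
    mu plus a modified-Omori-law contribution from every earlier event
    (t_i, m_i), scaled exponentially by magnitude above the cutoff m0.
    Parameter names follow the standard ETAS formulation."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate
```

Fitting mu, K, alpha, c and p by maximum likelihood over a catalog, simulating synthetic catalogs by thinning this intensity, and testing the fitted model on data are exactly the ETAS tasks the record lists for SEDAv1.0.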

  12. Development of an engine system simulation software package - ESIM

    Energy Technology Data Exchange (ETDEWEB)

    Erlandsson, Olof

    2000-10-01

A software package, ESIM, is developed for simulating internal combustion engine systems, including models for the engine, manifolds, turbocharger, charge-air cooler (intercooler) and inlet air heater. This study focuses on the thermodynamic treatment and methods used in the models. It also includes some examples of system simulations made with these models for validation purposes. The engine model can be classified as a zero-dimensional, single-zone model. It includes calculation of the valve flow process, models for heat release and models for in-cylinder, exhaust port and manifold heat transfer. Models are developed for handling turbocharger performance and charge-air cooler characteristics. The main purpose of the project related to this work is to use the ESIM software to study the heat balance and performance of homogeneous charge compression ignition (HCCI) engine systems. A short description of the HCCI engine is therefore included, pointing out the difficulties, or challenges, of the HCCI engine from a system perspective. However, the relations given here, and the code itself, are quite general, making it possible to use these models to simulate spark-ignited as well as direct-injected engines.

  13. Library Automation Software Packages used in Academic Libraries of Nepal

    OpenAIRE

    Sharma (Baral), Sabitri

    2007-01-01

    This thesis presents a comparative assessment of the library automation software packages used in Nepalese academic libraries. It focuses on the evaluation of software on the basis of certain important checkpoints. It also highlights the importance of library automation, library activities and services.

  14. Quantification of myocardial perfusion defects using three different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Annmarie; Aakesson, Liz [Department of Clinical Physiology, Malmoe University Hospital, 205 02, Malmoe (Sweden); Edenbrandt, Lars [Department of Clinical Physiology, Malmoe University Hospital, 205 02, Malmoe (Sweden); Department of Clinical Physiology, Sahlgrenska University Hospital, Gothenburg (Sweden)

    2004-02-01

    Software packages are widely used for quantification of myocardial perfusion defects. The quantification is used to assist the physician in his/her interpretation of the study. The purpose of this study was to compare the quantification of reversible perfusion defects by three different commercially available software packages. We included 50 consecutive patients who underwent myocardial perfusion single-photon emission tomography (SPET) with a 2-day technetium-99m tetrofosmin protocol. Two experienced technologists processed the studies using the following three software packages: Cedars Quantitative Perfusion SPECT, Emory Cardiac Toolbox and 4D-MSPECT. The same sets of short axis slices were used as input to all three software packages. Myocardial uptake was scored in 20 segments for both the rest and the stress studies. The summed difference score (SDS) was calculated for each patient and the SDS values were classified into: normal (<4), mildly abnormal (4-8), moderately abnormal (9-13), and severely abnormal (>13). All three software packages were in agreement that 21 patients had a normal SDS, four patients had a mildly abnormal SDS and one patient had a severely abnormal SDS. In the remaining 24 patients (48%) there was disagreement between the software packages regarding SDS classification. A difference in classification of more than one step between the highest and lowest scores, for example from normal to moderately abnormal or from mildly to severely abnormal, was found in six of these 24 patients. Widely used software packages commonly differ in their quantification of myocardial perfusion defects. The interpreting physician should be aware of these differences when using scoring systems. (orig.)
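The 20-segment scoring scheme described above can be sketched in a few lines. The thresholds follow the abstract (SDS < 4 normal, 4-8 mildly abnormal, 9-13 moderately abnormal, > 13 severely abnormal), while the example uptake scores are invented.

```python
def summed_difference_score(stress, rest):
    """SDS: sum over all 20 segments of (stress score - rest score)."""
    assert len(stress) == len(rest) == 20
    return sum(s - r for s, r in zip(stress, rest))

def classify_sds(sds):
    # Thresholds from the study: <4 normal, 4-8 mild, 9-13 moderate, >13 severe
    if sds < 4:
        return "normal"
    if sds <= 8:
        return "mildly abnormal"
    if sds <= 13:
        return "moderately abnormal"
    return "severely abnormal"

stress = [2, 1, 1] + [0] * 17   # illustrative 20-segment uptake scores
rest = [0] * 20
print(classify_sds(summed_difference_score(stress, rest)))   # → mildly abnormal
```

Because the three packages produce different segment scores from the same slices, the resulting SDS can fall on different sides of these thresholds, which is the disagreement the study reports.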

  15. Floating point software package for use on LSI-11 computers at SLAC

    Energy Technology Data Exchange (ETDEWEB)

    Hendra, R.G.

    1981-06-01

    A floating point software package has been devised to allow full use of the floating point hardware of the LSI-11 and MODEL40 computers. The procedures are written for the most part in the PL-11 language. The package may be run under the RT-11 operating system, or in RAM or EPROM as part of the KERNEL package. The current set of procedures has been run successfully in all three modes.

  16. Recent developments in the ABINIT software package

    Science.gov (United States)

    Gonze, X.; Jollet, F.; Abreu Araujo, F.; Adams, D.; Amadon, B.; Applencourt, T.; Audouze, C.; Beuken, J.-M.; Bieder, J.; Bokhanchuk, A.; Bousquet, E.; Bruneval, F.; Caliste, D.; Côté, M.; Dahm, F.; Da Pieve, F.; Delaveau, M.; Di Gennaro, M.; Dorado, B.; Espejo, C.; Geneste, G.; Genovese, L.; Gerossier, A.; Giantomassi, M.; Gillet, Y.; Hamann, D. R.; He, L.; Jomard, G.; Laflamme Janssen, J.; Le Roux, S.; Levitt, A.; Lherbier, A.; Liu, F.; Lukačević, I.; Martin, A.; Martins, C.; Oliveira, M. J. T.; Poncé, S.; Pouillon, Y.; Rangel, T.; Rignanese, G.-M.; Romero, A. H.; Rousseau, B.; Rubel, O.; Shukri, A. A.; Stankovski, M.; Torrent, M.; Van Setten, M. J.; Van Troeye, B.; Verstraete, M. J.; Waroquiers, D.; Wiktor, J.; Xu, B.; Zhou, A.; Zwanziger, J. W.

    2016-08-01

ABINIT is a package whose main program allows one to find the total energy, charge density, electronic structure and many other properties of systems made of electrons and nuclei (molecules and periodic solids) within Density Functional Theory (DFT), Many-Body Perturbation Theory (the GW approximation and Bethe-Salpeter equation) and Dynamical Mean Field Theory (DMFT). ABINIT also allows one to optimize the geometry according to the DFT forces and stresses, to perform molecular dynamics simulations using these forces, and to generate dynamical matrices, Born effective charges and dielectric tensors. The present paper aims to describe the new capabilities of ABINIT that have been developed since 2009. It covers both physical and technical developments inside the ABINIT code, as well as developments provided within the ABINIT package. The developments are described with relevant references, input variables, tests and tutorials.

  17. GPS Software Packages Deliver Positioning Solutions

    Science.gov (United States)

    2010-01-01

To determine a spacecraft's position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology.

  18. Integrated software packages in the physical laboratory

    Science.gov (United States)

    Bok, J.; Barvík, I.; Praus, P.; Heřman, P.; Čermáková, D.

    1990-11-01

The automation of a UV-VIS spectrometer and a single-photon counting apparatus by an IBM-AT is described. Software needed for the computer control, data acquisition and processing was developed in the ASYST environment. This enabled us to use its very good graphics, its support of I/O cards, and its other excellent properties. We also show ways to overcome some minor shortcomings using multilanguage programming.

  19. Development of CAD software package of intellectualized casting technology

    Institute of Scientific and Technical Information of China (English)

    HOU Hua; CHENG Jun; XU Hong

    2005-01-01

Based on the numerical simulation of solidification, a computer-aided design (CAD) software package for casting technology was developed to design the rising system intelligently. The software can calculate the size and locate the positions of the isolated melts. According to the liquid shrinkage of each isolated melt and the standard parameters of risers in the database, the riser's position and size can be identified intelligently as long as the riser's shape is selected. 3-D software and simulation analysis with CAST/computer-aided engineering (CAE) software show that the design of the riser and the running system is feasible.

  20. Software Packages to Support Electrical Engineering Virtual Lab

    Directory of Open Access Journals (Sweden)

    Manuel Travassos Valdez

    2012-03-01

Full Text Available The use of Virtual Reality Systems (VRS), as a learning aid, encourages the creation of tools that allow users/students to simulate educational environments on a computer. This article presents a way of building a VRS with software packages to support Electrical Engineering Virtual Laboratories, to be used in the near future in the teaching of the curriculum unit of Circuit Theory. The steps required for the construction of such a project are presented in this paper. The simulation is still under construction and will use a three-dimensional virtual electrical-measurement laboratory environment, which will allow users/students to experiment with and test the modeled equipment. Therefore, there are still no links available for further examination. The result may demonstrate the future potential of Virtual Reality Systems as an efficient and cost-effective learning system.

  1. Comparison of four software packages applied to a scattering problem

    DEFF Research Database (Denmark)

    Albertsen, Niels Christian; Chesneaux, Jean-Marie; Christiansen, Søren

    1999-01-01

We investigate characteristic features of four different software packages by applying them to the numerical solution of a non-trivial physical problem in computer simulation, viz., scattering of waves from a sinusoidal boundary. The numerical method used is based on boundary collocation.

  2. Entretian Model for Software Maintenance

    Directory of Open Access Journals (Sweden)

    Priya K Betala

    2013-10-01

Full Text Available Maintenance refers to the act of modifying software after it is put into use in order to maintain its usability [1]. In other words, software maintenance can be defined as the process of providing services to customers after the delivery of the software. Despite the fact that maintaining software is very challenging, it is the most important routine that must be carried out in the development cycle. If the software is not maintained efficiently, it may lead to the death of the software. Maintenance of software may be carried out in two ways: the first is called 'in-house maintenance' and the second is called 'transition maintenance'. The latter faces drastic challenges compared to the former, as one team may not provide complete source code to the other, leading to unstructured code and a lack of appropriate technique and knowledge about the functioning of the current software. There are a few aspects of software maintenance that set it apart from the other phases. Software maintenance cost comprises more than half of the total software development cost. Also, without software maintenance, it is impossible to fix problems within the product after its release, and many disasters can happen because of immature software. Recognising the importance of software maintenance, this paper proposes a model called the "Entretian Model" (from entretien, a French word meaning maintenance), which consists of six basic steps to follow while maintaining a software system. This model overcomes certain misconceptions about the maintenance phase and is highly beneficial to the Maintenance Support Team (MST) in handling maintenance activities systematically and efficiently. By employing the proposed model, the MST is able to overcome the technical and managerial issues that were faced earlier in the maintenance phase. The advantage of using the Entretian Model is best illustrated in this paper with the help of the ERP package.

  3. Software package r{sup 3}t. Model for transport and retention in porous media. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Fein, E. (ed.)

    2004-07-01

In long-term safety analyses for final repositories for hazardous wastes in deep geological formations, the impact on the biosphere due to the potential release of hazardous materials is assessed for relevant scenarios. The model for migration of wastes from repositories to man is divided into three almost independent parts: the near field, the geosphere, and the biosphere. With the development of r{sup 3}t, the feasibility of modeling pollutant transport through the geosphere for porous or equivalent porous media in large, three-dimensional, and complex regions is established. Furthermore, one can at present consider all relevant retention and interaction effects which are important for long-term safety analyses. These are equilibrium sorption, kinetically controlled sorption, diffusion into immobile pore waters, and precipitation. The processes of complexation, colloidal transport and matrix diffusion may be considered at least approximately by a skilful choice of parameters. Speciation is not part of the very recently developed computer code r{sup 3}t. With r{sup 3}t it is possible to assess the potential dilution and the barrier impact of the overburden close to reality.
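Of the retention effects listed, linear equilibrium sorption is the simplest: it enters transport models as a retardation factor R that slows the solute front relative to the pore water. A minimal sketch follows; the parameter values are illustrative and this is not r{sup 3}t code.

```python
def retardation_factor(bulk_density, porosity, kd):
    """Linear equilibrium sorption: R = 1 + (rho_b / theta) * Kd,
    where rho_b is bulk density, theta is porosity, Kd the distribution
    coefficient. The solute front travels at v / R."""
    return 1.0 + (bulk_density / porosity) * kd

# Illustrative values: rho_b in kg/L, Kd in L/kg, porosity dimensionless
R = retardation_factor(bulk_density=1.6, porosity=0.4, kd=0.5)
pore_velocity = 10.0            # m/yr, advective velocity of the water
front_velocity = pore_velocity / R
print(R, front_velocity)        # R = 3.0, so the front moves at one third of v
```

Kinetically controlled sorption and diffusion into immobile pore water require extra rate equations rather than a single factor, which is part of what a code like r{sup 3}t adds.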

  4. GEMBASSY: an EMBOSS associated software package for comprehensive genome analyses

    OpenAIRE

    Itaya, Hidetoshi; Oshita, Kazuki; Arakawa, Kazuharu; Tomita, Masaru

    2013-01-01

The popular European Molecular Biology Open Software Suite (EMBOSS) currently contains over 400 tools used in various fields of bioinformatics research, equipped with sophisticated development frameworks for interoperability and tool discoverability as well as rich documentation and various user interfaces. In order to further strengthen EMBOSS in the field of genomics, we here present a novel EMBOSS associated software (EMBASSY) package named GEMBASSY, which adds more than 50 analysis tools from the G-language Genome Analysis Environment and its Representational State Transfer (REST) and SOAP web services.

  5. A Simple Interactive Software Package for Plotting, Animating, and Calculating

    Science.gov (United States)

    Engelhardt, Larry

    2012-01-01

    We introduce a new open source (free) software package that provides a simple, highly interactive interface for carrying out certain mathematical tasks that are commonly encountered in physics. These tasks include plotting and animating functions, solving systems of coupled algebraic equations, and basic calculus (differentiating and integrating…

  6. SPECTRW: A software package for nuclear and atomic spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kalfas, C.A., E-mail: kalfas@inp.demokritos.gr [National Centre for Scientific Research Demokritos, Institute of Nuclear & Particle Physics, 15310 Agia Paraskevi, Attiki (Greece); Axiotis, M. [National Centre for Scientific Research Demokritos, Institute of Nuclear & Particle Physics, 15310 Agia Paraskevi, Attiki (Greece); Tsabaris, C. [Hellenic Centre for Marine Research, Institute of Oceanography, 46.7 Km Athens-Sounio Ave, P.O. Box 712, Anavyssos 19013 (Greece)

    2016-09-11

    A software package to be used in nuclear and atomic spectroscopy is presented. Apart from analyzing γ and X-ray spectra, it offers many additional features such as de-convolution of multiple photopeaks, sample analysis and activity determination, detection system evaluation and an embedded code for spectra simulation.

  7. Integrated software package for laser diodes characterization

    Science.gov (United States)

    Sporea, Dan G.; Sporea, Radu A.

    2003-10-01

    The characteristics of laser diodes (wavelength of the emitted radiation, output optical power, embedded photodiode photocurrent, threshold current, serial resistance, external quantum efficiency) are strongly influenced by their driving circumstances (forward current, case temperature). In order to handle such a complex investigation in an efficient and objective manner, the operation of several instruments (a laser diode driver, a temperature controller, a wavelength meter, a power meter, and a laser beam analyzer) is synchronously controlled by a PC, through serial and GPIB communication. For each equipment, instruments drivers were designed using the industry standards graphical programming environment - LabVIEW from National Instruments. All the developed virtual instruments operate under the supervision of a managing virtual instrument, which sets the driving parameters for each unit under test. The manager virtual instrument scans as appropriate the driving current and case temperature values for the selected laser diode. The software enables data saving in Excel compatible files. In this way, sets of curves can be produced according to the testing cycle needs.

  8. Photogrammetry Software. A Package for Everyone,

    Science.gov (United States)

    1981-10-01

…orientation of the model, since the axes of rotation for omega and phi intersect the projection cardan. The perspective centers change only if the axes of rotation for omega and phi do not intersect the projection cardan. If the perspective center is to be determined after each model relative orientation…

  9. Simulating water, solute, and heat transport in the subsurface with the VS2DI software package

    Science.gov (United States)

    Healy, R.W.

    2008-01-01

The software package VS2DI was developed by the U.S. Geological Survey for simulating water, solute, and heat transport in variably saturated porous media. The package consists of a graphical preprocessor to facilitate construction of a simulation, a postprocessor for visualizing simulation results, and two numerical models that solve for flow and solute transport (VS2DT) and flow and heat transport (VS2DH). The finite-difference method is used to solve the Richards equation for flow and the advection-dispersion equation for solute or heat transport. This study presents a brief description of the VS2DI package, an overview of the various types of problems that have been addressed with the package, and an analysis of the advantages and limitations of the package. A review of other models and modeling approaches for studying water, solute, and heat transport also is provided. © Soil Science Society of America. All rights reserved.
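The finite-difference treatment of the advection-dispersion equation can be illustrated with a deliberately simplified 1D explicit upwind step. VS2DT's actual variably saturated solver is far more elaborate; the grid, velocity, and dispersion values here are invented for illustration.

```python
def advect_disperse(c, v, D, dx, dt, steps):
    """Explicit upwind finite-difference steps for dc/dt = -v dc/dx + D d2c/dx2.
    Boundary cells are held fixed; stable for v*dt/dx and D*dt/dx**2 small."""
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            adv = -v * (c[i] - c[i - 1]) / dx                      # upwind advection
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2  # dispersion
            new[i] = c[i] + dt * (adv + disp)
        c = new
    return c

c0 = [1.0] + [0.0] * 9   # constant-concentration inlet at the left boundary
out = advect_disperse(c0, v=0.5, D=0.01, dx=0.1, dt=0.05, steps=20)
```

Here the Courant number v*dt/dx = 0.25 and diffusion number D*dt/dx**2 = 0.05, so the scheme is monotone: the profile decays smoothly away from the inlet, as a plume front should.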

  10. Versatile Software Package For Near Real-Time Analysis of Experimental Data

    Science.gov (United States)

    Wieseman, Carol D.; Hoadley, Sherwood T.

    1998-01-01

    This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.

  11. The khmer software package: enabling efficient nucleotide sequence analysis.

    Science.gov (United States)

    Crusoe, Michael R; Alameldin, Hussien F; Awad, Sherine; Boucher, Elmar; Caldwell, Adam; Cartwright, Reed; Charbonneau, Amanda; Constantinides, Bede; Edvenson, Greg; Fay, Scott; Fenton, Jacob; Fenzl, Thomas; Fish, Jordan; Garcia-Gutierrez, Leonor; Garland, Phillip; Gluck, Jonathan; González, Iván; Guermond, Sarah; Guo, Jiarong; Gupta, Aditi; Herr, Joshua R; Howe, Adina; Hyer, Alex; Härpfer, Andreas; Irber, Luiz; Kidd, Rhys; Lin, David; Lippi, Justin; Mansour, Tamer; McA'Nulty, Pamela; McDonald, Eric; Mizzi, Jessica; Murray, Kevin D; Nahum, Joshua R; Nanlohy, Kaben; Nederbragt, Alexander Johan; Ortiz-Zuazaga, Humberto; Ory, Jeramia; Pell, Jason; Pepe-Ranney, Charles; Russ, Zachary N; Schwarz, Erich; Scott, Camille; Seaman, Josiah; Sievert, Scott; Simpson, Jared; Skennerton, Connor T; Spencer, James; Srinivasan, Ramakrishnan; Standage, Daniel; Stapleton, James A; Steinman, Susan R; Stein, Joe; Taylor, Benjamin; Trimble, Will; Wiencko, Heather L; Wright, Michael; Wyss, Brian; Zhang, Qingpeng; Zyme, En; Brown, C Titus

    2015-01-01

    The khmer package is a freely available software library for working efficiently with fixed length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at  https://github.com/dib-lab/khmer/.
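khmer's probabilistic k-mer counting structure is in the spirit of a Count-Min sketch: several hashed count tables whose row-wise minimum upper-bounds the true count. The toy implementation below illustrates the idea only; it is not khmer's actual data structure or API.

```python
import hashlib

class CountMinSketch:
    """Tiny Count-Min sketch for approximate k-mer counting (illustrative)."""
    def __init__(self, width=1000, depth=4):
        self.width, self.depth = width, depth
        self.tables = [[0] * width for _ in range(depth)]

    def _hashes(self, kmer):
        # One salted hash per row; collisions only ever inflate counts
        for i in range(self.depth):
            h = hashlib.sha256(f"{i}:{kmer}".encode()).hexdigest()
            yield int(h, 16) % self.width

    def add(self, kmer):
        for i, idx in enumerate(self._hashes(kmer)):
            self.tables[i][idx] += 1

    def count(self, kmer):
        # Minimum over rows bounds the true count from above
        return min(self.tables[i][idx] for i, idx in enumerate(self._hashes(kmer)))

def kmers(seq, k):
    return (seq[i:i + k] for i in range(len(seq) - k + 1))

cms = CountMinSketch()
for km in kmers("ACGTACGTACGT", 4):
    cms.add(km)
print(cms.count("ACGT"))   # "ACGT" occurs 3 times; the sketch reports >= 3
```

The memory footprint is fixed by width and depth regardless of how many distinct k-mers stream through, which is what makes this style of counting attractive for large sequencing datasets.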

  12. MATEO: a software package for the molecular design of energetic materials.

    Science.gov (United States)

    Mathieu, Didier

    2010-04-15

    To satisfy the need of energetic materials chemists for reliable and efficient predictive tools in order to select the most promising candidates for synthesis, a custom software package is developed. Making extensive use of publicly available software, it integrates a wide range of models and can be used for a variety of tasks, from the calculation of molecular properties to the prediction of the performance of heterogeneous materials, such as propellant compositions based on ammonium perchlorate/aluminium mixtures. The package is very easy to use through a graphical desktop environment. According to the material provided as input, suitable models and parameters are automatically selected. Therefore, chemists can apply advanced predictive models without having to learn how to use complex computer codes. To make the package more versatile, a command-line interface is also provided. It facilitates the assessment of various procedures by model developers.

  13. Maximize Your Investment 10 Key Strategies for Effective Packaged Software Implementations

    CERN Document Server

    Beaubouef, Grady Brett

    2009-01-01

    This is a handbook covering ten principles for packaged software implementations that project managers, business owners, and IT developers should pay attention to. The book also has practical real-world coverage including a sample agenda for conducting business solution modeling, customer case studies, and a road map to implement guiding principles. This book is aimed at enterprise architects, development leads, project managers, business systems analysts, business systems owners, and anyone who wants to implement packaged software effectively. If you are a customer looking to implement COTS s

  14. Parallel Software Model Checking

    Science.gov (United States)

    2015-01-08

Parallel Software Model Checking. Report dated January 2015. Performing organization: Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213. Team members: Sagar Chaki, Arie Gurfinkel.

  15. Application of CyboCon Advanced Adjustment and Control Software Package in Delayed Coking Unit

    Institute of Scientific and Technical Information of China (English)

    Guo Hua

    2002-01-01

This article describes the application of the CyboCon software package, based upon model-free adaptive (MFA) control, in the 800-kt/a delayed coking unit to realize an advanced adjustment and control strategy for the temperature control of the heater. Operation tests have revealed convenience in operating the system and simplicity in maintenance, leading to good economic benefits.

  16. Application of modern software packages to calculating the solidification of high-speed steels

    Science.gov (United States)

    Morozov, S. I.

    2015-12-01

The solidification of high-speed steels is calculated with the Pandat and JMatPro software packages. The results of calculating equilibrium and nonequilibrium solidification are presented and discussed. The nonequilibrium solidification is simulated using the Scheil-Gulliver model. The fraction of carbides changes as a function of the carbon content of the steels.
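The Scheil-Gulliver model assumes complete mixing in the liquid and no diffusion in the solid, so the solid forming at the interface has composition Cs = k * C0 * (1 - fs)**(k - 1), where fs is the solid fraction and k the partition coefficient. The sketch below uses an assumed k value for carbon, not a value from the paper.

```python
def scheil_solid_conc(c0, k, fs):
    """Scheil-Gulliver interface solid composition after a fraction fs has
    solidified: Cs = k * C0 * (1 - fs)**(k - 1). Valid for 0 <= fs < 1."""
    return k * c0 * (1.0 - fs) ** (k - 1.0)

# Illustrative: 1.0 wt% C alloy with an assumed partition coefficient k = 0.34
for fs in (0.0, 0.5, 0.9):
    print(fs, round(scheil_solid_conc(1.0, 0.34, fs), 3))
```

Because k < 1, the remaining liquid (and hence the last solid to form) becomes progressively enriched in carbon, which is why nonequilibrium solidification predicts more carbide-forming segregation than the equilibrium lever rule.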

  17. Software Package Completed for Alloy Design at the Atomic Level

    Science.gov (United States)

    Bozzolo, Guillermo H.; Noebe, Ronald D.; Abel, Phillip B.; Good, Brian S.

    2001-01-01

    As a result of a multidisciplinary effort involving solid-state physics, quantum mechanics, and materials and surface science, the first version of a software package dedicated to the atomistic analysis of multicomponent systems was recently completed. Based on the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of alloy and surface energetics, this package includes modules devoted to the analysis of many essential features that characterize any given alloy or surface system, including (1) surface structure analysis, (2) surface segregation, (3) surface alloying, (4) bulk crystalline material properties and atomic defect structures, and (5) thermal processes that allow us to perform phase diagram calculations. All the modules of this Alloy Design Workbench 1.0 (ADW 1.0) are designed to run in PC and workstation environments, and their operation and performance are substantially linked to the needs of the user and the specific application.

  18. An interactive software package for validating satellite data

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.; Pankajakshan, T.

…of this relationship. This would enable one to understand the significance of such changes in view of noticeable environmental perturbations. This is essential for any validation exercise, as the satellite often retrieves the skin temperature and is quite sensitive… skin and multichannel SST. J. Geophys. Res., 97, 5569-5595, 1992.

  19. Global review of open access risk assessment software packages valid for global or continental scale analysis

    Science.gov (United States)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and whether they were open access. This process was used to select a subset of 31 models that includes 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user…
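A weighted-sum score is the simplest form of the multi-criteria analysis used to compare such packages: rate each package on each criterion, weight the criteria, and normalize. The criteria, weights, and ratings below are hypothetical, not those of the World Bank review.

```python
def mcda_score(ratings, weights):
    """Weighted-sum multi-criteria score; ratings and weights keyed by criterion."""
    total_w = sum(weights.values())
    return sum(ratings[c] * w for c, w in weights.items()) / total_w

# Hypothetical criteria, weights, and 0-5 ratings for two risk packages
weights = {"documentation": 2, "hazard_coverage": 3, "active_support": 1}
pkg_a = {"documentation": 4, "hazard_coverage": 3, "active_support": 5}
pkg_b = {"documentation": 2, "hazard_coverage": 5, "active_support": 2}
print(round(mcda_score(pkg_a, weights), 3),
      round(mcda_score(pkg_b, weights), 3))   # → 3.667 3.5
```

With 100+ criteria, as in the review, the same arithmetic applies; the substantive work is in choosing defensible weights and rating each tool consistently.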

  20. IT LONG TAIL STRATEGY FOR SOFTWARE PACKAGE COMPANY

    Directory of Open Access Journals (Sweden)

    Andreas Winata

    2010-10-01

Full Text Available The Long Tail strategy is a business strategy which holds that the total revenue from the sale of non-popular products may exceed the total revenue from popular products. This may happen since there is generally only a small number of popular products, which are in great demand, while there are many non-popular products, each sold in small amounts. This research aims to better understand the role of IT behind the success of the Long Tail strategy. Results show the stages of developing an IT strategy, from identification and analysis, through deciding on a strategy, to implementation. The results of this study will help software developers to plan IT strategy by implementing an accurate Long Tail strategy. Keywords: Long Tail, IT Strategy, Services, Software Package

  1. PBM: a software package to create, display and manipulate interactively models of small molecules and proteins on IBM-compatible PCs.

    Science.gov (United States)

    Perrakis, A; Constantinides, C; Athanasiades, A; Hamodrakas, S J

    1995-04-01

The PBM package was developed to create, display and conveniently manipulate protein and small molecule structures on IBM-compatible microcomputers. It consists of four modules: CREATE, SPHERE, RIBBON and CONVERT. CREATE includes commands to create or alter ('mutate') the primary and subsequently the tertiary structure of a given peptide or protein by defining phi and psi angles of residues at will, options to add, delete or alter atoms in a structure, utilities to choose easily between the most common rotamers of amino acid residue sidechains and options to analyse a protein conformation in various ways. SPHERE provides for interactive manipulation of structures containing up to 2700 atoms, which can belong to up to six different molecules. All manipulations can be made with an ordinary mouse, by choosing from a variety of pull-down menus. Three types of models can be used to display molecules on the computer screen or the plotter: skeletal, solid space-filling and wireframe space-filling models. RIBBON creates ribbon models of proteins and allows for a limited variety of interactive manipulations. CONVERT is a file converter capable of converting files of atom coordinates of literally any format to Brookhaven Data Bank format files. The package produces very good results for protein molecules of reasonable size, both in terms of graphics quality and speed of operations, on an 80486 IBM PC-compatible machine equipped with a 1 MByte VGA display card and a colour VGA monitor, which is a recommended configuration.

  2. Development of the Monte Carlo event generator tuning software package Lagrange and its application to tune the PYTHIA model to the LHCb data

    CERN Document Server

    Popov, Dmitry; Hofmann, Werner

    One of the general problems of modern high-energy physics is comparing experimental data, i.e. measurements of observables in high-energy collisions, to theory, which is represented by Monte Carlo simulations. This work is dedicated to further development of the tuning methodology and to the implementation of software tools for tuning the PYTHIA Monte Carlo event generator for the LHCb experiment. The aim of this thesis is to create a fast analytical model of the Monte Carlo event generator and then to fit the model to the experimental data recorded by the LHCb detector, taking statistical and computational uncertainties into account and estimating the best values for the tuned parameters by simultaneously tuning a group of phenomenological parameters in a many-dimensional parameter space. The fitting algorithm is interfaced to the LHCb software framework, which models the response of the LHCb detector. Typically, tunings are done to measurements which are corrected for detector effects. These correctio...

  3. GEMBASSY: an EMBOSS associated software package for comprehensive genome analyses.

    Science.gov (United States)

    Itaya, Hidetoshi; Oshita, Kazuki; Arakawa, Kazuharu; Tomita, Masaru

    2013-08-29

    The popular European Molecular Biology Open Software Suite (EMBOSS) currently contains over 400 tools used in many areas of bioinformatics research, equipped with sophisticated development frameworks for interoperability and tool discoverability, as well as rich documentation and various user interfaces. In order to further strengthen EMBOSS in the field of genomics, we here present a novel EMBOSS associated software (EMBASSY) package named GEMBASSY, which adds more than 50 analysis tools from the G-language Genome Analysis Environment and its Representational State Transfer (REST) and SOAP web services. GEMBASSY basically contains wrapper programs for the G-language REST/SOAP web services to provide intuitive and easy access to various annotations within complete genome flatfiles, as well as tools for analyzing nucleic acid composition, calculating codon usage, and visualizing genomic information. For example, analysis methods such as calculating distances between sequences by genomic signatures and predicting gene expression levels from codon usage bias are effective in the interpretation of metagenomic and metatranscriptomic data. GEMBASSY tools can be used seamlessly with other EMBOSS tools and UNIX command-line tools. The source code, written in C, is available from GitHub (https://github.com/celery-kotone/GEMBASSY/) and the distribution package is freely available from the GEMBASSY web site (http://www.g-language.org/gembassy/).

  4. USING THE SOFTWARE MICROSOFT OFFICE EXCEL FOR FINANCIAL MODELING DECISION

    Directory of Open Access Journals (Sweden)

    Bălăcescu Aniela

    2009-05-01

    Full Text Available The use of software packages is today indispensable for modeling financial decisions. Business organizations will invariably make greater demands of the software than individual users. Excel is one option alongside other software applications tailored to the market and bespoke (in-house) software packages. It should be noted that Excel is the market leader and as such sets a benchmark.

  5. Does HDR Pre-Processing Improve the Accuracy of 3D Models Obtained by Means of two Conventional SfM-MVS Software Packages? The Case of the Corral del Veleta Rock Glacier

    Directory of Open Access Journals (Sweden)

    Álvaro Gómez-Gutiérrez

    2015-08-01

    Full Text Available The accuracy of different workflows using Structure-from-Motion and Multi-View-Stereo techniques (SfM-MVS) is tested. Twelve point clouds of the Corral del Veleta rock glacier, in Spain, were produced with two different software packages (123D Catch and Agisoft Photoscan), using Low Dynamic Range (LDR) images and High Dynamic Range (HDR) compositions for three different years (2011, 2012 and 2014). The accuracy of the resulting point clouds was assessed using benchmark models acquired every year with a Terrestrial Laser Scanner. Three parameters were used to estimate the accuracy of each point cloud: the RMSE, the Cloud-to-Cloud distance (C2C) and the Multiscale-Model-to-Model comparison (M3C2). The M3C2 mean error ranged from 0.084 m (standard deviation of 0.403 m) to 1.451 m (standard deviation of 1.625 m). Agisoft Photoscan outperformed 123D Catch, producing more accurate and denser point clouds in 11 out of 12 cases; this work is the first comparison between the two software packages available in the literature. No significant improvement was observed using HDR pre-processing. To our knowledge, this is the first time that the geometrical accuracy of 3D models obtained using LDR and HDR compositions has been compared. These findings may be of interest for researchers who wish to estimate geomorphic changes using SfM-MVS approaches.

  6. Accuracy of Giovanni and Marksim Software Packages for ...

    African Journals Online (AJOL)

    Agricultural adaptation to climate change requires accurate, unbiased, and reliable climate data. ... simulation models are important tools for generating rainfall data in areas with limited or no .... software has a global partial-temporal coverage.

  7. SIMODIS - a software package for simulating nuclear reactor components

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Lamartine; Borges, Eduardo M. [Centro Tecnico Aeroespacial (CTA-IEAv), Sao Jose dos Campos, SP (Brazil). Inst. de Estudos Avancados. E-mail: guimarae@ieav.cta.br; Oliveira Junior, Nilton S.; Santos, Glauco S.; Bueno, Mariana F. [Universidade Bras Cubas, Mogi das Cruzes, SP (Brazil)

    2000-07-01

    This paper presents the initial development effort in building a nuclear reactor component simulation package. The package was developed to be used in the MATLAB simulation environment. It combines the graphical capabilities of MATLAB with the advantages of compiled languages such as FORTRAN and C++: from MATLAB it takes the facilities for displaying the calculated results, and from the compiled languages it takes processing speed. So far, models of a reactor core, a UTSG and an OTSG have been developed, along with a series of user-friendly graphical interfaces for these models. As a by-product, a set of water and sodium thermal and physical properties has been developed; these may be used directly as functions from MATLAB, or called from a model as part of its calculation process. The whole set was named SIMODIS, which stands for SIstema MODular Integrado de Simulacao. (author)

  8. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Science.gov (United States)

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet an organization's requirements is a difficult aspect of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR packages achieved better ranking scores than the other open-source EMR software packages.
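    TOPSIS itself is simple enough to sketch: vector-normalize the decision matrix, weight it, and score each alternative by its relative closeness to the ideal-best and ideal-worst points. A stdlib-only Python sketch (the criteria, weights and scores below are hypothetical, not the study's actual evaluation data):

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  : rows = alternatives, columns = criteria (raw scores)
    weights : criterion weights (summing to 1)
    benefit : True for benefit criteria, False for cost criteria
    Returns closeness coefficients in [0, 1]; higher is better.
    """
    n_crit = len(matrix[0])
    # Vector-normalize each column, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    # Ideal best/worst value per criterion.
    best = [max(c) if benefit[j] else min(c) for j, c in enumerate(zip(*v))]
    worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_worst = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Three hypothetical EMR packages scored on usability, security, and cost:
scores = topsis([[7, 9, 300], [8, 7, 250], [5, 6, 100]],
                weights=[0.4, 0.4, 0.2],
                benefit=[True, True, False])
```

In the AHP+TOPSIS combination used by the study, the weights themselves would come from AHP pairwise comparisons rather than being set directly as above.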

  9. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    Science.gov (United States)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
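    The frequency-swept pulse mentioned above is a linear chirp: its phase is the integral of an instantaneous frequency that ramps linearly across the band. A stdlib-only sketch of generating such a waveform (the sample rate and sweep parameters are illustrative, not gr-MRI's actual settings):

```python
import math

def linear_chirp(f0, f1, duration, sample_rate):
    """Samples of a unit-amplitude linear frequency sweep from f0 to f1 Hz.

    Phase phi(t) = 2*pi*(f0*t + 0.5*k*t^2) with sweep rate k = (f1 - f0)/duration,
    so the instantaneous frequency f0 + k*t ramps linearly up to f1.
    """
    k = (f1 - f0) / duration
    n = int(duration * sample_rate)
    return [math.sin(2 * math.pi * (f0 * t + 0.5 * k * t * t))
            for t in (i / sample_rate for i in range(n))]

# A 1 ms sweep across 500 kHz of bandwidth, like the paper's test pulse:
samples = linear_chirp(0.0, 500e3, 1e-3, 4e6)
```

In gr-MRI such waveforms would be produced inside a GNU Radio flowgraph rather than as a plain Python list.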

  10. User's Guide for the MapImage Reprojection Software Package, Version 1.01

    Science.gov (United States)

    Finn, Michael P.; Trent, Jason R.

    2004-01-01

    Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets (such as 30-m data) for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Recently, Usery and others (2003a) expanded on the previously limited empirical work with real geographic data by compiling and tabulating the accuracy of categorical areas in projected raster datasets of global extent. Geographers and applications programmers at the U.S. Geological Survey's (USGS) Mid-Continent Mapping Center (MCMC) undertook an effort to expand and evolve an internal USGS software package, MapImage, or mapimg, for raster map projection transformation (Usery and others, 2003a). Daniel R. Steinwand of Science Applications International Corporation, Earth Resources Observation Systems Data Center in Sioux Falls, S. Dak., originally developed mapimg for the USGS, basing it on the USGS's General Cartographic Transformation Package (GCTP). It operated as a command line program on the Unix operating system. Through efforts at MCMC, and in coordination with Mr. Steinwand, this program has been transformed from an application based on a command line into a software package based on a graphic user interface for Windows, Linux, and Unix machines. Usery and others (2003b) pointed out that many commercial software packages do not use exact projection equations and that even when exact projection equations are used, the software often results in error and sometimes does not complete the transformation for specific projections, at specific resampling resolutions, and for specific singularities. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in these software packages, but implementation with data other than points requires specific adaptation of the equations or prior preparation of the data to allow the transformation to succeed. Additional
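    As an illustration of the "exact projection equations" discussed above, the forward sinusoidal projection maps a geographic point to planar coordinates in closed form: x = R(λ − λ₀)cos φ, y = Rφ. A toy Python version (mapimg itself wraps GCTP's C routines; the sphere radius below is one commonly used for global grids and is illustrative here):

```python
import math

R = 6371007.181  # authalic sphere radius in metres (illustrative choice)

def sinusoidal_forward(lon_deg, lat_deg, lon0_deg=0.0):
    """Exact forward sinusoidal (equal-area) projection on a sphere.

    x = R * (lon - lon0) * cos(lat),  y = R * lat   (angles in radians)
    """
    lon, lat, lon0 = map(math.radians, (lon_deg, lat_deg, lon0_deg))
    return R * (lon - lon0) * math.cos(lat), R * lat

# On the equator the projection is simply arc length along the parallel:
x, y = sinusoidal_forward(90.0, 0.0)
```

Raster reprojection, as the abstract notes, is harder than this point-to-point case: each output cell must be resampled from the input grid, which is where the adaptation and resampling issues arise.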

  11. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

    Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations: they do not capture 3-D data, they are time-consuming, and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software method featuring an easy-to-use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability, four independent researchers analysed tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%); in contrast, Osteolytica demonstrated minimal variability (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from age- and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses over the entirety of the bone volume (as opposed to selected 2-D images), it

  12. US Army Radiological Bioassay and Dosimetry: The RBD software package

    Energy Technology Data Exchange (ETDEWEB)

    Eckerman, K. F.; Ward, R. C.; Maddox, L. B.

    1993-01-01

    The RBD (Radiological Bioassay and Dosimetry) software package was developed for the U.S. Army Materiel Command, Arlington, Virginia, to demonstrate compliance with the radiation protection guidance of 10 CFR Part 20 (ref. 1). Designed to run interactively on an IBM-compatible personal computer, RBD consists of a database module to manage bioassay data and a computational module that incorporates algorithms for estimating radionuclide intake from either acute or chronic exposures, based on measurement of the worker's rate of excretion of the radionuclide or the retained activity in the body. In estimating the intake, RBD uses a separate file for each radionuclide containing parametric representations of the retention and excretion functions. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent. For a given nuclide, if measurements exist for more than one type of assay, an auxiliary module, REPORT, estimates the intake by applying weights assigned in the nuclide file for each assay. Bioassay data and computed results (estimates of intake and committed dose equivalent) are stored in separate databases, and the bioassay measurements used to compute a given result can be identified. The REPORT module creates a file containing the committed effective dose equivalent for each individual, which can be combined with the individual's external exposure.
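    For a single acute intake with equal measurement weights, the intake estimate described above reduces to a one-parameter least-squares fit: find the intake that best scales the nuclide file's excretion function onto the bioassay data. A sketch under those assumptions (the measurement values, excretion-function values and dose coefficient are hypothetical, not taken from RBD's nuclide files):

```python
def estimate_intake(measured, predicted_per_unit_intake):
    """Least-squares intake estimate I minimizing sum_i (m_i - I*r_i)^2.

    measured                 : bioassay results m_i (e.g. Bq/day excreted)
    predicted_per_unit_intake: excretion-function values r_i at the same times
    The closed-form minimizer is I = sum(m_i * r_i) / sum(r_i^2).
    """
    num = sum(m * r for m, r in zip(measured, predicted_per_unit_intake))
    den = sum(r * r for r in predicted_per_unit_intake)
    return num / den

# Hypothetical urine measurements and excretion-function values r(t):
intake = estimate_intake([120.0, 60.0, 31.0], [0.012, 0.006, 0.003])
committed_dose = intake * 1.0e-8  # Sv per Bq intake: illustrative coefficient
```

RBD's REPORT module additionally combines intakes inferred from several assay types using per-assay weights from the nuclide file; the sketch above handles only a single assay.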

  13. PALSfit3: A software package for analysing positron lifetime spectra

    DEFF Research Database (Denmark)

    Kirkegaard, Peter; Olsen, Jens V.; Eldrup, Morten Mostgaard

    been used extensively by the positron annihilation community. The present document describes the mathematical foundation of the PALSfit3 model as well as a number of features of the program. The cornerstones of PALSfit3 are two least squares fitting modules: POSITRONFIT and RESOLUTIONFIT. In both...... in a text window. PALSfit3 is verified on Windows XP and Windows 7, 8 and 10. The PALSfit3 software can be acquired from the Technical University of Denmark (http://PALSfit.dk)...
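    The POSITRONFIT model referred to above represents a lifetime spectrum as a sum of decaying exponentials convolved with a Gaussian time-resolution function; each convolved component has the closed form of an exponentially modified Gaussian. A stdlib-only sketch of one such component (the lifetime and resolution values are illustrative):

```python
import math

def lifetime_component(t, tau, sigma, t0=0.0):
    """Exponential decay exp(-t/tau)/tau convolved with a Gaussian of width sigma.

    Closed form (exponentially modified Gaussian), normalized to unit area:
        f(t) = (1/2tau) * exp(lam*(lam*sigma^2 - 2*(t-t0))/2)
                        * erfc((lam*sigma^2 - (t-t0)) / (sqrt(2)*sigma))
    with lam = 1/tau and time-zero t0.
    """
    lam = 1.0 / tau
    arg = (lam * sigma * sigma - (t - t0)) / (math.sqrt(2.0) * sigma)
    return (0.5 * lam
            * math.exp(0.5 * lam * (lam * sigma * sigma - 2.0 * (t - t0)))
            * math.erfc(arg))

# A 180 ps lifetime seen through a nearly ideal (0.1 ps) resolution function
# approaches the pure exponential exp(-t/tau)/tau:
val = lifetime_component(180.0, 180.0, 0.1)
```

A full POSITRONFIT-style fit would sum several such components with intensity weights and add a background term before least-squares fitting.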

  14. JPI UML Software Modeling

    Directory of Open Access Journals (Sweden)

    Cristian Vidal Silva

    2015-12-01

    Full Text Available Aspect-Oriented Programming (AOP) extends Object-Oriented Programming (OOP) with aspects to modularize crosscutting behavior on classes: aspects advise base code at join points selected according to pointcut rule definitions. However, join points introduce dependencies between aspects and base code, a major obstacle to achieving truly independent development of software modules. Join Point Interfaces (JPI) represent join points using interfaces between classes and aspects, so that these modules do not depend on each other. Nevertheless, like AOP, JPI is a programming methodology; for a complete aspect-oriented software development process, JPI requirements and JPI modeling phases must also be defined. Towards this goal, this article proposes JPI UML class and sequence diagrams for modeling JPI software solutions. The purpose of these diagrams is to facilitate understanding of the structure and behavior of JPI programs. As an application example, this article applies the proposed JPI UML diagrams to a case study and analyzes the associated JPI code to demonstrate their advantages.

  15. A Software Package Using a Mesh-grid Method for Simulating HPGe Detector Efficiencies

    Energy Technology Data Exchange (ETDEWEB)

    Kevin Jackman

    2009-10-01

    Traditional ways of determining the absolute full-energy peak efficiencies of high-purity germanium (HPGe) detectors are often time consuming, cost prohibitive, or not feasible. A software package, KMESS (Kevin’s Mesh Efficiency Simulator Software), was developed to assist in predicting these efficiencies. It uses a semiempirical mesh-grid method and works for arbitrary source shapes and counting geometries. The model assumes that any gamma-ray source shape can be treated as a large enough collection of point sources. The code is readily adaptable, has a web-based graphical front-end, and could easily be coupled to a 3D scanner. As will be shown, this software can estimate absolute full-energy peak efficiencies with good accuracy in reasonable computation times. It has applications to the field of gamma-ray spectroscopy because it is a quick and accurate way to assist in performing quantitative analyses using HPGe detectors.

  16. A software package using a mesh-grid method for simulating HPGe detector efficiencies

    Energy Technology Data Exchange (ETDEWEB)

    Gritzo, Russell E [Los Alamos National Laboratory; Jackman, Kevin R [REMOTE SENSING LAB; Biegalski, Steven R [UT AUSTIN

    2009-01-01

    Traditional ways of determining the absolute full-energy peak efficiencies of high-purity germanium (HPGe) detectors are often time consuming, cost prohibitive, or not feasible. A software package, KMESS (Kevin's Mesh Efficiency Simulator Software), was developed to assist in predicting these efficiencies. It uses a semiempirical mesh-grid method and works for arbitrary source shapes and counting geometries. The model assumes that any gamma-ray source shape can be treated as a large enough collection of point sources. The code is readily adaptable, has a web-based graphical front-end, and could easily be coupled to a 3D scanner. As will be shown, this software can estimate absolute full-energy peak efficiencies with good accuracy in reasonable computation times. It has applications to the field of gamma-ray spectroscopy because it is a quick and accurate way to assist in performing quantitative analyses using HPGe detectors.
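    The mesh-grid idea, treating an extended source as a sufficiently dense collection of point sources and averaging their point efficiencies, can be sketched as follows. The point-efficiency model here is a crude solid-angle stand-in, not KMESS's semiempirical one, and all geometry values are invented for illustration:

```python
import math

def point_efficiency(distance_cm, face_area_cm2=20.0, intrinsic=0.3):
    """Crude point-source full-energy peak efficiency: intrinsic efficiency
    times the fractional solid angle subtended by a small detector face."""
    solid_angle_fraction = face_area_cm2 / (4.0 * math.pi * distance_cm ** 2)
    return intrinsic * solid_angle_fraction

def mesh_efficiency(mesh_points, detector=(0.0, 0.0, 0.0)):
    """Average point efficiency over a mesh of equal-activity source points."""
    effs = [point_efficiency(math.dist(p, detector)) for p in mesh_points]
    return sum(effs) / len(effs)

# A 3x3 planar source grid 10 cm from the detector face:
grid = [(i, j, 10.0) for i in (-1.0, 0.0, 1.0) for j in (-1.0, 0.0, 1.0)]
eff = mesh_efficiency(grid)
```

In the real method the per-point efficiency would be a semiempirical function calibrated to the detector, and self-attenuation within the source would also need handling.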

  17. Model-based Software Engineering

    DEFF Research Database (Denmark)

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  19. An Overview on R Packages for Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Haibin Qiu

    2014-05-01

    Full Text Available The aim of this study is to present an overview of R packages for structural equation modeling. Structural equation modeling, a statistical technique for testing and estimating causal relations using a combination of statistical data and qualitative causal hypotheses, allows both confirmatory and exploratory modeling, meaning it is suited to both hypothesis testing and theory development. R, a free and popular programming language and software environment for statistical computing and graphics, is widely used among statisticians for developing statistical software and for data analysis. The major finding is that excellent and sufficiently complete structural equation modeling packages are needed for R users to do such research. Numerous R packages for structural equation modeling are introduced in this study, most of which are listed in the Comprehensive R Archive Network (CRAN) task view Psychometrics.

  20. Features of free software packages in flow cytometry: a comparison between four non-commercial software sources.

    Science.gov (United States)

    Sahraneshin Samani, Fazel; Moore, Jodene K; Khosravani, Pardis; Ebrahimi, Marzieh

    2014-08-01

    Flow cytometers designed to analyze large particles are enabling new applications in biology. Data analysis is a critical component of the FCM process. In this article we compare features of four free software packages: WinMDI, Cyflogic, Flowing Software, and Cytobank.

  1. Desired characteristics of a generic 'no frills' software engineering tools package

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, J.J.

    1986-07-29

    Increasing numbers of vendors are developing software engineering tools to meet the demands of increasingly complex software systems, higher reliability goals for software products, higher programming labor costs, and management's desire to more closely associate software lifecycle costs with the estimated development schedule. Some vendors have chosen a dedicated workstation approach to achieve high user interactivity through windowing and mousing. Other vendors are using multi-user mainframes with low-cost terminals to economize on the costs of the hardware and the tools software. For all potential customers of software tools, the question remains: what are the minimum functional requirements that a software engineering tools package must have in order to be considered useful throughout the entire software lifecycle? This paper describes the desired characteristics of a non-existent but realistic 'no frills' software engineering tools package. 3 refs., 5 figs.

  2. The libRadtran software package for radiative transfer calculations (version 2.0.1)

    Science.gov (United States)

    Emde, Claudia; Buras-Schnell, Robert; Kylling, Arve; Mayer, Bernhard; Gasteiger, Josef; Hamann, Ulrich; Kylling, Jonas; Richter, Bettina; Pause, Christian; Dowling, Timothy; Bugliaro, Luca

    2016-05-01

    libRadtran is a widely used software package for radiative transfer calculations. It allows one to compute (polarized) radiances, irradiance, and actinic fluxes in the solar and thermal spectral regions. libRadtran has been used for various applications, including remote sensing of clouds, aerosols and trace gases in the Earth's atmosphere, climate studies, e.g., for the calculation of radiative forcing due to different atmospheric components, for UV forecasting, the calculation of photolysis frequencies, and for remote sensing of other planets in our solar system. The package has been described in Mayer and Kylling (2005). Since then several new features have been included, for example polarization, Raman scattering, a new molecular gas absorption parameterization, and several new parameterizations of cloud and aerosol optical properties. Furthermore, a graphical user interface is now available, which greatly simplifies the usage of the model, especially for new users. This paper gives an overview of libRadtran version 2.0.1 with a focus on new features. Applications including these new features are provided as examples of use. A complete description of libRadtran and all its input options is given in the user manual included in the libRadtran software package, which is freely available at http://www.libradtran.org.
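    A libRadtran calculation is driven by a plain-text input file for the uvspec tool. The option names below follow the user manual, but the file paths and values are purely illustrative:

```text
# Hypothetical uvspec input: clear-sky irradiance at 32 deg solar zenith
atmosphere_file ../data/atmmod/afglus.dat    # midlatitude summer profile
source solar ../data/solar_flux/atlas_plus_modtran
sza 32.0                 # solar zenith angle in degrees
albedo 0.2               # surface albedo
wavelength 310.0 340.0   # spectral range in nm
rte_solver disort        # discrete-ordinate radiative transfer solver
quiet
```

The graphical user interface mentioned above essentially assembles files like this one; the full list of options is documented in the user manual shipped with the package.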

  3. GET_HOMOLOGUES, a versatile software package for scalable and robust microbial pangenome analysis.

    Science.gov (United States)

    Contreras-Moreira, Bruno; Vinuesa, Pablo

    2013-12-01

    GET_HOMOLOGUES is an open-source software package that builds on popular orthology-calling approaches making highly customizable and detailed pangenome analyses of microorganisms accessible to nonbioinformaticians. It can cluster homologous gene families using the bidirectional best-hit, COGtriangles, or OrthoMCL clustering algorithms. Clustering stringency can be adjusted by scanning the domain composition of proteins using the HMMER3 package, by imposing desired pairwise alignment coverage cutoffs, or by selecting only syntenic genes. The resulting homologous gene families can be made even more robust by computing consensus clusters from those generated by any combination of the clustering algorithms and filtering criteria. Auxiliary scripts make the construction, interrogation, and graphical display of core genome and pangenome sets easy to perform. Exponential and binomial mixture models can be fitted to the data to estimate theoretical core genome and pangenome sizes, and high-quality graphics can be generated. Furthermore, pangenome trees can be easily computed and basic comparative genomics performed to identify lineage-specific genes or gene family expansions. The software is designed to take advantage of modern multiprocessor personal computers as well as computer clusters to parallelize time-consuming tasks. To demonstrate some of these capabilities, we survey a set of 50 Streptococcus genomes annotated in the Orthologous Matrix (OMA) browser as a benchmark case. The package can be downloaded at http://www.eead.csic.es/compbio/soft/gethoms.php and http://maya.ccg.unam.mx/soft/gethoms.php.

  4. Anukalpana 2.0: A Performance Evaluation Software Package for Akash Surface to Air Missile System

    Directory of Open Access Journals (Sweden)

    G.S. Raju

    1997-07-01

    Full Text Available An air defence system is a complex dynamic system comprising sensors, control centres, launchers and missiles. Practical evaluation of such a complex system is almost impossible and very expensive. Further, during development of the system, there is a need to evaluate certain design characteristics before they are implemented. Consequently, the need arises for a comprehensive simulation package which will simulate the various subsystems of the air defence weapon system, so that the performance of the system can be evaluated. With the above objectives in mind, a software package, called Anukalpana 2.0, has been developed. The first version of the package was developed at the Indian Institute of Science, Bangalore, and the program has subsequently been updated. The main objectives of this package are: (i) evaluation of the performance of the Akash air defence system and other similar air defence systems against any specified aerial threat; (ii) investigation of the effectiveness of the deployment tactics and operational logic employed at the firing batteries, and refining them; (iii) provision of aid for refining standard operating procedures (SOPs) for multitarget defence; and (iv) exploring the possibility of using it as a user training tool at the level of Air Defence Commanders. The design specification and the simulation/modelling philosophy adopted for the development of this package are discussed at length. Since the Akash air defence system involves many probabilistic events, the Monte Carlo method of simulation is used for both threat and defence. Implementation details of the package, including data flow diagrams and interface details, are discussed in brief. Analysis of results for certain input cases is also covered.
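    Since the package relies on the Monte Carlo method, its performance measures are estimated by repeating the randomized engagement many times and averaging the outcomes. A generic sketch of that approach (the detection and single-shot kill probabilities and the salvo size are hypothetical, not Akash data):

```python
import random

def salvo_kill_probability(p_detect, p_ssk, salvo_size, trials=100_000, seed=1):
    """Monte Carlo estimate of the probability that a target is destroyed.

    Each trial: the target is detected with probability p_detect; if detected,
    each of `salvo_size` missiles independently kills with probability p_ssk.
    """
    rng = random.Random(seed)
    kills = 0
    for _ in range(trials):
        if rng.random() < p_detect:
            if any(rng.random() < p_ssk for _ in range(salvo_size)):
                kills += 1
    return kills / trials

# Two-missile salvo with hypothetical probabilities; the analytic value
# for this simple model is 0.9 * (1 - 0.3**2) = 0.819.
p = salvo_kill_probability(p_detect=0.9, p_ssk=0.7, salvo_size=2)
```

A full engagement simulation would, of course, chain many more random stages (tracking, guidance, fuzing) in place of the two used here.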

  5. Modelling of a DNA packaging motor

    Institute of Scientific and Technical Information of China (English)

    Qian Jun; Xie Ping; Xue Xiao-Guang; Wang Peng-Ye

    2009-01-01

    During the assembly of many viruses, a powerful molecular motor packages the genome into a preassembled capsid. The Bacillus subtilis phage φ29 is an excellent model system to investigate the DNA packaging mechanism because of its highly efficient in vitro DNA packaging activity and the development of a single-molecule packaging assay. Here we make use of structural and biochemical experimental data to build a physical model of DNA packaging by the φ29 DNA packaging motor. Based on the model, various dynamic behaviours such as the packaging rate, pause frequency and slip frequency under different ATP concentrations, ADP concentrations, external loads as well as capsid fillings are studied by using Monte Carlo simulation. Good agreement is obtained between the simulated and available experimental results. Moreover, we make testable predictions that should guide future experiments related to motor function.
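    The Monte Carlo approach described above can be illustrated with a stripped-down stepping model: in each cycle the motor either packages one base pair, at an ATP-dependent Michaelis-Menten rate, or slips back several base pairs. All rate constants below are invented for illustration and are not the paper's fitted values:

```python
import random

def mean_packaging_rate(atp_mM, k_max=100.0, km_mM=0.03, p_slip=0.02,
                        slip_bp=5, cycles=10_000, seed=7):
    """Toy Monte Carlo of a DNA-packaging motor.

    Per cycle the motor advances 1 bp at a Michaelis-Menten stepping rate
    k = k_max * [ATP] / (Km + [ATP]), or slips back slip_bp base pairs with
    probability p_slip.  Time advances by the mean waiting time per cycle.
    Returns the mean packaging rate in bp/s over the simulated trajectory.
    """
    rng = random.Random(seed)
    k_step = k_max * atp_mM / (km_mM + atp_mM)   # stepping attempts per second
    position, time = 0, 0.0
    for _ in range(cycles):
        time += 1.0 / k_step
        if rng.random() < p_slip:
            position -= slip_bp
        else:
            position += 1
    return position / time

rate_low = mean_packaging_rate(atp_mM=0.01)   # ATP-starved motor
rate_high = mean_packaging_rate(atp_mM=1.0)   # saturating ATP
```

The paper's model additionally makes the pause and slip probabilities depend on ADP, external load and capsid filling; here they are fixed constants to keep the sketch short.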

  6. Linear latent variable models: the lava-package

    DEFF Research Database (Denmark)

    Holst, Klaus Kähler; Budtz-Jørgensen, Esben

    2013-01-01

    An R package for specifying and estimating linear latent variable models is presented. The philosophy of the implementation is to separate the model specification from the actual data, which leads to a dynamic and easy way of modeling complex hierarchical structures. Several advanced features are...... interface covering a broad range of non-linear generalized structural equation models is described. The model and software are demonstrated on measurements of the serotonin transporter in the human brain....

  7. TGF electron avalanches and gamma-ray emission with LEPTRACK - a new detailed simulation software package

    Science.gov (United States)

    Connell, Paul

    2014-05-01

    In designing the MXGS coded mask imager of the ASIM mission on the ISS, to detect and locate gamma-rays from Terrestrial Gamma-ray Flashes, it was necessary to write software to simulate the expansion of gamma-ray photons from 15-20 km altitudes for an initial estimate of TGF spectra and diffuse beam structure likely to be observed at orbital altitudes. From this a new detailed LEPTRACK simulation software package has been developed to track all electron-photon scattering via Bremsstrahlung and ionization, and via any spatial electric-magnetic field geometies which will drive the Relativistic Runaway Electron Avalanche (RREA) process at the heart of TGF origin. LEPTRACK uses the standard physics of keV-MeV photon interactions, Bremsstrahlung scattering, Binary-Electron-Bethe models of electron ionization-scattering, positron Bhabha scattering and annihilation. Unlike simulation packages GEANT4, EGS, etc, the physics of these processes is transferred outside the software and controlled by a standard database of text files of total scattering cross sections, differential energy transfer and deflection angle PDFs - easy to read and plot - but which can also be changed, if the user understands the physics involved and wishes to create their own modified database. It also uses a superparticle spatial mesh system to control particle density and flux fields, electric field evolution, and exponential avalanche growth. 
Results will be presented of TGF simulations using macro electric field geometries expected in storm clouds and micro field geometries expected around streamer tips - and combinations of both - and will include video displays showing the evolving ionization structure of electron trajectories, the time evolution of photon-electron-positron density and flux fields, local molecular ion densities, the dielectric effect of induced local electric fields, and the important effect of the local geomagnetic field on circular lepton feedback and TGF beam direction.
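The database-driven design described above stores total cross sections and tabulated PDFs of energy transfer and deflection angle as plain text, which implies sampling interactions directly from user-editable tables. As a hedged illustration (this is not LEPTRACK code; the table values and function names are hypothetical), a tabulated deflection-angle PDF can be sampled with an inverse-CDF lookup:

```python
import bisect
import random

def build_cdf(pdf):
    """Turn a tabulated probability-density column into a normalized CDF."""
    cdf, total = [], 0.0
    for p in pdf:
        total += p
        cdf.append(total)
    return [c / total for c in cdf]  # normalized to end exactly at 1.0

def sample_angle(angles, cdf, rng=random):
    """Draw one deflection angle by inverse-CDF lookup on the table."""
    u = rng.random()
    i = bisect.bisect_left(cdf, u)
    return angles[min(i, len(angles) - 1)]

# toy table: a forward-peaked deflection distribution (hypothetical numbers)
angles = [1.0, 5.0, 10.0, 30.0, 60.0]   # degrees
pdf = [10.0, 5.0, 2.0, 1.0, 0.5]        # relative probability densities
cdf = build_cdf(pdf)

random.seed(1)
samples = [sample_angle(angles, cdf) for _ in range(10000)]
```

Because the physics lives in the table rather than the code, swapping in a modified cross-section file changes the sampled distribution without touching the tracking loop.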

  8. The Model 9977 Radioactive Material Packaging Primer

    Energy Technology Data Exchange (ETDEWEB)

    Abramczyk, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-09

    The Model 9977 Packaging is a single containment drum style radioactive material (RAM) shipping container designed, tested and analyzed to meet the performance requirements of Title 10 of the Code of Federal Regulations, Part 71. A radioactive material shipping package, in combination with its contents, must perform three functions (note that the performance criteria specified in the Code of Federal Regulations have alternate limits for normal operations and for accident conditions): Containment, the package must “contain” the radioactive material within it; Shielding, the packaging must limit its users and the public to radiation doses within specified limits; and Subcriticality, the package must maintain its radioactive material as subcritical.

  9. Software and hardware package for justification of safety of nuclear legacy facilities

    Directory of Open Access Journals (Sweden)

    P.A. Blokhin

    2017-03-01

    Full Text Available Determining the future fate of nuclear legacy facilities is becoming an extremely important near-term issue. This includes identifying decommissioning options based on detailed justifications of the respective designs. No general practice has been developed in Russia to address such issues, although initial steps to this end have been made as part of the federal target program “Ensuring Nuclear and Radiation Safety for 2008 and Up to the Year 2015”. Problems arising in the justification of decommissioning options for such facilities, in terms of radiation protection and safety assessments for both the public and personnel, differ greatly from the tasks involved in the design of new nuclear installations. The reason is a critical shortage of information both on the nuclear legacy facilities as such and on the radioactive waste (RW) they contain. Extra complexity stems from the fact that regulatory requirements for facilities of this type have changed greatly since the time these facilities were built. This puts priority on the development of approaches to the justification of nuclear, radiation and environmental safety. A software and hardware package, OBOYAN, has been developed to address the great variety of tasks arising in this problem, combining software and hardware tools that enable analysis and justification of nuclear legacy facility safety both in the current state and in the long term. The package's key components are computational modules used to model radiation fields, radionuclide migration and the distribution of contamination in water and air, as well as to estimate human doses and risks. The purpose of the study is to describe the structure and the functional capabilities of the package and to provide examples of its application.
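The computational modules mentioned above model radiation fields and estimate doses. As a rough, hedged sketch of the kind of elementary building block such a module rests on (this is not the OBOYAN algorithm; the function and its parameters are hypothetical), a point-source field combines inverse-square geometry with exponential attenuation:

```python
import math

def dose_rate(source_strength, distance, mu=0.0):
    """Toy point-source field: flux falls with the inverse square of distance,
    further reduced by exponential attenuation with coefficient mu (per unit length)."""
    return source_strength * math.exp(-mu * distance) / (4.0 * math.pi * distance ** 2)

# doubling the distance quarters the unattenuated dose rate
ratio = dose_rate(1.0, 2.0) / dose_rate(1.0, 1.0)
```

Real assessment codes add source spectra, build-up factors, material-dependent attenuation and dose-conversion coefficients on top of this geometric core.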

  10. A Mathematical Model of Versatile Energy Storage System and Its Modeling by Power System Analysis Software Package

    Institute of Scientific and Technical Information of China (English)

    李妍; 荆盼盼; 王丽; 许轶珊; 杨增涛; 张步涵; 毛承雄

    2012-01-01

    With the increasing application of PV and wind power, special attention is being paid to energy storage systems, which are regarded as an important means of smoothing power fluctuations. Reasonable deployment of energy storage systems has become an important way to enhance the ability of the power grid to accept new energy sources. In order to study the impacts of various energy storage systems on the electromechanical transient response of the power grid, a versatile energy storage system model suitable for electromechanical transient calculation is proposed. The model captures characteristic parameters of an energy storage system such as response time delay, charging and discharging power limits, and energy capacity constraints. The effectiveness of the proposed model is verified with user-defined modules in the Power System Analysis Software Package (PSASP). The impact of a sudden change in PV generation output on the frequency of an example system is simulated, and suggestions on the deployment of energy storage systems are given.
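The characteristic parameters the abstract lists (response delay, charge/discharge power limits, capacity limits) can be captured in a few lines of discrete-time simulation. The following is a hedged sketch under simplified assumptions (ideal efficiency, hypothetical function names), not the PSASP user-defined model itself:

```python
def storage_step(soc, p_request, dt, p_max, e_max):
    """One step of a generic storage model. soc: stored energy (MWh);
    p_request: requested power (MW, positive = discharge); dt: step length (h).
    Returns (new soc, power actually delivered)."""
    p = max(-p_max, min(p_max, p_request))   # charge/discharge power limit
    if p > 0:
        p = min(p, soc / dt)                 # cannot discharge below empty
    else:
        p = max(p, -(e_max - soc) / dt)      # cannot charge above full
    return soc - p * dt, p

def lagged(p_prev, p_cmd, dt, tau):
    """First-order lag approximating the storage system's response time delay."""
    return p_prev + (dt / tau) * (p_cmd - p_prev)

# ask for 100 MW from a 10 MW / 8 MWh unit holding 5 MWh: output saturates at 10 MW
soc, p_out = storage_step(5.0, 100.0, dt=0.1, p_max=10.0, e_max=8.0)
```

Chaining `lagged` in front of `storage_step` reproduces the delayed, saturated response that matters for frequency studies like the PV-output-drop case in the paper.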

  11. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
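The first tool listed, investment risk (gambler's ruin) analysis, rests on the classic ruin formula: starting with i units and aiming for n, with win probability p per round, the ruin probability is (r^i − r^n)/(1 − r^n) with r = (1−p)/p, reducing to 1 − i/n for a fair game. A hedged sketch of that standard result (not the DOE package's implementation) with a Monte Carlo cross-check:

```python
import random

def ruin_probability(i, n, p):
    """Probability that a gambler starting with i units goes broke before
    reaching n units, winning one unit per round with probability p."""
    if p == 0.5:
        return 1.0 - i / n
    r = (1.0 - p) / p
    return (r ** i - r ** n) / (1.0 - r ** n)

def simulate_ruin(i, n, p, trials=5000, rng=random.Random(0)):
    """Monte Carlo estimate of the same probability."""
    broke = 0
    for _ in range(trials):
        x = i
        while 0 < x < n:
            x += 1 if rng.random() < p else -1
        broke += (x == 0)
    return broke / trials
```

In the exploration context, "winning" maps to a successful well and the target n to the capital at which the venture is considered safe.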

  12. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    Directory of Open Access Journals (Sweden)

    Sang-Kyu Jung

    Full Text Available Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
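WormLifespan's approach of counting moving worms from two time-lapse images suggests frame differencing followed by connected-component counting. The toy sketch below (pure Python on nested lists; not the QuantWorm implementation, which operates on real micrographs) illustrates the idea:

```python
def count_moving_blobs(img1, img2, threshold=10):
    """Count connected regions where two frames differ by more than threshold.
    Each changed region is taken as one moving object."""
    rows, cols = len(img1), len(img1[0])
    changed = [[abs(img1[r][c] - img2[r][c]) > threshold for c in range(cols)]
               for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if changed[r][c] and not seen[r][c]:
                blobs += 1
                stack = [(r, c)]          # flood-fill one connected component
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and changed[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return blobs
```

Real implementations add noise filtering and size thresholds so that camera jitter or debris is not counted as a worm.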

  13. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the ... for specific projects. L5: Analyze assurance technologies and contribute to the development of new ones. Assured Software Development L1

  14. Photovoltaics software package. Simulation, design and calculation software for photovoltaics; Softwarepaket Photovoltaik. Simulations-, Auslegungs- und Berechnungsprogramme fuer die Photovoltaik

    Energy Technology Data Exchange (ETDEWEB)

    Haas, Rudolf; Weinreich, Bernhard

    2007-07-01

    The software package comprises simulation, design and calculation tools: Professional configuration of photovoltaic systems; Design and optimization of PV systems and components; 3D visualization of shading situations; Economic efficiency and profit calculations; Software status report; Measuring technology for characteristics, insolation, infrared radiation, etc.; Databases for modules, inverters and supports; Insolation maps for Germany dating back to 1998; Checklists: site, dimensioning, comparison of systems, etc.; Useful addresses, bibliography, manufacturers; Other renewable energy sources, and much more. (orig.)

  15. A software package for the full GBTX lifecycle

    CERN Document Server

    Feger, S; Marin, M Barros; Leitao, P; Moreira, P; Porret, D; Wyllie, K

    2015-01-01

    This work presents the software environment surrounding the GBTX. The GBTX is a high-speed bidirectional ASIC implementing radiation-hard optical links for high-energy physics experiments. Having more than 300 8-bit configuration registers, it poses challenges addressed by a wide variety of software components. This paper focuses on the software used for characterization as well as radiation and production testing of the GBTX. It also highlights tools made available to designers and users, enabling them to create customized configurations. The paper shows how data storage over the full GBTX lifecycle is planned, to ensure good quality tracking of the devices.

  17. Effective organizational solutions for implementation of DBMS software packages

    Science.gov (United States)

    Jones, D.

    1984-01-01

    The Space Telescope management information system development effort serves as a guideline for discussing effective organizational solutions used in implementing DBMS software. The focus is on the importance of strategic planning. The value of constructing an information system architecture that conforms to the organization's managerial needs, the need for a senior decision maker, ways of dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring for production software control, production data control, and software enhancements are also discussed.

  18. Investigating the effects of different factors on development of open source enterprise resources planning software packages

    Directory of Open Access Journals (Sweden)

    Mehdi Ghorbaninia

    2014-08-01

    Full Text Available This paper investigates the effects of different factors on the development of open source enterprise resource planning software packages. The study designs a questionnaire on a Likert scale and distributes it among 210 experts in the field of open source software package development. Cronbach alpha has been calculated as 0.93, which is well above the minimum acceptable level. Using Pearson correlation as well as stepwise regression analysis, the study determines the three most important groups of factors: fundamental issues, and factors during and after implementation of open source software development. The study also determines a positive and strong relationship between fundamental factors and after-implementation factors (r = 0.9006, Sig. = 0.000).
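The reported Cronbach alpha of 0.93 measures the questionnaire's internal consistency. For reference, the standard formula is alpha = k/(k−1) · (1 − Σ item variances / variance of totals); the generic sketch below (toy data, not the study's computation) implements it:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. items: list of k lists, each holding one
    questionnaire item's scores across the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))
```

Values near 1 indicate that the items move together; 0.7 is a commonly cited minimum, so 0.93 is comfortably acceptable.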

  19. Evaluating Dense 3d Reconstruction Software Packages for Oblique Monitoring of Crop Canopy Surface

    Science.gov (United States)

    Brocks, S.; Bareth, G.

    2016-06-01

    Crop Surface Models (CSMs) are 2.5D raster surfaces representing absolute plant canopy height. Using multiple CSMs generated from data acquired at multiple time steps, crop surface monitoring is enabled. This makes it possible to monitor crop growth over time and can be used for monitoring in-field crop growth variability, which is useful in the context of high-throughput phenotyping. This study aims to evaluate several software packages for dense 3D reconstruction from multiple overlapping RGB images at field and plot scale. A summer barley field experiment located at the Campus Klein-Altendorf of the University of Bonn was observed by acquiring stereo images from an oblique angle using consumer-grade smart cameras. Two such cameras were mounted at an elevation of 10 m and acquired images for a period of two months during the growing period of 2014. The field experiment consisted of nine barley cultivars that were cultivated in multiple repetitions and nitrogen treatments. Manual plant height measurements were carried out at four dates during the observation period. The software packages Agisoft PhotoScan, VisualSfM with CMVS/PMVS2, and SURE are investigated. The point clouds are georeferenced through a set of ground control points. Where adequate results are reached, a statistical analysis is performed.
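Once georeferenced CSM rasters exist for two dates, crop growth monitoring reduces to per-cell height differencing. A minimal sketch on toy rasters (function names are hypothetical; real CSMs are large georeferenced grids):

```python
def growth_rate(csm_t1, csm_t2, days):
    """Per-cell growth rate (height units per day) between two crop surface models
    acquired `days` apart, given as nested lists of canopy heights."""
    return [[(h2 - h1) / days for h1, h2 in zip(row1, row2)]
            for row1, row2 in zip(csm_t1, csm_t2)]

def mean_height(csm):
    """Field-average canopy height of one CSM."""
    cells = [h for row in csm for h in row]
    return sum(cells) / len(cells)
```

Comparing such raster-derived heights against the four manual measurement dates is exactly the kind of validation the study performs.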

  20. Calculation of the relative metastabilities of proteins using the CHNOSZ software package

    Directory of Open Access Journals (Sweden)

    Dick Jeffrey M

    2008-10-01

    Full Text Available Background: Proteins of various compositions are required by organisms inhabiting different environments. The energetic demands for protein formation are a function of the compositions of proteins as well as geochemical variables including temperature, pressure, oxygen fugacity and pH. The purpose of this study was to explore the dependence of metastable equilibrium states of protein systems on changes in the geochemical variables. Results: A software package called CHNOSZ implementing the revised Helgeson-Kirkham-Flowers (HKF) equations of state and group additivity for ionized unfolded aqueous proteins was developed. The program can be used to calculate standard molal Gibbs energies and other thermodynamic properties of reactions and to make chemical speciation and predominance diagrams that represent the metastable equilibrium distributions of proteins. The approach takes account of the chemical affinities of reactions in open systems characterized by the chemical potentials of basis species. The thermodynamic database included with the package permits application of the software to mineral and other inorganic systems as well as systems of proteins or other biomolecules. Conclusion: Metastable equilibrium activity diagrams were generated for model cell-surface proteins from archaea and bacteria adapted to growth in environments that differ in temperature and chemical conditions. The predicted metastable equilibrium distributions of the proteins can be compared with the optimal growth temperatures of the organisms and with geochemical variables. The results suggest that a thermodynamic assessment of protein metastability may be useful for integrating bio- and geochemical observations.
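CHNOSZ itself is an R package built on the revised HKF equations of state, but the notion of a metastable equilibrium distribution can be sketched generically: once per-mole chemical affinities A_i are known, relative abundances follow Boltzmann-style weights exp(A_i/RT). This is a deliberately simplified illustration, not the CHNOSZ algorithm (which normalizes by residue and works from speciation calculations):

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def metastable_fractions(affinities, temperature):
    """Relative metastable-equilibrium fractions from molar chemical
    affinities (J/mol): higher affinity -> greater predicted abundance."""
    weights = [math.exp(a / (R * temperature)) for a in affinities]
    total = sum(weights)
    return [w / total for w in weights]
```

Raising the temperature flattens the distribution, which is the qualitative behaviour behind comparing predicted protein distributions with optimal growth temperatures.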

  1. BEANS - a software package for distributed Big Data analysis

    CERN Document Server

    Hypki, Arkadiusz

    2016-01-01

    BEANS is a new web-based tool, easy to install and maintain, to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in so-called Big Data. The creation of BEANS is an answer to the growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and, as open source software, is ready for use in any other research field.

  2. QuickDirect - Payload Control Software Template Package Project

    Data.gov (United States)

    National Aeronautics and Space Administration — To address the need to quickly, cost-effectively and reliably develop software to control science instruments deployed on spacecraft, QuickFlex proposes to create a...

  3. The quality and testing PH-SFT infrastructure for the external LHC software packages deployment

    CERN Document Server

    CERN. Geneva; MENDEZ LORENZO, Patricia; MATO VILA, Pere

    2015-01-01

    The PH-SFT group is responsible for the build, test, and deployment of the set of external software packages used by the LHC experiments. This set comprises ca. 170 packages, including Grid packages and Monte Carlo generators, provided in different versions. A complete build structure has been established to guarantee the quality of the packages provided by the group. This structure includes an experimental build and three daily nightly builds, each dedicated to a specific ROOT version: v6.02, v6.04, and the master. While the former build is dedicated to testing new packages, versions and dependencies (basically for SFT internal use), the latter three are responsible for deploying to AFS the set of stable and well-tested packages requested by the LHC experiments, so they can apply their own builds on top. In all cases, a c...

  4. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recent published or available GEOtop applications (Cordano and Rigon, 2013, WRR, Kollet et al, 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on master and main development branches. The usage of CMake configuration tool

  5. Development of a software package for solid-angle calculations using the Monte Carlo method

    Science.gov (United States)

    Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai

    2014-02-01

    Solid-angle calculations play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources, which are often complicated. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, in which a new type of variance reduction technique was integrated. The package, developed under the environment of Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface in which the visualization function is integrated in conjunction with OpenGL. One advantage of the proposed software package is that it can calculate the solid angle subtended by a detector with different geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) to a point, circular or cylindrical source without any difficulty. The results obtained from the proposed software package were compared with those obtained from previous studies and calculated using Geant4. It shows that the proposed software package can produce accurate solid-angle values with a greater computation speed than Geant4.
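The Monte Carlo principle behind such a solid-angle calculator can be sketched in a few lines: sample isotropic directions from the source and count those that intersect the detector. For the special case of an on-axis point source and a disk detector, the estimate can be checked against the analytic result Ω = 2π(1 − d/√(d² + a²)). This is a hedged illustration, not the package's code (which handles prisms, extended sources and variance reduction):

```python
import math
import random

def solid_angle_disk_mc(radius, distance, n=100000, rng=random):
    """Monte Carlo estimate of the solid angle subtended by a disk detector of
    given radius, seen from an on-axis point source at the given distance."""
    hits = 0
    for _ in range(n):
        z = rng.uniform(-1.0, 1.0)          # cosine of polar angle, isotropic
        if z <= 0:
            continue                        # ray going away from the detector
        # radial offset where the ray crosses the detector plane
        rho = distance * math.sqrt(1.0 - z * z) / z
        if rho <= radius:
            hits += 1
    return 4.0 * math.pi * hits / n         # fraction of full sphere

def solid_angle_disk_exact(radius, distance):
    """Closed-form solid angle of a disk for an on-axis point source."""
    return 2.0 * math.pi * (1.0 - distance / math.hypot(radius, distance))
```

The variance-reduction idea mentioned in the abstract amounts to sampling only directions that can possibly hit the detector and reweighting accordingly, which shrinks the statistical error for distant or small detectors.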

  6. A Novel Software Evolution Model Based on Software Networks

    Science.gov (United States)

    Pan, Weifeng; Li, Bing; Ma, Yutao; Liu, Jing

    Many published papers have analyzed the forming mechanisms and evolution laws of OO software systems from the perspectives of software reuse, software patterns, etc. There have, however, been fewer models so far built merely on software components such as methods and classes and their interactions. In this paper, a novel Software Evolution Model based on Software Networks (called SEM-SN) is proposed. It uses a software network at class level to represent software systems, and uses the software network's dynamical generating process to simulate activities in the real software development process, such as the dynamic creation of new classes and their dynamic interactions with already existing classes. It also introduces the concept of node/edge ageing to describe the decaying of classes with time. Empirical results on eight open-source Object-Oriented (OO) software systems demonstrate that SEM-SN roughly describes the evolution process of software systems and the emergence of their complex network characteristics.
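The generative mechanism described (new classes attaching to existing ones, with attractiveness decaying as nodes age) can be sketched as degree-and-age-weighted preferential attachment. The toy model below is a hedged sketch, not the actual SEM-SN formulation; the decay parameter and function names are hypothetical:

```python
import random

def grow_network(steps, decay=0.9, rng=random.Random(0)):
    """Grow a class-level network: each new 'class' links to one existing class
    chosen with probability proportional to degree * ageing weight, so older
    classes gradually attract fewer new dependencies."""
    degree = [1]      # seed class
    weight = [1.0]    # ageing weight per node, decays every step
    edges = []
    for _ in range(steps):
        scores = [d * w for d, w in zip(degree, weight)]
        target = rng.choices(range(len(scores)), weights=scores)[0]
        degree.append(1)                       # the new class's first link
        degree[target] += 1
        edges.append((len(degree) - 1, target))
        weight = [w * decay for w in weight] + [1.0]  # existing nodes age
    return degree, edges
```

Without the ageing term this reduces to Barabási–Albert preferential attachment; the decay factor is what lets the model reproduce the observed decline of old classes' coupling growth.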

  7. MEEP: A flexible free-software package for electromagnetic simulations by the FDTD method

    Science.gov (United States)

    Oskooi, Ardavan F.; Roundy, David; Ibanescu, Mihai; Bermel, Peter; Joannopoulos, J. D.; Johnson, Steven G.

    2010-03-01

    This paper describes Meep, a popular free implementation of the finite-difference time-domain (FDTD) method for simulating electromagnetism. In particular, we focus on aspects of implementing a full-featured FDTD package that go beyond standard textbook descriptions of the algorithm, or ways in which Meep differs from typical FDTD implementations. These include pervasive interpolation and accurate modeling of subpixel features, advanced signal processing, support for nonlinear materials via Padé approximants, and flexible scripting capabilities. Program summary: Program title: Meep. Catalogue identifier: AEFU_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFU_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU GPL. No. of lines in distributed program, including test data, etc.: 151 821. No. of bytes in distributed program, including test data, etc.: 1 925 774. Distribution format: tar.gz. Programming language: C++. Computer: Any computer with a Unix-like system and a C++ compiler; optionally exploits additional free software packages: GNU Guile [1], libctl interface library [2], HDF5 [3], MPI message-passing interface [4], and Harminv filter-diagonalization [5]. Developed on a 2.8 GHz Intel Core 2 Duo. Operating system: Any Unix-like system; developed under Debian GNU/Linux 5.0.2. RAM: Problem dependent (roughly 100 bytes per pixel/voxel). Classification: 10. External routines: Optionally exploits additional free software packages: GNU Guile [1], libctl interface library [2], HDF5 [3], MPI message-passing interface [4], and Harminv filter-diagonalization [5] (which requires LAPACK and BLAS linear-algebra software [6]). Nature of problem: Classical electrodynamics. Solution method: Finite-difference time-domain (FDTD) method. Running time: Problem dependent (typically about 10 ns per pixel per timestep). References: [1] GNU Guile, http://www.gnu.org/software/guile [2] Libctl, http
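The FDTD method that Meep implements can be illustrated by its one-dimensional core: a leapfrog (Yee) update of staggered E and H fields driven by a soft source. The sketch below is textbook FDTD in normalized units (Courant number 1), not Meep's API:

```python
import math

def fdtd_1d(n_cells=200, n_steps=100, source_pos=100):
    """Minimal 1D FDTD (Yee scheme, normalized units with c*dt/dx = 1):
    leapfrog update of staggered Ez and Hy fields plus a soft Gaussian source."""
    ez = [0.0] * n_cells
    hy = [0.0] * n_cells
    for t in range(n_steps):
        for i in range(n_cells - 1):            # H update from the curl of E
            hy[i] += ez[i + 1] - ez[i]
        for i in range(1, n_cells):             # E update from the curl of H
            ez[i] += hy[i] - hy[i - 1]
        ez[source_pos] += math.exp(-((t - 30) ** 2) / 100.0)  # soft source
    return ez

ez = fdtd_1d()
```

The Gaussian pulse splits into left- and right-travelling waves moving one cell per step; Meep layers subpixel averaging, materials, boundaries and parallelism on top of exactly this update loop.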

  8. Dynamic modelling of packaging material flow systems.

    Science.gov (United States)

    Tsiliyannis, Christos A

    2005-04-01

    A dynamic model has been developed for reused and recycled packaging material flows. It allows a rigorous description of the flows and stocks during the transition to new targets imposed by legislation, product demand variations or even by variations in consumer discard behaviour. Given the annual reuse and recycle frequency and packaging lifetime, the model determines all packaging flows (e.g., consumption and reuse) and variables through which environmental policy is formulated, such as recycling, waste and reuse rates, and it identifies the minimum number of variables to be surveyed for complete packaging flow monitoring. Simulation of the transition to the new flow conditions is given for flows of packaging materials in Greece, based on 1995–1998 field inventory and statistical data.
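The flow variables such a model determines (consumption, reuse, recycling, waste) obey a simple material balance in each period. The following is a hedged static sketch of that balance only; the actual model is dynamic, with packaging lifetimes and stocks, and all names here are hypothetical:

```python
def packaging_flows(demand, reuse_rate, recycle_rate, periods):
    """Toy per-period material balance: demand is met by reused containers,
    recycled material and new production; the unrecycled remainder is waste."""
    history = []
    for _ in range(periods):
        reused = demand * reuse_rate
        discarded = demand - reused
        recycled = discarded * recycle_rate
        waste = discarded - recycled
        new_production = demand - reused - recycled
        history.append({"reused": reused, "recycled": recycled,
                        "new": new_production, "waste": waste})
    return history
```

Tightening a legislated recycle target in such a balance immediately shows how much new production and waste change, which is the kind of transition the paper simulates with full dynamics.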

  9. Pharmacokinetic software for the health sciences: choosing the right package for teaching purposes.

    Science.gov (United States)

    Charles, B G; Duffull, S B

    2001-01-01

    Computer assisted learning has an important role in the teaching of pharmacokinetics to health sciences students because it transfers the emphasis from the purely mathematical domain to an 'experiential' domain in which graphical and symbolic representations of actions and their consequences form the major focus for learning. Basic pharmacokinetic concepts can be taught by experimenting with the effects of dose and dosage interval, together with drug absorption (e.g. absorption rate, bioavailability), drug distribution (e.g. volume of distribution, protein binding) and drug elimination (e.g. clearance), on drug concentrations using library ('canned') pharmacokinetic models. Such 'what if' approaches are found in calculator-simulators such as PharmaCalc, Practical Pharmacokinetics and PK Solutions. Others such as SAAM II, ModelMaker, and Stella represent the 'systems dynamics' genre, which requires the user to conceptualise a problem and formulate the model on-screen using symbols, icons, and directional arrows. The choice of software should be determined by the aims of the subject/course, the experience and background of the students in pharmacokinetics, and institutional factors including price and networking capabilities of the package(s).
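The 'what if' interplay described, where dose, dosing interval, volume of distribution and clearance shape the concentration profile, is classically captured by the one-compartment IV-bolus model: C(t) superposes exponentially decaying doses with elimination rate k = CL/V. A generic teaching sketch (not taken from any of the packages named):

```python
import math

def concentration(t, dose, vd, cl, tau):
    """Plasma concentration at time t for repeated IV bolus dosing every tau
    hours (one-compartment model, first-order elimination).
    dose: mg; vd: volume of distribution (L); cl: clearance (L/h)."""
    k = cl / vd                    # elimination rate constant (1/h)
    n_doses = int(t // tau) + 1    # doses administered by time t (first at t=0)
    return sum((dose / vd) * math.exp(-k * (t - i * tau)) for i in range(n_doses))
```

Plotting this while varying `tau` or `cl` reproduces the accumulation-to-steady-state behaviour that calculator-simulators let students explore interactively.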

  10. Evaluation of open source data mining software packages

    Science.gov (United States)

    Bonnie Ruefenacht; Greg Liknes; Andrew J. Lister; Haans Fisk; Dan Wendt

    2009-01-01

    Since 2001, the USDA Forest Service (USFS) has used classification and regression-tree technology to map USFS Forest Inventory and Analysis (FIA) biomass, forest type, forest type groups, and National Forest vegetation. This prior work used Cubist/See5 software for the analyses. The objective of this project, sponsored by the Remote Sensing Steering Committee (RSSC),...

  11. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  12. First Release of Gauss-Legendre Sky Pixelization (GLESP) software package for CMB analysis

    CERN Document Server

    Doroshkevich, A G; Verkhodanov, O V; Novikov, D I; Turchaninov, V I; Novikov, I D; Christensen, P R; Chiang, L Y

    2005-01-01

    We report the release of the Gauss-Legendre Sky Pixelization (GLESP) software package, version 1.0. In this report we present the main features and functions for the processing and manipulation of sky signals. Support for CMB polarization is underway and will be incorporated in a future release. Interested readers can visit http://www.glesp.nbi.dk and register to receive the package.

  13. Open Source Scanning Probe Microscopy Control Software Package Gxsm

    Energy Technology Data Exchange (ETDEWEB)

    Zahl P.; Wagner, T.; Moller, R.; Klust, A.

    2009-08-10

    Gxsm is a full-featured and modern scanning probe microscopy (SPM) software package. It can be used for powerful multidimensional image/data processing, analysis, and visualization. Connected to an instrument, it operates many different flavors of SPM, e.g., scanning tunneling microscopy (STM) and atomic force microscopy (AFM), or, in general, two-dimensional multi-channel data acquisition instruments. The Gxsm core can handle different data types, e.g., integer and floating point numbers. An easily extendable plug-in architecture provides many image analysis and manipulation functions. A digital signal processor (DSP) subsystem runs the feedback loop, generates the scanning signals and acquires the data during SPM measurements. The programmable Gxsm vector probe engine performs virtually any conceivable spectroscopy and manipulation task, such as scanning tunneling spectroscopy (STS) or tip formation. The Gxsm software is released under the GNU General Public License (GPL) and can be obtained via the Internet.

  14. A Relative Comparison of Leading Supply Chain Management Software Packages

    OpenAIRE

    Zhongxian Wang; Ruiliang Yan; Kimberly Hollister; Ruben Xing

    2009-01-01

Supply Chain Management (SCM) has proven to be an effective tool that aids companies in the development of competitive advantages. SCM systems are relied on to manage warehouses, transportation, trade logistics and various other issues concerning the coordinated movement of products and services from suppliers to customers. Although numerous supply chain solution tools are readily available to companies in today's fast-paced business environment, choosing the right SCM software is not an e...

  15. Development of a software package for solid-angle calculations using the Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jie, E-mail: zhangjie_scu@163.com [Key Laboratory for Neutron Physics of Chinese Academy of Engineering Physics, Institute of Nuclear Physics and Chemistry, Mianyang 621900 (China); College of Physical Science and Technology, Sichuan University, Chengdu 610064 (China); Chen, Xiulian [College of Physical Science and Technology, Sichuan University, Chengdu 610064 (China); Zhang, Changsheng [Key Laboratory for Neutron Physics of Chinese Academy of Engineering Physics, Institute of Nuclear Physics and Chemistry, Mianyang 621900 (China); Li, Gang [College of Physical Science and Technology, Sichuan University, Chengdu 610064 (China); Xu, Jiayun, E-mail: xjy@scu.edu.cn [College of Physical Science and Technology, Sichuan University, Chengdu 610064 (China); Sun, Guangai [Key Laboratory for Neutron Physics of Chinese Academy of Engineering Physics, Institute of Nuclear Physics and Chemistry, Mianyang 621900 (China)

    2014-02-01

Solid-angle calculations play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources, but they are often complicated. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, in which a new type of variance reduction technique has been integrated. The package, developed in the Microsoft Foundation Classes (MFC) environment in Microsoft Visual C++, has a graphical user interface, in which the visualization function is integrated in conjunction with OpenGL. One advantage of the proposed software package is that it can calculate, without any difficulty, the solid angle subtended at a point, circular or cylindrical source by a detector with different geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism). The results obtained from the proposed software package were compared with those obtained from previous studies and calculated using Geant4. The comparison shows that the proposed software package can produce accurate solid-angle values with a greater computation speed than Geant4. -- Highlights: • This software package (SAC) can give accurate solid-angle values. • SAC calculates solid angles using the Monte Carlo method and has a higher computation speed than Geant4. • A simple but effective variance reduction technique put forward by the authors has been applied in SAC. • A visualization function and a graphical user interface are also integrated in SAC.
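The hit-counting principle behind such a calculator can be shown in a few lines. The sketch below is a generic illustration, not the SAC code itself, and omits variance reduction: it estimates the solid angle subtended by a disk-shaped detector at an on-axis point source by sampling directions uniformly over the sphere, with the known analytic on-axis formula included for comparison.

```python
import math
import random

def mc_solid_angle(radius, distance, n=200_000, seed=1):
    """Monte Carlo estimate of the solid angle subtended by a disk
    detector of the given radius, centered on the z-axis at z = distance,
    as seen from a point source at the origin. Directions are sampled
    uniformly over the unit sphere; a ray scores a hit if it crosses the
    detector plane inside the disk."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Uniform direction on the sphere: cos(theta) uniform in [-1, 1]
        cos_t = rng.uniform(-1.0, 1.0)
        if cos_t <= 0.0:          # ray points away from the detector plane
            continue
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # Radial distance where the ray crosses the plane z = distance
        r_plane = distance * sin_t / cos_t
        if r_plane <= radius:
            hits += 1
    # Fraction of the full 4*pi sphere that hits the detector
    return 4.0 * math.pi * hits / n

def analytic_disk(radius, distance):
    """Exact solid angle of a disk seen from an on-axis point."""
    return 2.0 * math.pi * (1.0 - distance / math.hypot(distance, radius))
```

For radius 1 and distance 2 the analytic value is 2π(1 − 2/√5) ≈ 0.663 sr; the hit-counting estimate converges to it at the usual 1/√N Monte Carlo rate, which is exactly why a variance reduction technique matters for a production calculator like SAC.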

  16. Vertical bone measurements from cone beam computed tomography images using different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

This article aimed at comparing the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with the i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  17. Software package as an information center product. [Activities of Argonne Code Center

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M. K.

    1977-01-01

The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, to document the Center's efforts to meet these needs and the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables. (RWR)

  18. Mathematical Model and Programming in VBA Excel for Package Calculation

    Directory of Open Access Journals (Sweden)

    João Daniel Reis Lessa

    2016-05-01

Full Text Available Industrial logistics is a fundamental pillar for the survival of companies in today's increasingly competitive market. It is not exclusively about controlling the flow of external material between suppliers and the company, but also about developing a detailed study of how to plan, control, handle and package those materials. Logistics activities must ensure maximum efficiency in using corporate resources, since they do not add value to the final product. The logistic plan for each piece of the company's production has to adapt to the demand parameters, seasonal or not, over the timeline. Thus, the definition of packaging (transportation and consumption) must be adjusted in accordance with demand, in order to allow logistic planning to work constantly with economic order batches. The packaging calculation for each part under every demand can become quite complicated due to the large number of parts in the production process. Automating the calculation process for choosing the right package for each piece is an effective method in logistics planning. This article presents a simple and practical mathematical model for automating the packaging calculation, together with a program, written in the Visual Basic language within Excel, used to create graphics that show how the packages are being filled.
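The core package-selection calculation described above takes very little code. The authors' implementation is VBA in Excel; the sketch below is a hypothetical Python equivalent of the central step, choosing, for a given part demand, the standard package that minimizes total packaging cost (the package names, capacities and costs are invented for illustration):

```python
import math

def choose_package(demand, packages):
    """Pick the package minimizing total packaging cost for a demand.
    `packages` is a list of (name, capacity, unit_cost) tuples; the cost
    model is ceil(demand / capacity) packages at unit_cost each."""
    best = None
    for name, capacity, cost in packages:
        n_pkgs = math.ceil(demand / capacity)
        total = n_pkgs * cost
        if best is None or total < best[1]:
            best = (name, total, n_pkgs)
    return best  # (name, total_cost, number_of_packages)

# Hypothetical standard packages: (name, pieces per package, cost each)
catalog = [("small", 24, 1.10), ("medium", 60, 2.30), ("large", 144, 4.80)]
```

Run per part and per demand period, a loop like this reproduces the automation the article describes: `choose_package(500, catalog)` picks the large package (4 packages), while a small demand of 20 pieces falls back to the small one.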

  19. Energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-01-01

    Full Text Available The construction industry has turned to energy modelling in order to assist them in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...

  20. Development on the Calculation Software Package of the Contribution Rate of Mechanization in Agriculture

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

This paper introduces a software package specially designed to calculate the contribution rate of mechanization in agriculture, using economic-mathematical methods, computer technology and Visual Basic 6.0. The software package has a friendly interface, a simple operating procedure and an accurate, feasible calculation method. It greatly improves on past practice, in which the considerable amount of data and the miscellaneous, trivial methods involved made answers hard to obtain. It therefore has very high practical value.

  1. User’s Manual for the Simulation of Energy Consumption and Emissions from Rail Traffic Software Package

    DEFF Research Database (Denmark)

    Cordiero, Tiago M.; Lindgreen, Erik Bjørn Grønning; Sorenson, Spencer C

    2005-01-01

    The ARTEMIS rail emissions model was implemented in a Microsoft Excel software package that includes data from the GISCO database on railway traffic. This report is the user’s manual for the aforementioned software that includes information on how to run the program and an overview on how...... the program works. In addition, this manual provides a guide for the user to be able to update the included database itself.This program is provided as a free, open-source program that the user can manipulate to fit the desired goals and update it as new data is gathered. The program is comprised of a set...

  2. PKgraph: an R package for graphically diagnosing population pharmacokinetic models.

    Science.gov (United States)

    Sun, Xiaoyong; Wu, Kai; Cook, Dianne

    2011-12-01

Population pharmacokinetic (PopPK) modeling has become increasingly important in drug development because it handles unbalanced designs, sparse data and the study of individual variation. However, the increased complexity of the models makes diagnosing the fit more of a challenge. Graphics can play an important and unique role in PopPK model diagnostics. The software described in this paper, PKgraph, provides a graphical user interface for PopPK model diagnosis. It also provides an integrated and comprehensive platform for the analysis of pharmacokinetic data, including exploratory data analysis, goodness of model fit, model validation and model comparison. Results from a variety of model fitting software, including NONMEM, Monolix, SAS and R, can be used. PKgraph is programmed in R, and uses the R packages lattice and ggplot2 for static graphics, and rggobi for interactive graphics.

  3. BaitFisher: A Software Package for Multispecies Target DNA Enrichment Probe Design.

    Science.gov (United States)

    Mayer, Christoph; Sann, Manuela; Donath, Alexander; Meixner, Martin; Podsiadlowski, Lars; Peters, Ralph S; Petersen, Malte; Meusemann, Karen; Liere, Karsten; Wägele, Johann-Wolfgang; Misof, Bernhard; Bleidorn, Christoph; Ohl, Michael; Niehuis, Oliver

    2016-07-01

    Target DNA enrichment combined with high-throughput sequencing technologies is a powerful approach to probing a large number of loci in genomes of interest. However, software algorithms that explicitly consider nucleotide sequence information of target loci in multiple reference species for optimizing design of target enrichment baits to be applicable across a wide range of species have not been developed. Here we present an algorithm that infers target DNA enrichment baits from multiple nucleotide sequence alignments. By applying clustering methods and the combinatorial 1-center sequence optimization to bait design, we are able to minimize the total number of baits required to efficiently probe target loci in multiple species. Consequently, more loci can be probed across species with a given number of baits. Using transcript sequences of 24 apoid wasps (Hymenoptera: Crabronidae, Sphecidae) from the 1KITE project and the gene models of Nasonia vitripennis, we inferred 57,650, 120-bp-long baits for capturing 378 coding sequence sections of 282 genes in apoid wasps. Illumina reduced-representation library sequencing confirmed successful enrichment of the target DNA when applying these baits to DNA of various apoid wasps. The designed baits furthermore enriched a major fraction of the target DNA in distantly related Hymenoptera, such as Formicidae and Chalcidoidea, highlighting the baits' broad taxonomic applicability. The availability of baits with broad taxonomic applicability is of major interest in numerous disciplines, ranging from phylogenetics to biodiversity monitoring. We implemented our new approach in a software package, called BaitFisher, which is open source and freely available at https://github.com/cmayer/BaitFisher-package.git. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. CMIP: a software package capable of reconstructing genome-wide regulatory networks using gene expression data.

    Science.gov (United States)

    Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang

    2016-12-23

A gene regulatory network (GRN) represents the interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions, respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for the comparative exploration of different species and the mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but have difficulty constructing GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by the conditional mutual information measurement using a parallel computing framework (hence the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to that of most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application to a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
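The conditional mutual information measurement at the heart of this kind of inference can be sketched compactly. Under a Gaussian assumption (a common simplification used here for illustration, not a statement about CMIP's own estimator), I(X;Y|Z) for a single conditioning gene reduces to a function of the partial correlation, so a gene pair whose correlation is fully explained by a shared regulator scores near zero and its edge can be pruned:

```python
import math
import random

def pearson(a, b):
    """Sample Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def gaussian_cmi(x, y, z):
    """I(X;Y|Z) under a Gaussian assumption, via the first-order
    partial correlation of x and y given one conditioning variable z:
    I = -0.5 * ln(1 - rho_xy.z^2)."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    rho = (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
    return -0.5 * math.log(1.0 - rho ** 2)

# Toy expression data: x and y are both driven by regulator z only
random.seed(0)
z = [random.gauss(0, 1) for _ in range(4000)]
x = [v + 0.1 * random.gauss(0, 1) for v in z]
y = [v + 0.1 * random.gauss(0, 1) for v in z]
# gaussian_cmi(x, y, z) is near zero: the x-y correlation is explained by z
```

In a network-inference loop, pairs whose conditional MI given candidate regulators falls below a threshold (automatically determined, in CMIP's case) lose their edge; the parallelism comes from distributing these pairwise tests.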

  5. PKQuest_Java: free, interactive physiologically based pharmacokinetic software package and tutorial

    Directory of Open Access Journals (Sweden)

    Levitt David G

    2009-08-01

Full Text Available Abstract Background Physiologically based pharmacokinetics (PBPK) uses a realistic organ model to describe drug kinetics. The blood-tissue exchange of each organ is characterized by its volume, perfusion, metabolism, capillary permeability and blood/tissue partition coefficient. PBPK applications require both sophisticated mathematical modeling software and a reliable complete set of physiological parameters. Currently there are no software packages available that combine ease of use with the versatility that is required of a general PBPK program. Findings The program is written in Java and is available for free download at http://www.pkquest.com/. Included in the download is a detailed tutorial that discusses the pharmacokinetics of 6 solutes (D2O, amoxicillin, desflurane, propofol, ethanol and thiopental) illustrated using experimental human pharmacokinetic data. The complete PBPK description for each solute is stored in Excel spreadsheets that are included in the download. The main features of the program are: 1) Intuitive and versatile interactive interface; 2) Absolute and semi-logarithmic graphical output; 3) Pre-programmed optimized human parameter data set (but arbitrary values can be input); 4) Time dependent changes in the PBPK parameters; 5) Non-linear parameter optimization; 6) Unique approach to determine the oral "first pass metabolism" of non-linear solutes (e.g. ethanol); 7) Pulmonary perfusion/ventilation heterogeneity for volatile solutes; 8) Input and output of Excel spreadsheet data; 9) Antecubital vein sampling. Conclusion PKQuest_Java is a free, easy to use, interactive PBPK software routine. The user can either directly use the pre-programmed optimized human or rat data set, or enter an arbitrary data set. It is designed so that drugs that are classified as "extracellular" or "highly fat soluble" do not require information about tissue/blood partition coefficients and can be modeled by a minimum of user input parameters. PKQuest

  6. PKQuest_Java: free, interactive physiologically based pharmacokinetic software package and tutorial.

    Science.gov (United States)

    Levitt, David G

    2009-08-05

    Physiologically based pharmacokinetics (PBPK) uses a realistic organ model to describe drug kinetics. The blood-tissue exchange of each organ is characterized by its volume, perfusion, metabolism, capillary permeability and blood/tissue partition coefficient. PBPK applications require both sophisticated mathematical modeling software and a reliable complete set of physiological parameters. Currently there are no software packages available that combine ease of use with the versatility that is required of a general PBPK program. The program is written in Java and is available for free download at http://www.pkquest.com/. Included in the download is a detailed tutorial that discusses the pharmacokinetics of 6 solutes (D2O, amoxicillin, desflurane, propofol, ethanol and thiopental) illustrated using experimental human pharmacokinetic data. The complete PBPK description for each solute is stored in Excel spreadsheets that are included in the download. The main features of the program are: 1) Intuitive and versatile interactive interface; 2) Absolute and semi-logarithmic graphical output; 3) Pre-programmed optimized human parameter data set (but, arbitrary values can be input); 4) Time dependent changes in the PBPK parameters; 5) Non-linear parameter optimization; 6) Unique approach to determine the oral "first pass metabolism" of non-linear solutes (e.g. ethanol); 7) Pulmonary perfusion/ventilation heterogeneity for volatile solutes; 8) Input and output of Excel spreadsheet data; 9) Antecubital vein sampling. PKQuest_Java is a free, easy to use, interactive PBPK software routine. The user can either directly use the pre-programmed optimized human or rat data set, or enter an arbitrary data set. It is designed so that drugs that are classified as "extracellular" or "highly fat soluble" do not require information about tissue/blood partition coefficients and can be modeled by a minimum of user input parameters. PKQuest_Java, along with the included tutorial, could be

  7. Comparison of four software packages for CT lung volumetry in healthy individuals

    Energy Technology Data Exchange (ETDEWEB)

    Nemec, Stefan F. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Molinari, Francesco [Centre Hospitalier Regional Universitaire de Lille, Department of Radiology, Lille (France); Dufresne, Valerie [CHU de Charleroi - Hopital Vesale, Pneumologie, Montigny-le-Tilleul (Belgium); Gosset, Natacha [CHU Tivoli, Service d' Imagerie Medicale, La Louviere (Belgium); Silva, Mario; Bankier, Alexander A. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States)

    2015-06-01

    To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. (orig.)

  8. Accuracy of Noninvasive Coronary Stenosis Quantification of Different Commercially Available Dedicated Software Packages

    NARCIS (Netherlands)

    Dikkers, Riksta; Willems, Tineke P.; de Jonge, Gonda J.; Marquering, Henk A.; Greuter, Marcel J. W.; van Ooijen, Peter M. A.; van der Weide, Marijke C. Jansen; Oudkerk, Matthijs

    2009-01-01

    Purpose: The purpose of this study was to investigate the noninvasive quantification of coronary artery stenosis using cardiac software packages and vessel phantoms with known stenosis severity. Materials and Methods: Four different sizes of vessel phantoms were filled with contrast agent and

  9. Manual of spIds, a software package for parameter identification in dynamic systems

    NARCIS (Netherlands)

    Everaars, C.T.H.; Hemker, P.W.; Stortelder, W.J.H.

    1995-01-01

This report contains the manual of spIds, version 1.0, a software package for parameter identification in dynamic systems. SpIds is an acronym of Simulation and Parameter Identification in Dynamic Systems. It can be applied on wide var

  10. A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.

    Science.gov (United States)

    McConkie, George W.; And Others

    A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…

  11. PyPedal, an open source software package for pedigree analysis

    Science.gov (United States)

The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...

  12. Software verification and validation for commercial statistical packages utilized by the statistical consulting section of SRTC

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.B.

    2000-03-22

The purpose of this report is to provide software verification and validation for the statistical packages used by the Statistical Consulting Section (SCS) of the Savannah River Technology Center. The need for this verification and validation stems from the requirements of the Quality Assurance programs that frequently apply to the work conducted by SCS. The IBM Personal Computer 300PL and 300XL are both Pentium II-based desktops; the software verification and validation in this report is therefore valid interchangeably between the two platforms. As new computing platforms, statistical packages, or revisions to existing packages are evaluated, this report is to be revised to address their verification and validation.

  13. INSPECT: A graphical user interface software package for IDARC-2D

    Science.gov (United States)

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern day Performance-Based Earthquake Engineering (PBEE) pivots about nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in the DOS/Unix systems and requires elaborate text-based input files creation by the user. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  14. Stochastic Runge-Kutta Software Package for Stochastic Differential Equations

    CERN Document Server

    Gevorkyan, M N; Korolkova, A V; Kulyabov, D S; Sevastyanov, L A

    2016-01-01

As a result of applying a technique for constructing stochastic models of multistep processes, a range of models implemented as self-consistent differential equations was obtained. These are partial differential equations (the master equation, the Fokker-Planck equation) and stochastic differential equations (the Langevin equation). However, analytical methods do not always allow these equations to be investigated adequately, so a combined analytical and numerical approach is proposed for studying them. For this purpose, the numerical part is realized within the framework of symbolic computation. It is recommended to apply stochastic Runge-Kutta methods for the numerical study of stochastic differential equations in Langevin form. Under this approach, a program complex has been developed on the basis of the symbolic computation system Sage. For model verification, logarithmic walks and the two-dimensional Black-Scholes model are used. As an illustration, a stochastic "predator-prey" type model is us...
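The Langevin-form equations mentioned above are exactly what such integrators target. As a minimal sketch (Euler-Maruyama, the lowest-order member of the stochastic Runge-Kutta family, rather than the higher-order schemes the paper recommends), here is an Ornstein-Uhlenbeck process, a linear Langevin equation whose moments are known in closed form:

```python
import math
import random

def euler_maruyama_ou(theta, mu, sigma, x0, t_end, n_steps, seed=0):
    """Simulate dX = theta*(mu - X) dt + sigma dW (Ornstein-Uhlenbeck,
    a linear Langevin equation) with the Euler-Maruyama scheme.
    Returns the sampled path including the initial value."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    sqrt_dt = math.sqrt(dt)
    x = x0
    path = [x]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, sqrt_dt)      # Wiener increment ~ N(0, dt)
        x = x + theta * (mu - x) * dt + sigma * dw
        path.append(x)
    return path
```

Because E[X_t] relaxes to mu, averaging many simulated endpoints gives a quick sanity check of the integrator, the same role the logarithmic-walk and Black-Scholes models play as verification cases in the paper.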

  15. Developing a new software package for PSF estimation and fitting of adaptive optics images

    Science.gov (United States)

    Schreiber, Laura; Diolaiti, Emiliano; Sollima, Antonio; Arcidiacono, Carmelo; Bellazzini, Michele; Ciliegi, Paolo; Falomo, Renato; Foppiani, Italo; Greggio, Laura; Lanzoni, Barbara; Lombini, Matteo; Montegriffo, Paolo; Dalessandro, Emanuele; Massari, Davide

    2012-07-01

Adaptive Optics (AO) images are characterized by a structured Point Spread Function (PSF), with a sharp core and an extended halo, and by significant variations across the field of view. In order to enable the extraction of high-precision quantitative information and improve the scientific exploitation of AO data, efforts in PSF modeling and in the integration of suitable models into a code for image analysis are needed. We present the current status of a study on the modeling of AO PSFs based on observational data taken with present telescopes (VLT and LBT). The methods under development include parametric models and hybrid (i.e. analytical/numerical) models adapted to the various types of PSFs that can show up in AO images. The specific features of AO data, such as the mainly radial variation of the PSF with respect to the guide star position in single-reference AO, are taken into account as much as possible. The final objective of this project is the development of a flexible software package, based on the Starfinder code (Diolaiti et al. 2000), specifically dedicated to PSF estimation and to the astrometric and photometric analysis of AO images with complex and spatially variable PSFs.
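A parametric sharp-core/extended-halo profile of the kind under study can be written down directly. The sketch below is a generic illustration, not the Starfinder-based code: a narrow Gaussian core plus a broad Moffat halo, with all parameter names and values hypothetical:

```python
import math

def ao_psf(r, core_sigma, halo_amp, halo_alpha, halo_beta):
    """Radial profile of a hypothetical AO-like PSF: a narrow Gaussian
    core (width core_sigma) plus a broad Moffat halo (scale halo_alpha,
    slope halo_beta, relative amplitude halo_amp). All parameters are
    illustrative; a real fit would also allow field-position dependence."""
    core = math.exp(-0.5 * (r / core_sigma) ** 2)
    halo = halo_amp * (1.0 + (r / halo_alpha) ** 2) ** (-halo_beta)
    return core + halo
```

Fitting such a profile per star, and letting the parameters vary (mainly radially) with distance from the guide star, is one simple way to capture the field dependence the abstract describes.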

  16. A software algorithm/package for control loop configuration and eco-efficiency.

    Science.gov (United States)

    Munir, M T; Yu, W; Young, B R

    2012-11-01

Software is a powerful tool to help us analyze industrial information and control processes. In this paper, we present our recent development of a software algorithm/package which can help select the more eco-efficient control configuration. Nowadays, the eco-efficiency of all industrial processes/plants has become more and more important; engineers need a way to integrate control loop configuration with measurements of eco-efficiency. The exergy eco-efficiency factor, a new measure of eco-efficiency for control loop configuration, has been developed. This software algorithm/package combines a commercial simulator, VMGSim, with Excel to calculate the exergy eco-efficiency factor.

  17. Modeling Software Processes and Artifacts

    NARCIS (Netherlands)

    van den Berg, Klaas; Bosch, Jan; Mitchell, Stuart

    1997-01-01

    The workshop on Modeling Software Processes and Artifacts explored the application of object technology in process modeling. After the introduction and the invited lecture, a number of participants presented their position papers. First, an overview is given on some background work, and the aims, as

  18. Economic tour package model using heuristic

    Science.gov (United States)

    Rahman, Syariza Abdul; Benjamin, Aida Mauziah; Bakar, Engku Muhammad Nazri Engku Abu

    2014-07-01

A tour-package is a prearranged tour that includes products and services such as food, activities, accommodation, and transportation, which are sold at a single price. Since competitiveness within the tourism industry is very high, many tour agents try to provide attractive tour-packages in order to meet tourist satisfaction as much as possible. Among the criteria considered by tourists are the number of places to be visited and the cost of the tour-package. Previous studies indicate that tourists tend to choose economical tour-packages and aim to visit as many places as they can. Thus, this study proposes a tour-package model using a heuristic approach. The aim is to find economical tour-packages and, at the same time, to propose as many places as possible to be visited by tourists in a given geographical area, particularly on Langkawi Island. The proposed model considers only one starting point, with the tour starting and ending at an identified hotel. This study covers the 31 most attractive places on Langkawi Island from various categories of tourist attractions. Besides, time slots for lunch and dinner are included in the proposed itineraries, covering 11 popular restaurants around Langkawi Island. In developing the itinerary, the proposed heuristic approach considers a time window for each site (hotel/restaurant/place) so that it represents a real-world implementation. We present three itineraries with different time constraints (1-day, 2-day and 3-day tour-packages). The aim of the economic model is to minimize the tour-package cost as much as possible by considering the entrance fee of each visited place. We compare the proposed model with the uneconomic model from our previous study, which places no limit on cost and aims to maximize the number of places to be visited. Comparison between the uneconomic and economic itineraries has shown that the proposed model has successfully achieved the objective that
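A heuristic of this general kind can be sketched in a few lines. The greedy rule below is a deliberate simplification of what the abstract describes (it ignores travel times and time windows, and all site data are invented): it selects the cheapest attractions that still fit in the day's time budget, echoing the economic objective of minimizing entrance fees while covering as many places as possible:

```python
def build_itinerary(places, time_budget):
    """Greedy sketch: repeatedly add the place with the lowest entrance
    fee (ties broken by shorter visit duration) that still fits in the
    remaining time budget. Travel times and time windows are ignored."""
    chosen, remaining = [], time_budget
    for name, minutes, fee in sorted(places, key=lambda p: (p[2], p[1])):
        if minutes <= remaining:
            chosen.append(name)
            remaining -= minutes
    return chosen

# Hypothetical Langkawi-style attractions: (name, visit minutes, entrance fee)
sites = [
    ("beach", 90, 0.0), ("craft village", 60, 5.0),
    ("cable car", 120, 30.0), ("wildlife park", 150, 20.0),
    ("waterfall", 60, 0.0),
]
```

With a 240-minute budget this picks the two free sites plus the cheapest paid one; a fuller model would add the hotel start/end point, travel times between sites, and the lunch/dinner slots described in the abstract.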

  19. Dynamic modelling and PID loop control of an oil-injected screw compressor package

    Science.gov (United States)

    Poli, G. W.; Milligan, W. J.; McKenna, P.

    2017-08-01

    A significant amount of time is spent tuning the PID (Proportional, Integral and Derivative) control loops of a screw compressor package due to the unique characteristics of the system. Common mistakes incurred during the tuning of a PID control loop include improper PID algorithm selection and unsuitable tuning parameters of the system resulting in erratic and inefficient operation. This paper details the design and development of software that aims to dynamically model the operation of a single stage oil injected screw compressor package deployed in upstream oil and gas applications. The developed software will be used to assess and accurately tune PID control loops present on the screw compressor package employed in controlling the oil pressures, temperatures and gas pressures, in a bid to improve control of the operation of the screw compressor package. Other applications of the modelling software will include its use as an evaluation tool that can estimate compressor package performance during start up, shutdown and emergency shutdown processes. The paper first details the study into the fundamental operational characteristics of each of the components present on the API 619 screw compressor package and then discusses the creation of a dynamic screw compressor model within the MATLAB/Simulink software suite. The paper concludes by verifying and assessing the accuracy of the created compressor model using data collected from physical screw compressor packages.

  20. Software Validation via Model Animation

    Science.gov (United States)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
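
    The comparison step described above can be sketched as follows (a hedged illustration: the function names and the toy kinematic model are invented for this sketch, not taken from the PVS development). The idea is simply to evaluate the implementation and the formal model on the same inputs and check equality up to a tolerance:

```python
import math

def model_position(t, x0, v):
    # reference output of a (hypothetical) formal kinematic model
    return x0 + v * t

def impl_position(t, x0, v):
    # "implementation" under test: accumulates the motion in small steps,
    # so floating-point error can creep in
    pos, dt = x0, t / 1000.0
    for _ in range(1000):
        pos += v * dt
    return pos

def validate(cases, tol=1e-6):
    """Return the input cases whose implementation output disagrees
    with the model output by more than tol."""
    return [c for c in cases
            if not math.isclose(impl_position(*c), model_position(*c), abs_tol=tol)]

failures = validate([(1.0, 0.0, 5.0), (2.5, 1.0, -3.0)])
```

    An empty `failures` list is exactly the style of evidence the PVSio-based comparison provides: it raises confidence that the translation from model to code is faithful at the chosen tolerance.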

  1. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: models implicitly support a good separation of concerns; they are self-documenting, which improves understandability and maintainability; and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  2. Motorola Secure Software Development Model

    Directory of Open Access Journals (Sweden)

    Francis Mahendran

    2008-08-01

    Full Text Available In today's world, the key to meeting the demand for improved security is to implement repeatable processes that reliably deliver measurably improved security. While many organizations have announced efforts to institutionalize a secure software development process, there is little or no industry acceptance for a common process improvement framework for secure software development. Motorola has taken the initiative to develop such a framework, and plans to share this with the Software Engineering Institute for possible inclusion into its Capability Maturity Model Integration (CMMI®). This paper will go into the details of how Motorola is addressing this issue. The model that is being developed is designed as an extension of the existing CMMI structure. The assumption is that the audience will have a basic understanding of the SEI CMM® / CMMI® process framework. The paper will not describe implementation details of a security process model or improvement framework, but will address WHAT security practices are required for a company with many organizations operating at different maturity levels. It is left to the implementing organization to answer the HOW, WHEN, WHO and WHERE aspects. The paper will discuss how the model is being implemented in the Motorola Software Group.

  3. Bill2d -- a software package for classical two-dimensional Hamiltonian systems

    CERN Document Server

    Solanpää, Janne; Räsänen, Esa

    2016-01-01

    We present Bill2d, a modern and efficient C++ package for classical simulations of two-dimensional Hamiltonian systems. Bill2d can be used for various billiard and diffusion problems with one or more charged particles with interactions, different external potentials, an external magnetic field, periodic and open boundaries, etc. The software package can also calculate many key quantities in complex systems such as Poincaré sections, survival probabilities, and diffusion coefficients. While aiming at a large class of applicable systems, the code also strives for ease-of-use, efficiency, and modularity for the implementation of additional features. The package comes along with a user guide, a developer's manual, and a documentation of the application program interface (API).

  4. BILL2D - A software package for classical two-dimensional Hamiltonian systems

    Science.gov (United States)

    Solanpää, J.; Luukko, P. J. J.; Räsänen, E.

    2016-02-01

    We present BILL2D, a modern and efficient C++ package for classical simulations of two-dimensional Hamiltonian systems. BILL2D can be used for various billiard and diffusion problems with one or more charged particles with interactions, different external potentials, an external magnetic field, periodic and open boundaries, etc. The software package can also calculate many key quantities in complex systems such as Poincaré sections, survival probabilities, and diffusion coefficients. While aiming at a large class of applicable systems, the code also strives for ease-of-use, efficiency, and modularity for the implementation of additional features. The package comes along with a user guide, a developer's manual, and a documentation of the application program interface (API).
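
    The kind of billiard dynamics BILL2D handles can be illustrated with a toy analogue (an assumption-laden sketch, not BILL2D code): a single neutral particle in a unit box with specular wall reflections.

```python
def advance(pos, vel, dt):
    """One explicit time step in the unit box; walls reflect specularly."""
    x, y = pos[0] + vel[0] * dt, pos[1] + vel[1] * dt
    vx, vy = vel
    if not 0.0 <= x <= 1.0:                      # mirror the overshoot back inside
        vx, x = -vx, (-x if x < 0.0 else 2.0 - x)
    if not 0.0 <= y <= 1.0:
        vy, y = -vy, (-y if y < 0.0 else 2.0 - y)
    return (x, y), (vx, vy)

pos, vel = (0.2, 0.5), (0.7, 0.4)
for _ in range(10000):
    pos, vel = advance(pos, vel, dt=0.01)
```

    Speed is conserved exactly because reflections only flip velocity signs; quantities such as Poincaré sections or survival probabilities would be accumulated inside this loop.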

  5. Novel applications of the x-ray tracing software package McXtrace

    DEFF Research Database (Denmark)

    Bergbäck Knudsen, Erik; Nielsen, Martin Meedom; Haldrup, Kristoffer

    2014-01-01

    We will present examples of applying the X-ray tracing software package McXtrace to different kinds of X-ray scattering experiments. In particular we will focus on time-resolved experiments. Simulations of full-scale experiments are particularly useful for this kind, especially when...... some of the issues encountered. Generally, more than one or all of these effects are present at once. Simulations can in these cases be used to identify distinct footprints of such distortions and thus give the experimenter a means of deconvolving them from the signal. We will present a study...... of this kind, along with the newest developments of the McXtrace software package....

  6. A welding document management software package based on a Client/Server structure

    Institute of Scientific and Technical Information of China (English)

    魏艳红; 杨春利; 王敏

    2003-01-01

    According to the specifications for welding procedure qualification in ASME Section IX and the Chinese code JB 4708-2000, a software package for managing welding documents has been rebuilt. The new software package can be used in a Local Area Network (LAN) with 4 different levels of authority for different users, so that welding documents, including DWPSs (Designs for Welding Procedure Specifications), PQRs (Procedure Qualification Records) and WPSs (Welding Procedure Specifications), can be shared within a company. At the same time, the system provides users with various functions such as browsing, copying, editing, searching and printing records, and helps users decide whether a new PQR test is necessary according to the above codes. Furthermore, super users can also browse the history of record modifications and retrieve records when needed.

  7. High performance computing software package for multitemporal Remote-Sensing computations

    Directory of Open Access Journals (Sweden)

    Asaad Chahboun

    2010-10-01

    Full Text Available With the huge amount of satellite data now stored, multitemporal remote-sensing study is one of the most challenging fields of computer science. Multicore hardware support and multithreading can play an important role in speeding up algorithm computations. In the present paper, a software package called the Multitemporal Software Package for Satellite Remote Sensing data (MSPSRS) has been developed for the multitemporal treatment of satellite remote-sensing images in a standard format. For portability, the interface was developed using the Qt application framework and the core was developed as integrated C++ classes. MSPSRS can run under different operating systems (i.e., Linux, Mac OS X, Windows, Embedded Linux, Windows CE, etc.). Final benchmark results, using multiple remote-sensing biophysical indices, show a gain of up to 6X on a quad-core i7 personal computer.
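
    The parallel-map structure such a package relies on can be sketched as follows (a minimal illustration with a made-up pixel list, not MSPSRS code; note that in Python real speedups additionally require native, GIL-releasing kernels):

```python
from concurrent.futures import ThreadPoolExecutor

def ndvi(band_pair):
    """Normalized difference vegetation index for one (NIR, red) pixel pair."""
    nir, red = band_pair
    return (nir - red) / (nir + red) if (nir + red) != 0.0 else 0.0

# toy "image": three pixels of (NIR, red) reflectances
pixels = [(0.8, 0.1), (0.6, 0.3), (0.5, 0.5)]
with ThreadPoolExecutor(max_workers=4) as pool:
    index = list(pool.map(ndvi, pixels))
```

    The same `pool.map` pattern extends naturally to per-tile or per-date processing of a multitemporal image stack.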

  8. The GeoSteiner software package for computing Steiner trees in the plane

    DEFF Research Database (Denmark)

    Juhl, Daniel; Warme, David M.; Winter, Pawel;

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner approach, allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances from the 2000 study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base and the commercial GeoSteiner 4.0 code base.

  9. Application of Metafor Package in R Software

    Institute of Scientific and Technical Information of China (English)

    董圣杰; 曾宪涛; 郭毅

    2012-01-01

    R is free software and a powerful statistical tool; its commonly used meta-analysis packages include Metafor, Meta and Rmeta. The Metafor package provides functions for meta-analysis, including the analysis of continuous and categorical data, meta-regression, cumulative meta-analysis, and tests for funnel-plot asymmetry such as Begg's and Egger's tests for publication bias. The package can also draw various plots, such as forest plots, funnel plots, radial plots, L'Abbé plots and normal Q-Q plots. Mixed-effects models (involving single or multiple categorical and/or continuous moderators) can be fitted only with the Metafor package, which also implements advanced methods for testing model coefficients and obtaining confidence intervals. This article uses worked examples to introduce the detailed steps of performing meta-analysis with the Metafor package.

  10. Determination of stress-strain state of the wooden church log walls with software package

    Directory of Open Access Journals (Sweden)

    Chulkova Anastasia

    2016-01-01

    Full Text Available The restoration of architectural monuments is going on all over the world today. The main aim of restoration is to restore the stable functioning of building structures in a normal state. In this article, we use specialized software to determine the bearing capacity of the log walls of the Church of the Transfiguration on Kizhi Island. The research results show that determination of the stress-strain state with a software package is necessary for computing the bearing capacity, alongside field tests.

  11. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the GNU General Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light-emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  12. Behavior models for software architecture

    OpenAIRE

    Auguston, Mikhail

    2014-01-01

    Approved for public release; distribution is unlimited. Monterey Phoenix (MP) is an approach to formal software system architecture specification based on behavior models. Architecture modeling focuses not only on the activities and interactions within the system, but also on the interactions between the system and its environment, providing an abstraction for interaction specification. The behavior of the system is defined as a set...

  13. Affect 4.0: a free software package for implementing psychological and psychophysiological experiments.

    Science.gov (United States)

    Spruyt, Adriaan; Clarysse, Jeroen; Vansteenwegen, Debora; Baeyens, Frank; Hermans, Dirk

    2010-01-01

    We describe Affect 4.0, a user-friendly software package for implementing psychological and psychophysiological experiments. Affect 4.0 can be used to present visual, acoustic, and/or tactile stimuli in highly complex (i.e., semirandomized and response-contingent) sequences. Affect 4.0 is capable of registering response latencies and analog behavioral input with millisecond accuracy. Affect 4.0 is available free of charge.

  14. Simulation of combustion products flow in the Laval nozzle in the software package SIFIN

    Science.gov (United States)

    Alhussan, K. A.; Teterev, A. V.

    2017-07-01

    The specialized multifunctional software package SIFIN (Simulation of Internal Flow In the Nozzle) has been developed for the numerical simulation of the flow of combustion products in a Laval nozzle. It allows the user to design different nozzle profiles, to simulate the flow of multicomponent media with energy release from combustion, to study the effect of swirl in the flow of combustion products on nozzle performance, and to investigate the character of the gas-jet discharge at varying pressure ratios.
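
    Although SIFIN's internals are not described here, the quasi-one-dimensional relation that any Laval-nozzle analysis rests on can be sketched (standard isentropic perfect-gas relations; the bisection solver and parameter values below are illustrative, not from SIFIN):

```python
import math

def area_ratio(M, g=1.4):
    """Isentropic A/A* as a function of Mach number for specific-heat ratio g."""
    return (1.0 / M) * ((2.0 / (g + 1.0)) * (1.0 + 0.5 * (g - 1.0) * M * M)) \
           ** ((g + 1.0) / (2.0 * (g - 1.0)))

def supersonic_mach(ar, g=1.4):
    """Bisection on the supersonic branch for the exit Mach of a given A/A*."""
    lo, hi = 1.0, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if area_ratio(mid, g) < ar:   # A/A* grows monotonically for M > 1
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

exit_mach = supersonic_mach(2.0)      # exit Mach for an area ratio of 2
```

    For an area ratio of 2 and g = 1.4 the supersonic solution is Mach ≈ 2.2, consistent with standard isentropic-flow tables.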

  15. Software Verification and Validation for Commercial Statistical Packages Utilized by the Statistical Consulting Section of SRTC

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.B.

    2001-01-16

    The purpose of this report is to provide software verification and validation (v and v) for the statistical packages utilized by the Statistical Consulting Section (SCS) of the Savannah River Technology Center (SRTC). The need for this v and v stems from the requirements of the Quality Assurance (QA) programs that are frequently applicable to the work conducted by SCS. This document is designed to comply with software QA requirements specified in the 1Q Manual Quality Assurance Procedure 20-1, Revision 6. Revision 1 of this QA plan adds JMP Version 4 to the family of (commercially-available) statistical tools utilized by SCS. JMP Version 3.2.2 is maintained as a support option due to features unique to this version of JMP that have not as yet been incorporated into Version 4. SCS documents that include JMP output should provide a clear indication of the version or versions of JMP that were used. The IBM Personal Computer 300PL and 300XL are both Pentium II based desktops. Therefore, the software verification and validation in this report is valid interchangeably between both platforms. As new computing platforms, statistical packages, or revisions to existing packages are introduced into the Statistical Consulting Section, the appropriate problems from this report are to be re-evaluated, and this report is to be revised to address their verification and validation.

  16. Deconvolution Estimation in Measurement Error Models: The R Package decon

    Directory of Open Access Journals (Sweden)

    Xiao-Feng Wang

    2011-03-01

    Full Text Available Data from many scientific areas often come with measurement error. Density or distribution function estimation from contaminated data and nonparametric regression with errors in variables are two important topics in measurement error models. In this paper, we present a new software package decon for R, which contains a collection of functions that use the deconvolution kernel methods to deal with the measurement error problems. The functions allow the errors to be either homoscedastic or heteroscedastic. To make the deconvolution estimators computationally more efficient in R, we adapt the fast Fourier transform algorithm for density estimation with error-free data to the deconvolution kernel estimation. We discuss the practical selection of the smoothing parameter in deconvolution methods and illustrate the use of the package through both simulated and real examples.
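
    The decon package itself provides nonparametric deconvolution estimators, but the core measurement-error phenomenon can be shown in a toy linear analogue (illustrative Python, not part of the R package): a predictor observed with error attenuates the regression slope by the reliability ratio, which the classical correction undoes.

```python
import random
import statistics

random.seed(0)
n = 20000
beta, sigma_x, sigma_u = 2.0, 1.0, 0.5   # true slope, predictor SD, error SD
x = [random.gauss(0.0, sigma_x) for _ in range(n)]
w = [xi + random.gauss(0.0, sigma_u) for xi in x]   # error-contaminated predictor
y = [beta * xi + random.gauss(0.0, 0.1) for xi in x]

def slope(pred, resp):
    """Ordinary least-squares slope of resp on pred."""
    mp, mr = statistics.fmean(pred), statistics.fmean(resp)
    cov = sum((p - mp) * (r - mr) for p, r in zip(pred, resp))
    var = sum((p - mp) ** 2 for p in pred)
    return cov / var

naive = slope(w, y)                                  # attenuated toward zero
reliability = sigma_x ** 2 / (sigma_x ** 2 + sigma_u ** 2)
corrected = naive / reliability                      # classical attenuation correction
```

    Here `naive` lands near beta times the reliability ratio (about 1.6 rather than 2.0); the deconvolution kernel methods in decon address the analogous bias in density estimation and nonparametric regression.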

  17. SUPCRTBL: A revised and extended thermodynamic dataset and software package of SUPCRT92

    Science.gov (United States)

    Zimmer, Kurt; Zhang, Yilun; Lu, Peng; Chen, Yanyan; Zhang, Guanru; Dalkilic, Mehmet; Zhu, Chen

    2016-05-01

    The computer-enabled thermodynamic database associated with SUPCRT92 (Johnson et al., 1992) enables the calculation of the standard molal thermodynamic properties of minerals, gases, aqueous species, and reactions for a wide range of temperatures and pressures. However, new data on the thermodynamic properties of both aqueous species and minerals have become available since the database's initial release in 1992 and its subsequent updates. In light of these developments, we have expanded SUPCRT92's thermodynamic dataset and have modified the accompanying computer code for thermodynamic calculations by using newly available properties. The modifications in our new version include: (1) updating the standard state thermodynamic properties for mineral end-members with properties from Holland and Powell (2011) to improve the study of metamorphic petrology and economic geology; (2) adding As-acid, As-metal aqueous species, and As-bearing minerals to improve the study of environmental geology; (3) updating properties for Al-bearing species, SiO2° (aq) and HSiO3- , boehmite, gibbsite, and dawsonite for modeling geological carbon sequestration. The new thermodynamic dataset and the modified SUPCRT92 program were implemented in a software package called SUPCRTBL, which is available online at

  18. Generic domain models in software engineering

    Science.gov (United States)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  19. Modularized seismic full waveform inversion based on waveform sensitivity kernels - The software package ASKI

    Science.gov (United States)

    Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel

    2015-04-01

    We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different specific source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are held completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimize the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D-unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process.
The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity
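
    The Gauss-Newton model update mentioned above can be illustrated with a one-parameter toy problem (illustrative code, unrelated to ASKI's actual implementation): fitting a decay rate to synthetic data by iterating the normal-equations update.

```python
import math

t = [0.0, 0.5, 1.0, 1.5, 2.0]
k_true = 1.3
data = [math.exp(-k_true * ti) for ti in t]      # synthetic noise-free data

k = 0.5                                          # starting model
for _ in range(20):
    resid = [di - math.exp(-k * ti) for ti, di in zip(t, data)]
    jac = [ti * math.exp(-k * ti) for ti in t]   # d(residual)/dk
    # Gauss-Newton step: k <- k - (J^T J)^-1 J^T r
    k -= sum(j * r for j, r in zip(jac, resid)) / sum(j * j for j in jac)
```

    In the full waveform case the single parameter `k` becomes the gridded earth model and `jac` the matrix of sensitivity kernels, but the least-squares update has the same shape.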

  20. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    Science.gov (United States)

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
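
    For reference, the Bland-Altman computation used alongside the ICCs reduces to a bias and 95% limits of agreement; a minimal sketch with made-up perfusion values (not the study data):

```python
import statistics

def bland_altman(m1, m2):
    """Mean difference (bias) and 95% limits of agreement between two raters."""
    diffs = [a - b for a, b in zip(m1, m2)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical arterial-perfusion readings from two software packages
bias, (lo, hi) = bland_altman([10.0, 12.0, 11.0, 13.0, 9.0],
                              [11.0, 11.0, 12.0, 12.0, 10.0])
```

    Wide limits of agreement relative to the clinical range, like the modest intrareader ICCs reported above, would indicate that the two packages cannot be used interchangeably.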

  1. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  2. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high-resolution AFM imaging. A high yield of DNA dolphins...... was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license. The open architecture of the editor makes it easy for the scientific community...... to contribute new tools and functionalities. Documentation, tutorials and software will be made available online....

  3. SWISTRACK - AN OPEN SOURCE, SOFTWARE PACKAGE APPLICABLE TO TRACKING OF FISH LOCOMOTION AND BEHAVIOUR

    DEFF Research Database (Denmark)

    Steffensen, John Fleng

    2010-01-01

    ...... including swimming speed, acceleration and directionality of movements, as well as the examination of locomotory patterns during swimming. Swistrack, a free and downloadable software package (available from www.sourceforge.com), is widely used for tracking robots, humans and other animals. Accordingly, Swistrack can be easily adopted for the tracking of fish. Benefits associated with the free software include: • Contrast- or marker-based tracking, enabling tracking of either the whole animal or tagged marks placed upon the animal • The ability to track multiple tags placed upon an individual animal • Highly effective background-subtraction algorithms and filters ensuring smooth tracking of fish • Application of tags of different colour enables the software to track multiple fish without the problem of track exchange between individuals • Low processing requirements enable tracking in real time • Further......

  4. A MATLAB Package for Markov Chain Monte Carlo with a Multi-Unidimensional IRT Model

    Directory of Open Access Journals (Sweden)

    Yanyan Sheng

    2008-11-01

    Full Text Available Unidimensional item response theory (IRT models are useful when each item is designed to measure some facet of a unified latent trait. In practical applications, items are not necessarily measuring the same underlying trait, and hence the more general multi-unidimensional model should be considered. This paper provides the requisite information and description of software that implements the Gibbs sampler for such models with two item parameters and a normal ogive form. The software developed is written in the MATLAB package IRTmu2no. The package is flexible enough to allow a user the choice to simulate binary response data with multiple dimensions, set the number of total or burn-in iterations, specify starting values or prior distributions for model parameters, check convergence of the Markov chain, as well as obtain Bayesian fit statistics. Illustrative examples are provided to demonstrate and validate the use of the software package.
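
    The data-simulation facility described can be sketched for a single item under the two-parameter normal-ogive model (stand-alone illustrative code, not the IRTmu2no package; the Gibbs sampler itself is omitted):

```python
import math
import random

def p_correct(theta, a, b):
    """Normal-ogive item response probability Phi(a * (theta - b))."""
    return 0.5 * (1.0 + math.erf(a * (theta - b) / math.sqrt(2.0)))

random.seed(1)
a, b = 1.2, 0.3                                   # discrimination and difficulty
thetas = [random.gauss(0.0, 1.0) for _ in range(5000)]
responses = [1 if random.random() < p_correct(t, a, b) else 0 for t in thetas]
```

    A Gibbs sampler would then alternate draws of augmented normal variables, abilities and item parameters from their full conditionals; the simulated `responses` are the kind of binary data it takes as input.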

  5. Use of GEANT4 programme package for radiation technological modeling problems

    CERN Document Server

    Bratchenko, M I

    2001-01-01

    The results of pilot computer experiments directed toward the application of the Geant4 package for modeling gamma-radiation dose distributions in homogeneous phantoms are presented. We demonstrate the potential of the package itself and of the developed add-on software modules for the calculation of three-dimensional absorbed-dose distributions, taking into account the design and geometry of model irradiators and phantoms.

  6. Comparative Analyses of MIRT Models and Software (BMIRT and flexMIRT)

    Science.gov (United States)

    Yavuz, Guler; Hambleton, Ronald K.

    2017-01-01

    Application of MIRT modeling procedures is dependent on the quality of parameter estimates provided by the estimation software and techniques used. This study investigated model parameter recovery of two popular MIRT packages, BMIRT and flexMIRT, under some common measurement conditions. These packages were specifically selected to investigate the…

  7. Short-term demand forecasting in an electric power system using the ITSM software package; Zastosowanie pakietu ITSM do krotkoterminowej prognozy mocy w systemie energetycznym

    Energy Technology Data Exchange (ETDEWEB)

    Mazurek, P. [Instytut Automatyki Systemow Energetycznych, Wroclaw (Poland)

    1995-07-01

    The application of the ITSM software package is presented together with the iterative time-series modeling methodology used. The 24-hour load forecasts obtained for the National Power System were calculated for 200 days with acceptable accuracy. The time required for input data analysis and a single forecast calculation was approximately 20 minutes on a standard IBM PC. (author). 5 refs., 4 figs., 2 tabs.
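
A minimal stand-in for the kind of model such a package fits (the real ITSM methodology is iterative ARMA modeling; this sketch fits a plain AR(p) by ordinary least squares on synthetic hourly load data):

```python
import numpy as np

def fit_ar(series, p):
    """Fit an AR(p) model by ordinary least squares: regress each value
    on its p predecessors plus an intercept."""
    # column k holds the series lagged by k+1 steps
    X = np.column_stack([series[p - k - 1 : len(series) - k - 1] for k in range(p)])
    y = series[p:]
    coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y, rcond=None)
    return coef  # [intercept, phi_1, ..., phi_p]

def forecast_next(series, coef):
    """One-step-ahead forecast from the fitted AR coefficients."""
    p = len(coef) - 1
    lags = series[-1 : -p - 1 : -1]   # most recent p values, newest first
    return coef[0] + lags @ coef[1:]

# synthetic hourly "load" with a 24-hour cycle plus noise, 60 days long
rng = np.random.default_rng(1)
t = np.arange(24 * 60)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)
coef = fit_ar(load, p=24)
pred = forecast_next(load, coef)
```

With a lag order spanning the daily cycle (p = 24), the one-step forecast tracks the seasonal pattern closely, which is the essence of short-term load forecasting.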

  8. QED v 1.0: a software package for quantitative electron diffraction data treatment.

    Science.gov (United States)

    Belletti, D; Calestani, G; Gemmi, M; Migliori, A

    2000-03-01

    A new software package for quantitative electron diffraction data treatment of unknown structures is described. No a priori information is required by the package, which performs in successive steps the 2-D indexing of digitised diffraction patterns, the extraction of the intensities of the collected reflections, and the 3-D indexing of all recorded patterns, giving as results the lattice parameters of the investigated structure and a series of data files (one for each diffraction pattern) containing the measured intensities and the relative e.s.d.s of the 3-D indexed reflections. The software package is mainly conceived for the treatment of diffraction patterns taken with a Gatan CCD Slow-Scan Camera, but it can also deal with generic digitised plates. The program is designed to extract intensity data suitable for structure solution techniques in electron crystallography. The integration routine is optimised for a correct background evaluation, a necessary condition for dealing with weak spots of irregular shape and intensities just above the background.

  9. Flexible Rasch Mixture Models with Package psychomix

    Directory of Open Access Journals (Sweden)

    Hannah Frick

    2012-05-01

    Measurement invariance is an important assumption in the Rasch model, and mixture models constitute a flexible way of checking for a violation of this assumption by detecting unobserved heterogeneity in item response data. Here, a general class of Rasch mixture models is established and implemented in R, using conditional maximum likelihood estimation of the item parameters (given the raw scores) along with flexible specification of two model building blocks: (1) mixture weights for the unobserved classes can be treated as model parameters or based on covariates in a concomitant variable model; (2) the distribution of raw score probabilities can be parametrized in two possible ways, either using a saturated model or a specification through mean and variance. The function raschmix() in the R package psychomix provides these models, leveraging the general infrastructure for fitting mixture models in the flexmix package. Usage of the function and its associated methods is illustrated on artificial data as well as empirical data from a study of verbally aggressive behavior.

  10. dlmap: An R Package for Mixed Model QTL and Association Analysis

    Directory of Open Access Journals (Sweden)

    B. Emma Huang

    2012-08-01

    dlmap is a software package capable of mapping quantitative trait loci (QTL) in a variety of genetic studies. Unlike most other QTL mapping packages, dlmap is built on a linear mixed model platform, and thus can simultaneously handle multiple sources of genetic and environmental variation. Furthermore, it can accommodate both experimental crosses and association mapping populations within a versatile modeling framework. The software implements a mapping algorithm with separate detection and localization stages in a user-friendly manner. It accepts data in various common formats, has a flexible modeling environment, and summarizes results both graphically and numerically.

  11. MedLinac2: a GEANT4 based software package for radiotherapy

    Directory of Open Access Journals (Sweden)

    Barbara Caccia

    2010-06-01

    Dose distribution evaluation in oncological radiotherapy treatments is an outstanding problem that requires sophisticated computing technologies to optimize the clinical results (i.e. increase the dose to the tumour and reduce the dose to the healthy tissues). Nowadays, dose calculation algorithms based on the Monte Carlo method are generally regarded as the most accurate tools for radiotherapy. The flexibility of the GEANT4 (GEometry ANd Tracking) Monte Carlo particle transport simulation code allows a wide range of applications, from high-energy to medical physics. In order to disseminate and encourage the use of the Monte Carlo method in oncological radiotherapy, a software package based on the GEANT4 Monte Carlo toolkit has been developed. The developed package (MedLinac2) allows a linear accelerator for radiotherapy to be simulated in a flexible way and the resulting dose distributions to be evaluated.

  12. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    Science.gov (United States)

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016. The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and maximal velocity during the bench press. Twenty-one healthy males (21 ± 1 years) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic systems (a linear position transducer and semi-automated tracking software). High Pearson's correlation coefficients for MPV and Vmax between the devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from the video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of MPV and Vmax at different loads showed a negative trend, indicating that the video analysis yielded higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy-to-use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in the velocity of the training load in resistance training, which may be important for the prescription and monitoring of training programmes.
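
The Bland-Altman analysis used here is easy to reproduce; a minimal sketch with hypothetical paired velocity readings (the numbers below are invented for illustration, not the study's data):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for two measurement methods:
    mean bias and 95% limits of agreement (bias +/- 1.96 SD of the
    pairwise differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)          # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical paired mean propulsive velocities (m/s):
# video analysis vs. linear position transducer
video      = [0.52, 0.48, 0.61, 0.55, 0.70, 0.66, 0.59, 0.45]
transducer = [0.50, 0.47, 0.58, 0.54, 0.67, 0.64, 0.57, 0.44]
bias, lo, hi = bland_altman(video, transducer)
```

A positive bias, as in this made-up example, corresponds to the study's observation that the video analysis tended to read higher than the linear transducer.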

  13. Online Rule Generation Software Process Model

    National Research Council Canada - National Science Library

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation.

  14. A parallel-pipelining software process model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Software process is a framework for the effective and timely delivery of a software system. The framework plays a crucial role in software success. However, the development of large-scale software still faces the crisis of high risks, low quality, high costs and long cycle times. This paper proposes a three-phase parallel-pipelining software process model for improving speed and productivity, and for reducing software costs and risks without sacrificing software quality. In this model, two strategies are presented. One strategy, based on subsystem-cost priority, is used to prevent the waste of software development cost and to reduce software complexity as well; the other strategy, used for balancing subsystem complexity, is designed to reduce software complexity in the later development stages. Moreover, the proposed function-detailed and workload-simplified subsystem pipelining software process model presents much higher parallelism than the concurrent incremental model. Finally, component-based product line technology not only ensures software quality and further reduces cycle time, software costs and software risks, but also sufficiently and rationally utilizes previous software product resources and enhances the competitiveness of software development organizations.

  15. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    Energy Technology Data Exchange (ETDEWEB)

    Ashraf, H.; Bach, K.S.; Hansen, H. [Copenhagen University, Department of Radiology, Gentofte Hospital, Hellerup (Denmark); Hoop, B. de [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Shaker, S.B.; Dirksen, A. [Copenhagen University, Department of Respiratory Medicine, Gentofte Hospital, Hellerup (Denmark); Prokop, M. [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Radboud University Nijmegen, Department of Radiology, Nijmegen (Netherlands); Pedersen, J.H. [Copenhagen University, Department of Cardiothoracic Surgery RT, Rigshospitalet, Copenhagen (Denmark)

    2010-08-15

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study set of 545 nodules. Nodules were independently double-read by two readers using commercially available volumetry software that offers three different analysis algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of the nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen-detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, the reproducibility of volumetric measurements deteriorates substantially when different algorithms are used. Even within the same software package, it is crucial to choose identical parameters for follow-up. (orig.)

  16. The Software Package PAOLAC: an embedment of the analytical code PAOLA within the CAOS problem-solving environment

    Science.gov (United States)

    Carbillet, Marcel; Jolissaint, Laurent; Maire, Anne-Lise

    We present the Software Package PAOLAC ("PAOLA within CAOS") in its first distributed version. This new numerical simulation tool is an embedment of the analytical adaptive optics simulation code PAOLA ("Performance of Adaptive Optics for Large (or Little) Apertures") within the CAOS problem-solving environment. The main goal of this new tool is to allow an easier and direct comparison between studies performed with the analytical open-loop code PAOLA and studies performed with the end-to-end closed-loop Software Package CAOS ("Code for Adaptive Optics Systems"), with the final aim of better understanding how to take advantage of the two approaches: one analytical, allowing extremely quick results on a wide range of cases, and the other extremely detailed, but with computational and memory costs that can be substantial. The practical implementation of this embedment is briefly described, showing that it does not affect any aspect of the original code, which is simply called directly from the CAOS global graphical interface through ad hoc modules. A comparison between end-to-end modelling and analytical modelling is hence also initiated, within the specific framework of wide-field adaptive optics at Dome C, Antarctica.

  17. Big Science, Small-Budget Space Experiment Package Aka MISSE-5: A Hardware And Software Perspective

    Science.gov (United States)

    Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan

    2007-01-01

    Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.

  18. MNP: R Package for Fitting the Multinomial Probit Model

    Directory of Open Access Journals (Sweden)

    Kosuke Imai

    2005-05-01

    MNP is a publicly available R package that fits the Bayesian multinomial probit model via Markov chain Monte Carlo. The multinomial probit model is often used to analyze the discrete choices made by individuals recorded in survey data. Examples where the multinomial probit model may be useful include the analysis of product choice by consumers in market research and the analysis of candidate or party choice by voters in electoral studies. The MNP software can also fit the model with different choice sets for each individual, and with complete or partial individual choice orderings of the available alternatives from the choice set. The estimation is based on the efficient marginal data augmentation algorithm developed by Imai and van Dyk (2005).
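
The flavor of the estimation can be conveyed with a sketch of the simpler binary case, using latent-variable Gibbs sampling in the style of Albert and Chib (1993); this is a toy relative of, not a reimplementation of, MNP's marginal data augmentation algorithm for the multinomial model:

```python
import numpy as np

def probit_gibbs(X, y, n_iter=800, burn=200, seed=0):
    """Gibbs sampler for a binary probit model via latent-variable data
    augmentation: alternate drawing truncated-normal latents z and the
    coefficient vector beta under a flat prior."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    chol = np.linalg.cholesky(XtX_inv)   # for sampling beta | z
    beta = np.zeros(k)
    z = np.zeros(n)
    draws = []
    for it in range(n_iter):
        mean = X @ beta
        for i in range(n):
            # draw z_i ~ N(mean_i, 1) truncated to match the sign of y_i
            # (simple rejection sampling; adequate while |mean_i| is moderate)
            while True:
                zi = rng.normal(mean[i], 1.0)
                if (zi > 0.0) == (y[i] == 1):
                    z[i] = zi
                    break
        # beta | z ~ N((X'X)^-1 X'z, (X'X)^-1) under a flat prior
        beta = XtX_inv @ (X.T @ z) + chol @ rng.standard_normal(k)
        if it >= burn:
            draws.append(beta.copy())
    return np.mean(draws, axis=0)

# synthetic data with known coefficients [0.3, 1.0]
rng = np.random.default_rng(42)
X = np.column_stack([np.ones(200), rng.standard_normal(200)])
y = (X @ np.array([0.3, 1.0]) + rng.standard_normal(200) > 0).astype(int)
beta_hat = probit_gibbs(X, y)   # posterior mean, near the true values
```

The multinomial case adds a latent utility per alternative and an identification constraint on the covariance, which is where the marginal data augmentation of Imai and van Dyk comes in.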

  19. Volvo Logistics Corporation Returnable Packaging System : a model for analysing cost savings when switching packaging system

    OpenAIRE

    2008-01-01

    This thesis is a study analysing the costs affected by packaging in a producing industry. The purpose is to develop a model that will calculate and present the possible cost savings for a customer of using Volvo Logistics Corporation's (VLC's) returnable packaging instead of other packaging solutions. The thesis is based on qualitative data gained from both theoretical and empirical studies. The methodology for gaining information has been to study theoretical sources such as course literature a...

  20. Diffusion tensor imaging of the median nerve: intra-, inter-reader agreement, and agreement between two software packages.

    Science.gov (United States)

    Guggenberger, Roman; Nanz, Daniel; Puippe, Gilbert; Rufibach, Kaspar; White, Lawrence M; Sussman, Marshall S; Andreisek, Gustav

    2012-08-01

    To assess intra-reader and inter-reader agreement, and the agreement between two software packages, for magnetic resonance diffusion tensor imaging (DTI) measurements of the median nerve. Fifteen healthy volunteers (seven men, eight women; mean age, 31.2 years) underwent DTI of both wrists at 1.5 T. Fractional anisotropy (FA) and the apparent diffusion coefficient (ADC) of the median nerve were measured by three readers using two commonly used software packages. Measurements were repeated by two readers after 6 weeks. Intraclass correlation coefficients (ICC) and Bland-Altman analysis were used for statistical analysis. ICCs for intra-reader agreement ranged from 0.87 to 0.99, for inter-reader agreement from 0.62 to 0.83, and between the two software packages from 0.63 to 0.82. Bland-Altman analysis showed no differences for intra-reader and inter-reader agreement or for agreement between software packages. Intra-reader, inter-reader, and between-software agreement for DTI measurements of the median nerve was moderate to substantial, suggesting that user- and software-dependent factors contribute little to the variance in DTI measurements.

  1. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    Science.gov (United States)

    Blaurock, Carl

    2009-01-01

    densities); converting PSDs to order analysis data; extracting harmonics; initializing and simultaneously tuning a harmonic model and a wheel structural model; initializing and tuning a broadband model; and verifying the harmonic/broadband/structural model against the measurement data. Functional operation is through a MATLAB GUI that loads test data, performs the various analyses, plots evaluation data for assessment and refinement of analysis parameters, and exports the data to documentation or downstream analysis code. The harmonic models are defined as specified functions of frequency, typically speed-squared. The reaction wheel structural model is realized as mass, damping, and stiffness matrices (typically from a finite element analysis package) with the addition of a gyroscopic forcing matrix. The broadband noise model is realized as a set of speed-dependent filters. The tuning of the combined model is performed using nonlinear least squares techniques. RWDMES is implemented as a MATLAB toolbox comprising the Fit Manager for performing the model extraction, Data Manager for managing input data and output models, the Gyro Manager for modifying wheel structural models, and the Harmonic Editor for evaluating and tuning harmonic models. This software was validated using data from Goodrich E wheels, and from GSFC Lunar Reconnaissance Orbiter (LRO) wheels. The validation testing proved that RWDMES has the capability to extract accurate disturbance models from flight reaction wheels with minimal user effort.

  2. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there was no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software such as SPSS for Windows.

  3. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there was no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software such as SPSS for Windows.

  4. Computer controlled cryo-electron microscopy--TOM² a software package for high-throughput applications.

    Science.gov (United States)

    Korinek, Andreas; Beck, Florian; Baumeister, Wolfgang; Nickell, Stephan; Plitzko, Jürgen M

    2011-09-01

    Automated data acquisition expedites structural studies by electron microscopy and allows data sets of unprecedented size and consistent quality to be collected. In electron tomography it greatly facilitates the systematic exploration of large cellular landscapes, and in single particle analysis it allows the generation of data sets for an exhaustive classification of coexisting molecular states. Here we describe a novel software philosophy and architecture that can be used for a great variety of automated data acquisition scenarios. Based on our original software package TOM, the new TOM² package has been designed in an object-oriented way. The whole program can be seen as a collection of self-sufficient modules with defined relationships acting in a concerted manner. It subdivides data acquisition into a set of hierarchical tasks, bonding data structures and the operations to be performed on them tightly together. To demonstrate its capacity for high-throughput data acquisition, it has been used in conjunction with instrumentation combining the latest technological achievements in electron optics, cryogenics and robotics. Its performance is demonstrated with a single particle analysis case study and with a batch tomography application.

  5. [Development of the software package of the nuclear medicine data processor for education and research].

    Science.gov (United States)

    Maeda, Hisato; Yamaki, Noriyasu; Azuma, Makoto

    2012-01-01

    The objective of this study was to develop a personal-computer-based nuclear medicine data processor for education and research in the field of nuclear medicine. We call this software package "Prominence Processor" (PP). Microsoft Windows was used as the operating system of the PP, which has a 1024 × 768 image resolution and 63 applications classified into 6 groups. The accuracy of many of the PP applications was examined. For example, for the FBP reconstruction application, there was visually no difference in image quality when comparing the two SPECT images obtained from the PP and from the GMS-5500A (Toshiba). Moreover, the normalized MSE between the two images was 0.0003. Therefore the high processing accuracy of the FBP reconstruction application, as well as of the other applications, was demonstrated. The PP can be used in an arbitrary place if the software package is installed on a notebook PC. Therefore the PP is now widely used for lectures and practice in educational settings, and for the research of radiological technologists in clinical settings.
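
The normalized MSE figure quoted above is straightforward to compute; a minimal sketch (the images here are synthetic arrays, not SPECT data):

```python
import numpy as np

def normalized_mse(ref, test):
    """Normalized mean squared error between a reference image and a test
    image: sum((ref - test)^2) / sum(ref^2). Values near zero indicate the
    two reconstructions are practically identical."""
    ref = np.asarray(ref, float)
    test = np.asarray(test, float)
    return float(((ref - test) ** 2).sum() / (ref ** 2).sum())

a = np.ones((64, 64))      # reference "image"
b = a + 0.01               # test "image" with a uniform 1% offset
nmse = normalized_mse(a, b)   # (0.01^2 * N) / (1^2 * N) = 1e-4
```

By this measure, the reported value of 0.0003 between the PP and GMS-5500A reconstructions corresponds to a sub-percent average pixel discrepancy.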

  6. Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software

    Directory of Open Access Journals (Sweden)

    Abel Soares Siqueira

    2016-04-01

    A very important area of research in the field of mathematical optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information such as CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore, tools to better process and understand optimization benchmark data have been developed. One of the most widespread is the performance profile graphic proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open source software that creates performance profile graphics. This software produces graphics in PDF using LaTeX with the PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plot libraries. It is implemented in Python 3 with support for internationalization, and is under the General Public License Version 3 (GPLv3).
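
The performance profile itself is a simple computation; a minimal sketch of the Dolan and Moré construction that perprof-py plots (the toy timing matrix is invented):

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profiles. T[p, s] is the cost (e.g. CPU time)
    of solver s on problem p, with np.inf marking a failure. Returns
    rho[s, t]: the fraction of problems solver s solves within a factor
    taus[t] of the best solver on that problem."""
    T = np.asarray(T, float)
    best = T.min(axis=1, keepdims=True)   # best cost per problem
    ratios = T / best                     # performance ratios r_{p,s}
    taus = np.asarray(taus, float)
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus]).T

T = np.array([[1.0, 2.0],
              [3.0, 1.5],
              [2.0, np.inf]])   # solver B fails on problem 3
rho = performance_profile(T, taus=[1.0, 2.0, 4.0])
# rho[s, 0] is the fraction of problems where solver s was fastest;
# the curve's limit as tau grows is the solver's overall success rate.
```

Plotting each row of `rho` against `taus` (usually on a log scale) gives exactly the stepped curves such tools render.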

  7. SWOT Analysis of Software Development Process Models

    Directory of Open Access Journals (Sweden)

    Ashish B. Sasankar

    2011-09-01

    Software worth billions of dollars has gone to waste in the past due to a lack of proper techniques for developing software, resulting in a software crisis. Historically, the process of software development has played an important role in software engineering. A number of life-cycle models have been developed in the last three decades. This paper is an attempt to analyze software process models using the SWOT method. The objective is to identify the strengths, weaknesses, opportunities and threats of the Waterfall, Spiral, Prototype and other models.

  8. GMATA: an integrated software package for genome-scale SSR mining, marker development and viewing

    Directory of Open Access Journals (Sweden)

    Xuewen Wang

    2016-09-01

    Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNAs that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed the novel software package Genome-wide Microsatellite Analyzing Tool Package (GMATA), which integrates SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability, and enables the simultaneous display of SSR markers with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allows it to perform faster calculations and provide more accurate results than existing tools. Our package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requiring only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrated the application of GMATA in plant genomes and reveal a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the dimer GA/TC, the A/T monomer and the GCG/CGC trimer, rather than reflecting the rich G/C content of the DNA sequence. We also revealed that SSR count is linear in chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar.
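
The mining step can be illustrated with a toy regex-based SSR finder (a sketch only; GMATA's actual algorithms are far more efficient and feature-rich):

```python
import re

def find_ssrs(seq, min_repeats=5):
    """Minimal SSR (microsatellite) miner: find tandem repeats of a
    1-3 bp motif occurring at least `min_repeats` times in a row.
    Returns (start, motif, repeat_count) tuples."""
    # group 1 captures the motif (shortest first); \1{n,} demands at
    # least min_repeats-1 further tandem copies of that same motif
    pattern = re.compile(r"([ACGT]{1,3}?)\1{%d,}" % (min_repeats - 1))
    return [(m.start(), m.group(1), len(m.group(0)) // len(m.group(1)))
            for m in pattern.finditer(seq.upper())]

seq = "ccgTATATATATATgggAGCAGCAGCAGCAGCta"
ssrs = find_ssrs(seq)   # [(3, 'TA', 5), (17, 'AGC', 5)]
```

A genome-scale tool additionally canonicalizes rotated/complemented motifs (e.g. AGC, GCA and CAG count as one class), which this sketch does not attempt.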

  9. GMATA: An Integrated Software Package for Genome-Scale SSR Mining, Marker Development and Viewing

    Science.gov (United States)

    Wang, Xuewen; Wang, Le

    2016-01-01

    Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNAs that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed the novel software package Genome-wide Microsatellite Analyzing Tool Package (GMATA), which integrates SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability, and enables the simultaneous display of SSR markers with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allows it to perform faster calculations and provide more accurate results than existing tools. Our package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requiring only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrated the application of GMATA in plant genomes and reveal a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the dimer GA/TC, the A/T monomer and the GCG/CGC trimer, rather than reflecting the rich G/C content of the DNA sequence. We also revealed that SSR count is linear in chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar. PMID:27679641

  10. Software Assurance Using Structured Assurance Case Models.

    Science.gov (United States)

    Rhodes, Thomas; Boland, Frederick; Fong, Elizabeth; Kass, Michael

    2010-01-01

    Software assurance is an important part of the software development process to reduce risks and ensure that the software is dependable and trustworthy. Software defects and weaknesses can often lead to software errors and failures and to exploitation by malicious users. Testing, certification and accreditation have been traditionally used in the software assurance process to attempt to improve software trustworthiness. In this paper, we examine a methodology known as a structured assurance model, which has been widely used for assuring system safety, for its potential application to software assurance. We describe the structured assurance model and examine its application and use for software assurance. We identify strengths and weaknesses of this approach and suggest areas for further investigation and testing.

  11. A new version of Scilab software package for the study of dynamical systems

    Science.gov (United States)

    Bordeianu, C. C.; Felea, D.; Beşliu, C.; Jipa, Al.; Grossu, I. V.

    2009-11-01

    This work presents a new version of a software package for the study of chaotic flows, maps and fractals [1]. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, the Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamical systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and the Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability of users inserting their own ODE or iterative equations.
    New version program summary
    Program title: Chaos v2.0
    Catalogue identifier: AEAP_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 1275
    No. of bytes in distributed program, including test data, etc.: 7135
    Distribution format: tar.gz
    Programming language: Scilab 5.1.1. Scilab 5.1.1 should be installed before running the program. Information about the installation can be found at http://wiki.scilab.org/howto/install/windows.
    Computer: PC-compatible running Scilab on MS Windows or Linux
    Operating system: Windows XP, Linux
    RAM: below 150 Megabytes
    Classification: 6.2
    Catalogue identifier of previous version: AEAP_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 178 (2008) 788
    Does the new version supersede the previous version?: Yes
    Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE).
    Solution method: Numerical solving of ordinary differential equations for the study of
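
One of the chaos indicators mentioned, the Lyapunov exponent, can be estimated in a few lines; a sketch for the logistic map, independent of the Scilab package's own implementation:

```python
from math import log

def lyapunov_logistic(r, n_iter=100_000, x0=0.3, discard=1_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the orbit average of log|f'(x)| = log|r*(1-2x)|. Positive values
    indicate chaos (exponential divergence of nearby orbits)."""
    x = x0
    for _ in range(discard):          # let the transient die out
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        total += log(abs(r * (1.0 - 2.0 * x)))
    return total / n_iter

lam = lyapunov_logistic(4.0)   # theory for r = 4: ln 2 ≈ 0.693 (chaotic)
```

The same orbit-averaging idea extends to flows, where the derivative along the trajectory is obtained from the linearized ODE instead.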

  12. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Science.gov (United States)

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  13. Opensource Software for MLR-Modelling of Solar Collectors

    DEFF Research Database (Denmark)

    Bacher, Peder; Perers, Bengt

    2011-01-01

    A first research version is now in operation of a software package for multiple linear regression (MLR) modeling and analysis of solar collectors, following ideas originating with Walletun et al. (1986) and Perers (1987, 1993). The tool has been implemented in the free and open...... source program R http://www.r-project.org/. Applications of the software package include: visual validation, resampling and conversion of data, collector performance testing analysis according to the European Standard EN 12975 (Fischer et al., 2004), statistical validation of results......, and the determination of collector incidence angle modifiers without the need of a mathematical function (Perers, 1997). The paper gives a demonstration with examples of the applications, based on measurements obtained at a test site at DTU in Denmark (Fan et al., 2009). The tested collector is a single glazed large...
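
    The MLR idea can be sketched in a few lines. The snippet below (Python rather than R, with made-up parameter values; the steady-state form q = eta0*G - a1*dT - a2*dT^2 is a common simplification of the EN 12975 collector model) recovers collector parameters from synthetic data by ordinary least squares:

```python
import numpy as np

# Synthetic test data for a hypothetical collector (all values assumed):
#   q = eta0*G - a1*(Tm - Ta) - a2*(Tm - Ta)**2 + noise
rng = np.random.default_rng(1)
G = rng.uniform(200.0, 1000.0, 500)     # solar irradiance [W/m^2]
dT = rng.uniform(0.0, 60.0, 500)        # mean fluid minus ambient temperature [K]
eta0, a1, a2 = 0.78, 3.5, 0.015         # "true" parameters (assumed)
q = eta0 * G - a1 * dT - a2 * dT**2 + rng.normal(0.0, 2.0, 500)

# The model is linear in the parameters, so multiple linear regression
# recovers them directly from measured (G, dT, q) samples.
X = np.column_stack([G, -dT, -dT**2])
coef, *_ = np.linalg.lstsq(X, q, rcond=None)
eta0_hat, a1_hat, a2_hat = coef
```

    Because the collector model is linear in its coefficients, no iterative nonlinear fitting is needed, which is what makes the MLR approach attractive for routine test evaluation.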

  14. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among the various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
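
    The system-dynamics idea behind such a simulator can be illustrated with a toy feedback loop (a simple Euler integration, not the actual SEPS equations): staff added mid-project contribute only at reduced effectiveness because of training and communication overhead, so hiring decisions create a schedule tradeoff.

```python
def simulate_project(total_tasks=1000.0, staff=5.0, rate_per_person=1.0,
                     hire_at=1e9, hires=0.0, hire_effectiveness=0.5,
                     dt=1.0, horizon=1000):
    """Toy system-dynamics loop: integrate task completion over time.
    New hires contribute at reduced effectiveness, a crude stand-in for
    the feedback effects a model like SEPS captures in detail."""
    done, t = 0.0, 0.0
    while done < total_tasks and t < horizon:
        effective_staff = staff
        if t >= hire_at:
            effective_staff += hires * hire_effectiveness
        done += effective_staff * rate_per_person * dt
        t += dt
    return t  # completion time in simulated periods (units assumed)

baseline = simulate_project()                          # no extra hiring
with_late_hires = simulate_project(hire_at=150.0, hires=5.0)
```

    Running both scenarios shows the late hires shortening the schedule, but by less than their headcount alone would suggest; richer models add feedback such as error generation and rework.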

  15. Absolute myocardial flow quantification with {sup 82}Rb PET/CT: comparison of different software packages and methods

    Energy Technology Data Exchange (ETDEWEB)

    Tahari, Abdel K.; Lee, Andy; Rajaram, Mahadevan; Fukushima, Kenji; Lodge, Martin A.; Wahl, Richard L.; Bravo, Paco E. [Divisions of Nuclear Medicine, Johns Hopkins Medical Institutions, Department of Radiology, Baltimore, MD (United States); Lee, Benjamin C. [INVIA Medical Imaging Solutions, Ann Arbor, MI (United States); Ficaro, Edward P. [University of Michigan Health Systems, Ann Arbor, MI (United States); Nekolla, Stephan [Technical University of Munich, Munich (Germany); Klein, Ran; DeKemp, Robert A. [University of Ottawa Heart Institute, Ottawa (Canada); Bengel, Frank M. [Hannover Medical School, Department of Nuclear Medicine, Hannover (Germany)

    2014-01-15

    In clinical cardiac {sup 82}Rb PET, globally impaired coronary flow reserve (CFR) is a relevant marker for predicting short-term cardiovascular events. However, there are limited data on the impact of different software and methods for estimation of myocardial blood flow (MBF) and CFR. Our objective was to compare quantitative results obtained from previously validated software tools. We retrospectively analyzed cardiac {sup 82}Rb PET/CT data from 25 subjects (group 1, 62 ± 11 years) with low-to-intermediate probability of coronary artery disease (CAD) and 26 patients (group 2, 57 ± 10 years; P = 0.07) with known CAD. Resting and vasodilator-stress MBF and CFR were derived using four methods implemented in three software applications: (1) Corridor4DM (4DM) based on factor analysis (FA) and kinetic modeling, (2) 4DM based on region-of-interest (ROI) analysis and kinetic modeling, (3) MunichHeart (MH), which uses a simplified ROI-based retention model approach, and (4) FlowQuant (FQ) based on ROI analysis and compartmental modeling with constant distribution volume. Resting and stress MBF values (in milliliters per minute per gram) derived using the different methods were significantly different: using 4DM-FA, 4DM-ROI, FQ, and MH resting MBF values were 1.47 ± 0.59, 1.16 ± 0.51, 0.91 ± 0.39, and 0.90 ± 0.44, respectively (P < 0.001), and stress MBF values were 3.05 ± 1.66, 2.26 ± 1.01, 1.90 ± 0.82, and 1.83 ± 0.81, respectively (P < 0.001). However, there were no statistically significant differences among the CFR values (2.15 ± 1.08, 2.05 ± 0.83, 2.23 ± 0.89, and 2.21 ± 0.90, respectively; P = 0.17). Regional MBF and CFR according to vascular territories showed similar results. Linear correlation coefficient for global CFR varied between 0.71 (MH vs. 4DM-ROI) and 0.90 (FQ vs. 4DM-ROI). Using a cut-off value of 2.0 for abnormal CFR, the agreement among the software programs ranged between 76 % (MH vs. FQ) and 90 % (FQ vs. 4DM-ROI). Interobserver agreement was in general excellent with all software
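
    The quantity driving the agreement is a ratio, which makes it less sensitive to the method-dependent scaling of absolute MBF. A small Python illustration using the mean values reported above (note these are ratios of group means, not the per-patient mean CFR values from the paper):

```python
def cfr(stress_mbf, rest_mbf):
    """Coronary flow reserve: hyperemic (stress) over resting MBF."""
    return stress_mbf / rest_mbf

# Mean resting and stress MBF (mL/min/g) for 4DM-FA, 4DM-ROI, FQ and MH:
rest   = [1.47, 1.16, 0.91, 0.90]
stress = [3.05, 2.26, 1.90, 1.83]

# Absolute MBF differs markedly between methods, but the stress/rest
# ratio comes out similar for all four, consistent with the finding
# that CFR agreed across packages even though MBF did not.
ratios = [cfr(s, r) for s, r in zip(stress, rest)]
```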

  16. Analysis on Some of Software Reliability Models

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Software reliability & maintainability evaluation tool (SRMET 3.0), developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 supports seven software reliability models and four software maintainability models. The numerical characteristics of all these models are studied in depth, and corresponding numerical algorithms for each model are also given.
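
    The paper does not list the seven models, but a classic example of the kind of model such tools implement is the Goel-Okumoto non-homogeneous Poisson process (NHPP) model; a minimal Python sketch of its numerical characteristics:

```python
import math

def mu(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative failures
    observed by time t, mu(t) = a*(1 - exp(-b*t)), where a is the total
    expected number of failures and b the per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def intensity(t, a, b):
    """Failure intensity lambda(t) = mu'(t) = a*b*exp(-b*t); it decreases
    over time as faults are found and removed."""
    return a * b * math.exp(-b * t)
```

    Fitting a and b to observed failure times (e.g. by maximum likelihood) then yields the reliability growth estimates such tools report.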

  17. GENES - a software package for analysis in experimental statistics and quantitative genetics

    Directory of Open Access Journals (Sweden)

    Cosme Damião Cruz

    2013-06-01

    GENES is a software package for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation for analyzing biological phenomena and is fundamental for decision-making and for predicting the success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated with the programs MS Word, MS Excel and Paint, ensuring simple and effective data import and export of results, figures and data. It is also compatible with the free software R and with Matlab, through useful scripts available for complementary analyses in different areas, including genome-wide selection, prediction of breeding values and the use of neural networks in genetic improvement.

  18. Ignominy: a Tool for Software Dependency and Metric Analysis with Examples from Large HEP Packages

    Institute of Scientific and Technical Information of China (English)

    Lassi A. Tuura; Lucas Taylor

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular to warn us about possible structural problems early on. As part of this activity it is now used as a standard part of our release procedure; we also use it to evaluate and study the quality of external packages we plan to make use of. We describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. We also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects.
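
    The kind of numerical metrics such a dependency scanner reports can be sketched in a few lines (illustrative Python, not Ignominy's actual implementation): fan-in (afferent couplings), fan-out (efferent couplings), and the instability ratio I = Ce/(Ca+Ce) per package.

```python
def dependency_metrics(deps):
    """deps: dict mapping each package to the set of packages it uses.
    Returns per-package fan-in, fan-out, and instability I = Ce/(Ca+Ce)."""
    packages = set(deps) | {p for targets in deps.values() for p in targets}
    fan_out = {p: len(deps.get(p, set())) for p in packages}
    fan_in = {p: 0 for p in packages}
    for src, targets in deps.items():
        for dst in targets:
            fan_in[dst] += 1
    instability = {
        p: fan_out[p] / (fan_in[p] + fan_out[p]) if (fan_in[p] + fan_out[p]) else 0.0
        for p in packages
    }
    return fan_in, fan_out, instability

# A tiny hypothetical package graph: the GUI uses core and util, core uses util
deps = {"gui": {"core", "util"}, "core": {"util"}, "util": set()}
fi, fo, inst = dependency_metrics(deps)
```

    High fan-in with low instability marks a stable foundation package; high fan-out flags packages that are fragile with respect to changes elsewhere, which is exactly the structural problem such tools surface early.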

  19. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of the proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  1. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    Science.gov (United States)

    Klumpp, A. R.

    1994-01-01

    This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames.) Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
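
    As an illustration of the quaternion arithmetic such a package provides (Python rather than Ada; the Hamilton product with scalar-first component ordering is assumed here, though flight software often stores the scalar last):

```python
def qmul(q1, q2):
    """Hamilton quaternion product, components ordered (w, x, y, z)."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)
```

    Composing two frame-to-frame rotations is then a single product, which is why an infix quaternion operator (as in HAL/S) reads so naturally in attitude code.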

  3. Explicit models for dynamic software

    NARCIS (Netherlands)

    Bosloper, Ivor; Siljee, Johanneke; Nijhuis, Jos; Nord, R; Medvidovic, N; Krikhaar, R; Khrhaar, R; Stafford, J; Bosch, J

    2006-01-01

    A key aspect in creating autonomous dynamic software systems is the possibility of reasoning about properties of runtime variability and dynamic behavior, e.g. when and how to reconfigure the system. Currently these properties are often not made explicit in the software architecture. We argue that

  5. PCG: A software package for the iterative solution of linear systems on scalar, vector and parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, W. [Los Alamos National Lab., NM (United States); Carey, G.F. [Univ. of Texas, Austin, TX (United States)

    1994-12-31

    A great need exists for high performance numerical software libraries transportable across parallel machines. This talk concerns the PCG package, which solves systems of linear equations by iterative methods on parallel computers. The features of the package are discussed, as well as the techniques used to obtain high performance and transportability across architectures. Representative numerical results are presented for several machines including the Connection Machine CM-5, Intel Paragon and Cray T3D parallel computers.
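
    The workhorse iteration in such a package is the conjugate gradient method; an unpreconditioned sketch in Python (illustrative only, not PCG's actual Fortran interface):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A by the
    (unpreconditioned) conjugate gradient iteration."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)     # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p # conjugate new direction
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

    The only operations on A are matrix-vector products, which is precisely what makes such solvers portable across scalar, vector and distributed-memory machines: each architecture supplies its own fast `A @ p`.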

  6. DISPL: a software package for one and two spatially dimensioned kinetics-diffusion problems. [FORTRAN for IBM computers

    Energy Technology Data Exchange (ETDEWEB)

    Leaf, G K; Minkoff, M; Byrne, G D; Sorensen, D; Bleakney, T; Saltzman, J

    1978-11-01

    DISPL is a software package for solving some second-order nonlinear systems of partial differential equations including parabolic, elliptic, hyperbolic, and some mixed types such as parabolic--elliptic equations. Fairly general nonlinear boundary conditions are allowed as well as interface conditions for problems in an inhomogeneous medium. The spatial domain is one- or two-dimensional with Cartesian, cylindrical, or spherical (in one dimension only) geometry. The numerical method is based on the use of Galerkin's procedure combined with the use of B-splines in order to reduce the system of PDE's to a system of ODE's. The latter system is then solved with a sophisticated ODE software package. Software features include extensive dump/restart facilities, free format input, moderate printed output capability, dynamic storage allocation, and three graphics packages. 17 figures, 9 tables.
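
    The overall strategy, reducing a PDE to a system of ODEs in time and handing it to a time integrator (the "method of lines"), can be illustrated with a much simpler spatial discretization than DISPL's Galerkin/B-spline procedure: central differences for the heat equation u_t = u_xx, stepped with forward Euler (Python, purely for illustration).

```python
import numpy as np

n = 21
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
u = np.sin(np.pi * x)        # initial condition; u = 0 at both boundaries
dt = 0.4 * dx * dx           # forward Euler stable for dt <= dx^2 / 2
steps = int(0.1 / dt)
for _ in range(steps):
    # semi-discrete ODE system du_j/dt = (u_{j+1} - 2u_j + u_{j-1})/dx^2
    u[1:-1] += dt * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
# Exact solution decays as exp(-pi^2 * t) * sin(pi * x)
```

    A production package replaces both halves with better machinery: Galerkin/B-splines give high spatial accuracy, and a stiff ODE solver removes the severe dt <= dx^2/2 step-size restriction.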

  7. DISPL: a software package for one and two spatially dimensioned kinetics-diffusion problems. [In FORTRAN for IBM computers

    Energy Technology Data Exchange (ETDEWEB)

    Leaf, G.K.; Minkoff, M.; Byrne, G.D.; Sorensen, D.; Bleakney, T.; Saltzman, J.

    1977-05-01

    DISPL is a software package for solving some second-order nonlinear systems of partial differential equations including parabolic, elliptic, hyperbolic, and some mixed types such as parabolic--elliptic equations. Fairly general nonlinear boundary conditions are allowed as well as interface conditions for problems in an inhomogeneous medium. The spatial domain is one- or two-dimensional with Cartesian, cylindrical, or spherical (in one dimension only) geometry. The numerical method is based on the use of Galerkin's procedure combined with the use of B-splines in order to reduce the system of PDE's to a system of ODE's. The latter system is then solved with a sophisticated ODE software package. Software features include extensive dump/restart facilities, free format input, moderate printed output capability, dynamic storage allocation, and three graphics packages. 16 figures, 10 tables.

  8. Modeling superconducting networks containing Josephson junctions by means of PC-based circuit simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Blackburn, J.A. (Department of Physics and Computing, Wilfrid Laurier University, Waterloo, ON (Canada)); Smith, H.J.T. (Department of Physics, University of Waterloo, Waterloo, ON (Canada))

    1990-09-01

    Software packages are now available with which complex analog electronic circuits can be simulated on desktop computers. Using Micro Cap III it is demonstrated that the modeling capabilities of such software can be extended to include superconducting networks by means of an appropriate equivalent circuit for a Josephson junction.
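
    The standard equivalent circuit for a Josephson junction is the resistively shunted junction (RSJ) model. In the overdamped, normalized form d(phi)/dt = i - sin(phi), the junction stays superconducting (zero mean voltage) for bias i <= 1 and switches to the voltage state with mean voltage sqrt(i^2 - 1) for i > 1. The sketch below integrates this directly with RK4 in Python rather than a SPICE-like tool, purely to illustrate the physics the equivalent circuit captures:

```python
import math

def mean_voltage(i, dt=2e-3, t_total=400.0):
    """Time-averaged d(phi)/dt of the overdamped RSJ model, i.e. the
    normalized DC voltage across the junction at bias current i."""
    f = lambda p: i - math.sin(p)
    phi = 0.0
    steps = int(t_total / dt)
    for _ in range(steps):            # classical RK4 integration
        k1 = f(phi)
        k2 = f(phi + 0.5 * dt * k1)
        k3 = f(phi + 0.5 * dt * k2)
        k4 = f(phi + dt * k3)
        phi += dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
    return phi / (steps * dt)
```

    Sweeping i and plotting mean_voltage(i) reproduces the characteristic current-voltage curve of the junction, the same result a circuit simulator obtains from the equivalent-circuit network.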

  9. Neural Network Program Package for Prosody Modeling

    Directory of Open Access Journals (Sweden)

    J. Santarius

    2004-04-01

    This contribution describes the program for one part of automatic Text-to-Speech (TTS) synthesis. Some experiments (for example [14]) documented a considerable improvement in the naturalness of synthetic speech, but this approach requires completing the input feature values by hand, which takes a lot of time for big files. We need to improve the prosody by other approaches which use only automatically classified features (input parameters). The artificial neural network (ANN) approach is used for the modeling of prosody parameters. The program package contains all modules necessary for text and speech signal pre-processing, neural network training, sensitivity analysis, result processing, and a module for the creation of the input data protocol for the Czech speech synthesizer ARTIC [1].

  10. The VENUS/NWChem Software Package. Tight Coupling Between Chemical Dynamics Simulations and Electronic Structure Theory

    Energy Technology Data Exchange (ETDEWEB)

    Lourderaj, Upakarasamy; Sun, Rui; De Jong, Wibe A.; Windus, Theresa L.; Hase, William L.

    2014-03-01

    The interface for VENUS and NWChem, and the resulting software package for direct dynamics simulations are described. The coupling of the two codes is considered to be a tight coupling. The two codes are compiled and linked together and act as one executable with data being passed between the two codes through routine calls. The advantages of this type of coupling are discussed. The interface has been designed to have as little interference as possible with the core codes of both VENUS and NWChem. VENUS is the code that propagates the direct dynamics trajectories and, therefore, is the program that drives the overall execution of VENUS/NWChem. VENUS has remained an essentially sequential code, which uses the highly parallel structure of NWChem. Subroutines of the interface which accomplish the data transmission and communication between the two computer programs are described. Recent examples of the use of VENUS/NWChem for direct dynamics simulations are summarized.
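
    The propagation loop at the heart of such a direct-dynamics coupling can be sketched abstractly: the integrator (VENUS's role) requests a potential gradient from an external routine (NWChem's role) at every step. In the Python illustration below a harmonic oscillator stands in for the electronic-structure call; all names and units are illustrative only.

```python
import numpy as np

def electronic_structure_gradient(x, k=1.0):
    """Stand-in for an ab initio gradient call: V(x) = k*x^2/2, dV/dx = k*x."""
    return k * x

def velocity_verlet(x0, v0, mass, grad, dt=0.01, steps=1000):
    """Propagate a trajectory, querying grad(x) once per step, the way a
    direct-dynamics driver queries the electronic-structure code."""
    x, v = x0, v0
    a = -grad(x) / mass
    traj = [x]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt
        a_new = -grad(x) / mass          # the "NWChem call" of this sketch
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
        traj.append(x)
    return np.array(traj)

traj = velocity_verlet(1.0, 0.0, 1.0, electronic_structure_gradient)
```

    Because the gradient call dominates the cost, it pays to keep the propagator sequential and make the gradient evaluation itself parallel, which is the division of labor the abstract describes.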

  11. MacMath 92 a dynamical systems software package for the Macintosh

    CERN Document Server

    Hubbard, John H

    1993-01-01

    MacMath is a scientific toolkit for the Macintosh computer consisting of twelve graphics programs. It supports mathematical computation and experimentation in dynamical systems, both for differential equations and for iteration. The MacMath package was designed to accompany the textbooks Differential Equations: A Dynamical Systems Approach Part I & II. The text and software were developed for a junior-senior level course in applicable mathematics at Cornell University, in order to take advantage of excellent and easily accessible graphics. MacMath addresses differential equations and iteration with programs such as: analyzer, diffeq, phase plane, diffeq 3D views, numerical methods, periodic differential equations, cascade, 2D iteration, eigenfinder, jacobidraw, fourier, planets. These versatile programs greatly enhance the understanding of the mathematics in these topics. Qualitative analysis of the picture leads to quantitative results and even to new mathematics. This new edition includes the latest version of the Mac...

  12. Radcalc for Windows 2.0 transportation packaging software to determine hydrogen generation and transportation classification

    Energy Technology Data Exchange (ETDEWEB)

    Green, J.R.

    1996-10-21

    Radcalc for Windows is a user-friendly, menu-driven, Windows-compatible software program with applications in the transportation of radioactive materials. It calculates the radiolytic generation of hydrogen gas in the matrix of low-level and high-level radioactive wastes. It also calculates the pressure buildup due to hydrogen and the decay heat generated in a package at seal time. It computes the quantity of a radionuclide and its associated products for a given period of time. In addition, the code categorizes shipment quantities as reportable quantity (RQ), radioactive Type A or Type B, limited quantity (LQ), low specific activity (LSA), highway route controlled quantity (HRCQ), and fissile excepted using US Department of Transportation (DOT) definitions and methodologies.
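
    The radiolytic-hydrogen calculation rests on a simple idea: a G-value gives the number of H2 molecules produced per 100 eV of decay energy absorbed by the waste matrix. A back-of-the-envelope Python sketch (the G-value and power below are assumed for illustration, not Radcalc defaults):

```python
EV_PER_J = 6.241509e18        # electron-volts per joule
AVOGADRO = 6.02214076e23      # molecules per mole
SECONDS_PER_YEAR = 3.15576e7

def h2_moles_per_year(decay_heat_watts, g_value, absorbed_fraction=1.0):
    """Moles of H2 generated per year from radiolysis.
    decay_heat_watts: decay power deposited in the matrix [W]
    g_value: molecules of H2 per 100 eV of absorbed energy"""
    ev_per_year = decay_heat_watts * absorbed_fraction * EV_PER_J * SECONDS_PER_YEAR
    molecules = ev_per_year * g_value / 100.0
    return molecules / AVOGADRO

# e.g. 1 W deposited in a matrix with an assumed G(H2) of 0.4 molecules/100 eV
moles = h2_moles_per_year(1.0, 0.4)
```

    The resulting mole quantity, combined with the free volume and temperature of the package, then gives the pressure buildup via the ideal gas law.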

  13. ELAN: a software package for analysis and visualization of MEG, EEG, and LFP signals.

    Science.gov (United States)

    Aguera, Pierre-Emmanuel; Jerbi, Karim; Caclin, Anne; Bertrand, Olivier

    2011-01-01

    The recent surge in computational power has led to extensive methodological developments and advanced signal processing techniques that play a pivotal role in neuroscience. In particular, the field of brain signal analysis has witnessed a strong trend towards multidimensional analysis of large data sets, for example, single-trial time-frequency analysis of high spatiotemporal resolution recordings. Here, we describe the freely available ELAN software package which provides a wide range of signal analysis tools for electrophysiological data including scalp electroencephalography (EEG), magnetoencephalography (MEG), intracranial EEG, and local field potentials (LFPs). The ELAN toolbox is based on 25 years of methodological developments at the Brain Dynamics and Cognition Laboratory in Lyon and was used in many papers including the very first studies of time-frequency analysis of EEG data exploring evoked and induced oscillatory activities in humans. This paper provides an overview of the concepts and functionalities of ELAN, highlights its specificities, and describes its complementarity and interoperability with other toolboxes.

  14. Software package to automate the design and production of translucent building structures made of pvc

    Directory of Open Access Journals (Sweden)

    Petrova Irina Yur’evna

    2016-08-01

    The article describes the features of the design and production of translucent building structures made of PVC. The automation systems for this process currently on the market are analyzed, and their advantages and disadvantages are identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent building structures made of PVC is formulated, along with the basic entities involved in those business processes. The necessary functions for the main application and for the dealers' application are specified. The main application is based on the technological platform 1C: Enterprise 8.2. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these software products have client versions that are free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The features of the implementation of the developed software complex are described and the relevant charts are given. The scheme of system deployment and the protocols of data exchange between the 1C server, the 1C client and a dealer are presented, along with the functions supported by the 1C module and the .NET module. The article describes the contents of the class library developed for the .NET module and specifies how the two applications are integrated into a single software package. The features of the GUI organization are described with corresponding screenshots. Possible directions for further development of the software complex are presented, and a conclusion is drawn about its competitiveness and the expediency of further research.

  15. Online Rule Generation Software Process Model

    Directory of Open Access Journals (Sweden)

    Sudeep Marwaha

    2013-07-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of the various steps of the modified waterfall model for decision rule generation.

  16. A software package for evaluating the performance of a star sensor operation

    Science.gov (United States)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-01-01

    We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package used to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case, where sky background and instrument errors are omitted, and a more realistic case, where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the code in the form of functions written in C, keeping in view easy implementation on any star sensor electronics hardware.
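
    The first of the three algorithms, centroiding, is easy to sketch: the sub-pixel position of a star is the intensity-weighted mean of pixel coordinates after background subtraction (illustrative Python with numpy, not the StarSense MATLAB code; in practice the sum is restricted to a small window around each bright pixel).

```python
import numpy as np

def centroid(image, background=0.0):
    """Sub-pixel star position as the intensity-weighted mean of pixel
    coordinates, after subtracting a background estimate."""
    img = np.asarray(image, dtype=float) - background
    img[img < 0.0] = 0.0                  # clip residual noise to zero
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (ys * img).sum() / total, (xs * img).sum() / total

# A symmetric blob centred at row 2, column 3 of a small test frame
img = np.zeros((5, 6))
img[1:4, 2:5] = [[1, 2, 1], [2, 8, 2], [1, 2, 1]]
cy, cx = centroid(img)
```

    The centroid accuracy achievable this way, a small fraction of a pixel, is what sets the angle measurement accuracy that the downstream pattern-identification and QUEST stages inherit.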

  18. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models, focusing on what the software is rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. Based on the popular and continually evolving course on requirements specification models taught by the auth

  19. Alloy Design Workbench-Surface Modeling Package Developed

    Science.gov (United States)

    Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.

    2003-01-01

    NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.

  20. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. 
Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to
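
    The "routine peak finding" step mentioned in the abstract can be illustrated with a minimal local-maxima picker. This is a toy sketch, not Decon2LS's algorithm (which models isotope distributions and fits peak shapes); the example spectrum and noise floor are made-up values.

    ```python
    import numpy as np

    def find_peaks(intensity, noise_floor):
        """Indices of strict local maxima above a noise floor.

        A toy stand-in for the peak-detection step of an MS processing
        pipeline; real tools also fit peak shapes and estimate the
        signal-to-noise ratio locally.
        """
        y = np.asarray(intensity, dtype=float)
        idx = np.arange(1, len(y) - 1)
        mask = (y[idx] > y[idx - 1]) & (y[idx] > y[idx + 1]) & (y[idx] > noise_floor)
        return idx[mask]

    # Two peaks (at indices 2 and 6) above a noise floor of 1.5
    spectrum = np.array([0, 1, 5, 1, 0, 2, 9, 2, 0, 1, 0], dtype=float)
    peaks = find_peaks(spectrum, noise_floor=1.5)
    ```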

  1. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    Science.gov (United States)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be
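
    The idea behind basket-weaving, namely that additive scan-line offsets can be separated out when the same field is mapped along two orthogonal directions, can be sketched with a crude iterative scheme. This is only a toy illustration under the assumption of constant per-scan-line offsets; NOD3's actual implementation works differently (e.g. in the Fourier domain).

    ```python
    import numpy as np

    def basket_weave(map_x, map_y, n_iter=20):
        """Crude destriping of two maps of the same field scanned in
        orthogonal directions.

        map_x is assumed to carry additive offsets per row (its scan
        lines), map_y per column. Alternately estimate the sky as the
        mean of both maps and re-fit each map's scan-line offsets
        against that estimate.
        """
        a, b = map_x.astype(float).copy(), map_y.astype(float).copy()
        for _ in range(n_iter):
            sky = 0.5 * (a + b)
            a -= np.median(a - sky, axis=1, keepdims=True)  # row offsets
            b -= np.median(b - sky, axis=0, keepdims=True)  # column offsets
        return 0.5 * (a + b)
    ```

    Each pass halves the remaining scan-line offsets, so a few tens of iterations recover the common sky up to an overall constant.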

  2. The instrument control software package for the Habitable-Zone Planet Finder spectrometer

    Science.gov (United States)

    Bender, Chad F.; Robertson, Paul; Stefansson, Gudmundur Kari; Monson, Andrew; Anderson, Tyler; Halverson, Samuel; Hearty, Frederick; Levi, Eric; Mahadevan, Suvrath; Nelson, Matthew; Ramsey, Larry; Roy, Arpita; Schwab, Christian; Shetrone, Matthew; Terrien, Ryan

    2016-08-01

    We describe the Instrument Control Software (ICS) package that we have built for The Habitable-Zone Planet Finder (HPF) spectrometer. The ICS controls and monitors instrument subsystems, facilitates communication with the Hobby-Eberly Telescope facility, and provides user interfaces for observers and telescope operators. The backend is built around the asynchronous network software stack provided by the Python Twisted engine, and is linked to a suite of custom hardware communication protocols. This backend is accessed through Python-based command-line and PyQt graphical frontends. In this paper we describe several of the customized subsystem communication protocols that provide access to and help maintain the hardware systems that comprise HPF, and show how asynchronous communication benefits the numerous hardware components. We also discuss our Detector Control Subsystem, built as a set of custom Python wrappers around a C-library that provides native Linux access to the SIDECAR ASIC and Hawaii-2RG detector system used by HPF. HPF will be one of the first astronomical instruments on sky to utilize this native Linux capability through the SIDECAR Acquisition Module (SAM) electronics. The ICS we have created is very flexible, and we are adapting it for NEID, NASA's Extreme Precision Doppler Spectrometer for the WIYN telescope; we will describe this adaptation, and describe the potential for use in other astronomical instruments.
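
    The benefit of asynchronous communication for many hardware subsystems can be illustrated with Python's standard asyncio rather than Twisted (a substitution made only to keep the sketch dependency-free); the subsystem names and fake read functions below are hypothetical placeholders, not HPF's protocols.

    ```python
    import asyncio

    async def poll_subsystem(name, read_fn, interval, n_samples, log):
        """Poll one hardware subsystem without blocking the others.

        read_fn stands in for a real hardware communication protocol;
        awaiting it lets the event loop service other subsystems.
        """
        for _ in range(n_samples):
            log.append((name, await read_fn()))
            await asyncio.sleep(interval)

    async def main():
        log = []
        async def fake_temp():    # placeholder for a cryostat query
            return 77.1
        async def fake_vacuum():  # placeholder for a gauge query
            return 1e-7
        # Both monitors run concurrently in a single thread.
        await asyncio.gather(
            poll_subsystem("temperature", fake_temp, 0.001, 3, log),
            poll_subsystem("vacuum", fake_vacuum, 0.001, 3, log),
        )
        return log

    log = asyncio.run(main())
    ```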

  3. On the Incorrect Statistical Calculations of the Kinetica Software Package in Imbalanced Designs.

    Science.gov (United States)

    Morales-Alcelay, S; de la Torre de Alvarado, J M; García-Arieta, A

    2015-07-01

    This regulatory note supports the previous findings that suggest that the software package Kinetica, up to version 5.0.10, provides incorrect results for the 90% confidence intervals for the ratio test/reference where the groups are imbalanced in 2 × 2 crossover designs and parallel designs. The incorrect calculation results from using the simplified formula that is shown as an example in the Canadian guideline for a balanced dataset, but which provides an erroneous point estimate and confidence interval width in cases of imbalanced designs. Importantly, this software is rarely used for regulatory submissions in the European Union according to the search conducted in the Spanish Agency for Medicines and Health Care Products. According to our data, the error is minor if the imbalance between groups is small. However, the error may be relevant if the sample size is small and the imbalance is large. Therefore, bioequivalence studies should be reanalyzed by regulatory agencies to confirm the submitted results.
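
    The statistical point at issue can be sketched directly: in a 2 × 2 crossover, the sequence-stratified estimate stays unbiased under imbalance, while pooling all subjects into one mean (the simplified balanced-design formula) does not. The sketch below is illustrative only, not Kinetica's code; the data and the hardcoded t quantile are assumed example values.

    ```python
    import math
    from statistics import mean, stdev

    def crossover_2x2_ci90(diff_TR, diff_RT, t_crit):
        """Sequence-stratified 90% CI for a 2x2 crossover (log scale).

        diff_*: per-subject log(test) - log(reference) in each sequence
        group. Averaging the two sequence means cancels the period
        effect and remains unbiased when the groups are imbalanced.
        t_crit is the one-sided 95% t quantile for n1 + n2 - 2 df.
        """
        n1, n2 = len(diff_TR), len(diff_RT)
        est = 0.5 * (mean(diff_TR) + mean(diff_RT))
        # pooled within-sequence variance of the subject differences
        s2 = ((n1 - 1) * stdev(diff_TR) ** 2
              + (n2 - 1) * stdev(diff_RT) ** 2) / (n1 + n2 - 2)
        se = 0.5 * math.sqrt(s2) * math.sqrt(1 / n1 + 1 / n2)
        lo, hi = est - t_crit * se, est + t_crit * se
        # exponentiate to get the test/reference ratio and its CI
        return math.exp(est), (math.exp(lo), math.exp(hi))

    # Imbalanced example: 3 subjects in sequence TR, 9 in RT, with a
    # period effect of +0.1 folded into the TR differences.
    tr = [0.18, 0.20, 0.22]   # true formulation effect is 0.1
    rt = [-0.02, 0.0, 0.02, -0.01, 0.01, 0.0, 0.02, -0.02, 0.0]
    ratio, (lo, hi) = crossover_2x2_ci90(tr, rt, t_crit=1.812)  # t(0.95, 10)
    naive = math.exp(mean(tr + rt))  # pooled mean: biased when n1 != n2
    ```

    Here the stratified estimate recovers the true ratio exp(0.1), while the pooled mean is pulled toward the larger group, exactly the kind of discrepancy the note describes.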

  4. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available Hadoop MapReduce is the programming model for designing auto-scalable distributed computing applications, providing developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, owing to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of existing systems, we design a framework in this thesis, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface through which users can fairly execute tasks that rely on traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this thesis focuses on multiuser workloads, under which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to fairly distribute jobs to machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework enormously improved time performance compared with the original package.
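
    The MapReduce programming model the abstract builds on can be sketched as a tiny in-process version: map each input to key-value pairs, shuffle by key, then reduce each key's values. This is a toy illustration of the model itself, not of Hadoop or MC-Framework; the word-count mapper and reducer are the canonical example.

    ```python
    from collections import defaultdict
    from itertools import chain

    def map_reduce(inputs, mapper, reducer):
        """In-process sketch of the MapReduce model.

        Hadoop distributes the same three phases (map, shuffle,
        reduce) across a cluster; here they run sequentially.
        """
        groups = defaultdict(list)
        for key, value in chain.from_iterable(mapper(x) for x in inputs):
            groups[key].append(value)          # shuffle: group by key
        return {k: reducer(k, vs) for k, vs in groups.items()}

    # Word count, the canonical MapReduce example
    counts = map_reduce(
        ["to be or not to be"],
        mapper=lambda line: [(w, 1) for w in line.split()],
        reducer=lambda key, values: sum(values),
    )
    ```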

  5. Analysis of Empirical Software Effort Estimation Models

    CERN Document Server

    Basha, Saleem

    2010-01-01

    Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering; effort estimation is the preliminary phase between the client and the business enterprise. The relationship between the client and the business enterprise begins with the estimation of the software, and the business enterprise's credibility with the client increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects, and generalization from such limited experience is an inherently under-constrained problem. Accurate estimation is a complex process because it amounts to software effort prediction, and, as the term indicates, a prediction never becomes an actual. This work follows the basics of the empirical software effort estimation models. The goal of this paper is to study empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...
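
    A classical example of the empirical models this paper surveys is Basic COCOMO, which fits effort in person-months as a power law of code size. The coefficients below are Boehm's published values for "organic" projects; the 32 KLOC project size is an invented example.

    ```python
    def cocomo_basic(kloc, a=2.4, b=1.05):
        """Basic COCOMO effort estimate (person-months) for an
        'organic' project: E = a * KLOC**b. The coefficients were
        fitted by Boehm to historical project data, which is exactly
        the kind of generalization from limited experience the paper
        discusses.
        """
        return a * kloc ** b

    effort = cocomo_basic(32)  # a hypothetical 32 KLOC project
    ```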

  6. Software applications for providing comprehensive computing capabilities to problems related to mixed models in animal breeding

    Institute of Scientific and Technical Information of China (English)

    Monchai; DAUNGJINDA

    2005-01-01

    Recently, several computer packages have been developed to accomplish problems related to mixed models in animal breeding. Special software for the estimation of variance components and the prediction of genetic merits is basically needed for genetic evaluation and selection programs. Although there are some packages available on the internet, most of them are commercial or unfriendly to use. The lists of recent software available on the internet are shown in Tab. 1. Most software is free license (mostly for acade...

  7. SOFTWARE RELIABILITY MODEL FOR COMPONENT INTERACTION MODE

    Institute of Scientific and Technical Information of China (English)

    Wang Qiang; Lu Yang; Xu Zijun; Han Jianghong

    2011-01-01

    With the rapid progress of component technology, the software development methodology of assembling large numbers of components to design complex software systems has matured. However, how to accurately assess application reliability from the system architecture and the component reliabilities together has become a knotty problem. In this paper, the defects in the formal description of software architecture and the limitations of existing model assumptions are both analyzed. Moreover, a new software reliability model called Component Interaction Mode (CIM) is proposed. With this model, the problem that existing component-based software reliability analysis models cannot deal with cases of component interaction with non-failure independence and non-random control transition is resolved. Finally, practical examples are presented to illustrate the effectiveness of this model.
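
    For contrast, one of the classical architecture-based models whose assumptions (failure independence, Markovian control transfer) the CIM paper relaxes is Cheung's model, which can be sketched compactly. This is the classical model, not CIM itself; the component reliabilities and transition matrix below are invented example values.

    ```python
    import numpy as np

    def cheung_reliability(R, P, start=0, end=-1):
        """Cheung's architecture-based reliability model: components
        fail independently with reliabilities R[i], and control
        transfers between them with Markov probabilities P[i][j].
        """
        R = np.asarray(R, float)
        P = np.asarray(P, float)
        n = len(R)
        end = end % n
        Q = R[:, None] * P                 # survive component i, move to j
        S = np.linalg.inv(np.eye(n) - Q)   # expected-visits matrix
        return S[start, end] * R[end]      # reach the exit and survive it

    # Two components in sequence: control always flows 0 -> 1, then exits,
    # so system reliability reduces to the product 0.99 * 0.95.
    rel = cheung_reliability([0.99, 0.95], [[0.0, 1.0], [0.0, 0.0]])
    ```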

  8. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution, and it requires a huge amount of time and manpower as well as financial resources. The challenges are the size, seniority, and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  9. MW PHARM, AN INTEGRATED SOFTWARE PACKAGE FOR DRUG-DOSAGE REGIMEN CALCULATION AND THERAPEUTIC DRUG-MONITORING

    NARCIS (Netherlands)

    PROOST, JH; MEIJER, DKF

    The pharmacokinetic software package MW/Pharm offers an interactive, user-friendly program which gives rapid answers in clinical practice. It comprises a database with pharmacokinetic parameters of 180 drugs, a medication history database, and procedures for an individual drug dosage regimen
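
    A dosage-regimen calculator of this kind builds on textbook pharmacokinetics, which can be sketched with a one-compartment, repeated IV-bolus steady-state model. This is a generic teaching sketch, not MW/Pharm's models (which are far richer); the drug parameters are invented.

    ```python
    import math

    def iv_bolus_concentrations(dose_mg, V_L, CL_L_per_h, tau_h):
        """Steady-state peak and trough concentrations (mg/L) for
        repeated IV bolus dosing in a one-compartment model with
        volume V, clearance CL, and dosing interval tau.
        """
        ke = CL_L_per_h / V_L                           # elimination rate constant
        cmax = (dose_mg / V_L) / (1 - math.exp(-ke * tau_h))
        cmin = cmax * math.exp(-ke * tau_h)
        return cmax, cmin

    # Hypothetical drug: 500 mg every 8 h, V = 50 L, CL = 5 L/h
    cmax, cmin = iv_bolus_concentrations(500, 50, 5, 8)
    ```

    A regimen calculator inverts this kind of relation: given target peak and trough levels and a patient's individual V and CL, it solves for dose and interval.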

  10. MW PHARM, AN INTEGRATED SOFTWARE PACKAGE FOR DRUG-DOSAGE REGIMEN CALCULATION AND THERAPEUTIC DRUG-MONITORING

    NARCIS (Netherlands)

    PROOST, JH; MEIJER, DKF

    1992-01-01

    The pharmacokinetic software package MW/Pharm offers an interactive, user-friendly program which gives rapid answers in clinical practice. It comprises a database with pharmacokinetic parameters of 180 drugs, a medication history database, and procedures for an individual drug dosage regimen calcula

  11. A software package for stellar and solar inverse-Compton emission: Stellarics

    CERN Document Server

    Orlando, Elena

    2013-01-01

    We present our software to compute gamma-ray emission from inverse-Compton scattering by cosmic-ray leptons in the heliosphere, as well as in the photospheres of stars. It includes a formulation of modulation in the heliosphere, but it can be used for any user-defined modulation model. Profiles and spectra are output to FITS files in a variety of forms for convenient use. Also included are general-purpose inverse-Compton routines with other features like energy loss rates and emissivity for any user-defined target photon and lepton spectra. The software is publicly available and it is under continuing development.
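
    The energy-loss rates mentioned in the abstract rest on a textbook relation that can be shown directly: the Thomson-limit inverse-Compton loss rate of a single electron. This is the standard formula, not StellarICs' actual routines; the electron Lorentz factor and the ~1 eV/cm^3 photon energy density are assumed round numbers.

    ```python
    def ic_loss_rate(gamma, u_ph_ev_cm3):
        """Thomson-limit inverse-Compton energy-loss rate for one
        electron: -dE/dt = (4/3) * sigma_T * c * gamma^2 * U_ph.
        Returns eV/s for a photon energy density given in eV/cm^3.
        """
        sigma_T = 6.6524e-25   # Thomson cross-section, cm^2
        c = 2.9979e10          # speed of light, cm/s
        return (4.0 / 3.0) * sigma_T * c * gamma**2 * u_ph_ev_cm3

    # Electron with gamma = 1e3 in a ~1 eV/cm^3 radiation field
    loss = ic_loss_rate(1e3, 1.0)
    ```

    The gamma-squared scaling is what makes cosmic-ray leptons efficient gamma-ray emitters in dense stellar photon fields.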

  12. A software package for Stellar and solar Inverse Compton emission: StellarICs

    Science.gov (United States)

    Orlando, Elena; Strong, Andrew

    2013-06-01

    We present our software to compute gamma-ray emission from inverse-Compton scattering by cosmic-ray leptons in the heliosphere, as well as in the photospheres of stars. It includes a formulation of modulation in the heliosphere, but can be used for any user-defined modulation model. Profiles and spectra are output to FITS files in a variety of forms for convenient use. Also included are general-purpose inverse-Compton routines with other features like energy loss rates and emissivity for any user-defined target photon and lepton spectra. The software is publicly available and it is under continuing development.

  13. topicmodels: An R Package for Fitting Topic Models

    Directory of Open Access Journals (Sweden)

    Bettina Grün

    2011-05-01

    Full Text Available Topic models allow the probabilistic modeling of term frequency occurrences in documents. The fitted model can be used to estimate the similarity between documents as well as between a set of specified keywords using an additional layer of latent variables which are referred to as topics. The R package topicmodels provides basic infrastructure for fitting topic models based on data structures from the text mining package tm. The package includes interfaces to two algorithms for fitting topic models: the variational expectation-maximization algorithm provided by David M. Blei and co-authors and an algorithm using Gibbs sampling by Xuan-Hieu Phan and co-authors.
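
    The second of the two fitting algorithms the package interfaces, Gibbs sampling, can be sketched as a collapsed Gibbs sampler for LDA. This NumPy sketch only illustrates the idea (topicmodels itself wraps external C/C++ code); the toy corpus of word-id lists is invented.

    ```python
    import numpy as np

    def lda_gibbs(docs, n_topics, n_vocab, n_iter=100, alpha=0.1, beta=0.01, seed=0):
        """Collapsed Gibbs sampler for LDA. docs is a list of word-id
        sequences; returns doc-topic and topic-word distributions."""
        rng = np.random.default_rng(seed)
        ndk = np.zeros((len(docs), n_topics))   # doc-topic counts
        nkw = np.zeros((n_topics, n_vocab))     # topic-word counts
        nk = np.zeros(n_topics)                 # topic totals
        z = [rng.integers(n_topics, size=len(doc)) for doc in docs]
        for d, doc in enumerate(docs):
            for w, k in zip(doc, z[d]):
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
        for _ in range(n_iter):
            for d, doc in enumerate(docs):
                for i, w in enumerate(doc):
                    k = z[d][i]                  # remove current assignment
                    ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                    p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_vocab * beta)
                    k = rng.choice(n_topics, p=p / p.sum())
                    z[d][i] = k                  # resample from the conditional
                    ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
        theta = (ndk + alpha) / (ndk + alpha).sum(1, keepdims=True)
        phi = (nkw + beta) / (nkw + beta).sum(1, keepdims=True)
        return theta, phi

    docs = [[0, 1, 0, 1], [2, 3, 2, 3], [0, 1, 2, 3]]
    theta, phi = lda_gibbs(docs, n_topics=2, n_vocab=4)
    ```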

  14. Power Electronic Packaging Design, Assembly Process, Reliability and Modeling

    CERN Document Server

    Liu, Yong

    2012-01-01

    Power Electronic Packaging presents an in-depth overview of power electronic packaging design, assembly,reliability and modeling. Since there is a drastic difference between IC fabrication and power electronic packaging, the book systematically introduces typical power electronic packaging design, assembly, reliability and failure analysis and material selection so readers can clearly understand each task's unique characteristics. Power electronic packaging is one of the fastest growing segments in the power electronic industry, due to the rapid growth of power integrated circuit (IC) fabrication, especially for applications like portable, consumer, home, computing and automotive electronics. This book also covers how advances in both semiconductor content and power advanced package design have helped cause advances in power device capability in recent years. The author extrapolates the most recent trends in the book's areas of focus to highlight where further improvement in materials and techniques can d...

  15. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture. This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system. Essential features of the model have been implemented in a research prototype, Ragnarok. Two years of experience using Ragnarok in three real, small- to medium-sized projects is reported. The conclusion is that the presented model is viable, feels 'natural' for developers, and provides good support

  16. ZONE package of the Central Valley Hydrologic Model

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This digital dataset defines the model grid, active cells in model layers 2 and 3, and geologic province arrays of the ZONE package used in the transient hydrologic...

  17. Network-Based Interpretation of Diverse High-Throughput Datasets through the Omics Integrator Software Package.

    Science.gov (United States)

    Tuncbag, Nurcan; Gosline, Sara J C; Kedaigle, Amanda; Soltis, Anthony R; Gitter, Anthony; Fraenkel, Ernest

    2016-04-01

    High-throughput, 'omic' methods provide sensitive measures of biological responses to perturbations. However, inherent biases in high-throughput assays make it difficult to interpret experiments in which more than one type of data is collected. In this work, we introduce Omics Integrator, a software package that takes a variety of 'omic' data as input and identifies putative underlying molecular pathways. The approach applies advanced network optimization algorithms to a network of thousands of molecular interactions to find high-confidence, interpretable subnetworks that best explain the data. These subnetworks connect changes observed in gene expression, protein abundance or other global assays to proteins that may not have been measured in the screens due to inherent bias or noise in measurement. This approach reveals unannotated molecular pathways that would not be detectable by searching pathway databases. Omics Integrator also provides an elegant framework to incorporate not only positive data, but also negative evidence. Incorporating negative evidence allows Omics Integrator to avoid unexpressed genes and avoid being biased toward highly-studied hub proteins, except when they are strongly implicated by the data. The software is comprised of two individual tools, Garnet and Forest, that can be run together or independently to allow a user to perform advanced integration of multiple types of high-throughput data as well as create condition-specific subnetworks of protein interactions that best connect the observed changes in various datasets. It is available at http://fraenkel.mit.edu/omicsintegrator and on GitHub at https://github.com/fraenkel-lab/OmicsIntegrator.

  18. Network-Based Interpretation of Diverse High-Throughput Datasets through the Omics Integrator Software Package.

    Directory of Open Access Journals (Sweden)

    Nurcan Tuncbag

    2016-04-01

    Full Text Available High-throughput, 'omic' methods provide sensitive measures of biological responses to perturbations. However, inherent biases in high-throughput assays make it difficult to interpret experiments in which more than one type of data is collected. In this work, we introduce Omics Integrator, a software package that takes a variety of 'omic' data as input and identifies putative underlying molecular pathways. The approach applies advanced network optimization algorithms to a network of thousands of molecular interactions to find high-confidence, interpretable subnetworks that best explain the data. These subnetworks connect changes observed in gene expression, protein abundance or other global assays to proteins that may not have been measured in the screens due to inherent bias or noise in measurement. This approach reveals unannotated molecular pathways that would not be detectable by searching pathway databases. Omics Integrator also provides an elegant framework to incorporate not only positive data, but also negative evidence. Incorporating negative evidence allows Omics Integrator to avoid unexpressed genes and avoid being biased toward highly-studied hub proteins, except when they are strongly implicated by the data. The software is comprised of two individual tools, Garnet and Forest, that can be run together or independently to allow a user to perform advanced integration of multiple types of high-throughput data as well as create condition-specific subnetworks of protein interactions that best connect the observed changes in various datasets. It is available at http://fraenkel.mit.edu/omicsintegrator and on GitHub at https://github.com/fraenkel-lab/OmicsIntegrator.

  19. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  20. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  2. GAMOLA2, a Comprehensive Software Package for the Annotation and Curation of Draft and Complete Microbial Genomes.

    Science.gov (United States)

    Altermann, Eric; Lu, Jingli; McCulloch, Alan

    2017-01-01

    Expert-curated annotation remains one of the critical steps in achieving a reliable, biologically relevant annotation. Here we announce the release of GAMOLA2, a user-friendly and comprehensive software package to process, annotate and curate draft and complete bacterial, archaeal, and viral genomes. GAMOLA2 represents a wrapping tool to combine gene model determination, functional Blast, COG, Pfam, and TIGRfam analyses with structural predictions including detection of tRNAs, rRNA genes, non-coding RNAs, signal protein cleavage sites, transmembrane helices, CRISPR repeats and vector sequence contaminations. GAMOLA2 has already been validated in a wide range of bacterial and archaeal genomes, and its modular concept allows easy addition of further functionality in future releases. A modified and adapted version of the Artemis Genome Viewer (Sanger Institute) has been developed to leverage the additional features and underlying information provided by the GAMOLA2 analysis, and is part of the software distribution. In addition to genome annotations, GAMOLA2 features, among others, supplemental modules that assist in the creation of custom Blast databases, annotation transfers between genome versions, and the preparation of Genbank files for submission via the NCBI Sequin tool. GAMOLA2 is intended to be run under a Linux environment, whereas the subsequent visualization and manual curation in Artemis is mobile and platform independent. The development of GAMOLA2 is ongoing and community driven. New functionality can easily be added upon user requests, ensuring that GAMOLA2 provides information relevant to microbiologists. The software is available free of charge for academic use.

  3. Constraint Network Analysis (CNA): a Python software package for efficiently linking biomacromolecular structure, flexibility, (thermo-)stability, and function.

    Science.gov (United States)

    Pfleger, Christopher; Rathi, Prakash Chandra; Klein, Doris L; Radestock, Sebastian; Gohlke, Holger

    2013-04-22

    For deriving maximal advantage from information on biomacromolecular flexibility and rigidity, results from rigidity analyses must be linked to biologically relevant characteristics of a structure. Here, we describe the Python-based software package Constraint Network Analysis (CNA) developed for this task. CNA functions as a front- and backend to the graph-based rigidity analysis software FIRST. CNA goes beyond the mere identification of flexible and rigid regions in a biomacromolecule in that it (I) provides a refined modeling of thermal unfolding simulations that also considers the temperature-dependence of hydrophobic tethers, (II) allows performing rigidity analyses on ensembles of network topologies, either generated from structural ensembles or by using the concept of fuzzy noncovalent constraints, and (III) computes a set of global and local indices for quantifying biomacromolecular stability. This leads to more robust results from rigidity analyses and extends the application domain of rigidity analyses in that phase transition points ("melting points") and unfolding nuclei ("structural weak spots") are determined automatically. Furthermore, CNA robustly handles small-molecule ligands in general. Such advancements are important for applying rigidity analysis to data-driven protein engineering and for estimating the influence of ligand molecules on biomacromolecular stability. CNA maintains the efficiency of FIRST such that the analysis of a single protein structure takes a few seconds for systems of several hundred residues on a single core. These features make CNA an interesting tool for linking biomacromolecular structure, flexibility, (thermo-)stability, and function. CNA is available from http://cpclab.uni-duesseldorf.de/software for nonprofit organizations.

  4. A software package for evaluating the performance of a star sensor operation

    CERN Document Server

    Sarpotdar, Mayuresh; Sreejith, A G; Nirmal, K; Ambily, S; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2016-01-01

    We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms as a star sensor single operating system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each...

  5. ELAN: A Software Package for Analysis and Visualization of MEG, EEG, and LFP Signals

    Directory of Open Access Journals (Sweden)

    Pierre-Emmanuel Aguera

    2011-01-01

    Full Text Available The recent surge in computational power has led to extensive methodological developments and advanced signal processing techniques that play a pivotal role in neuroscience. In particular, the field of brain signal analysis has witnessed a strong trend towards multidimensional analysis of large data sets, for example, single-trial time-frequency analysis of high spatiotemporal resolution recordings. Here, we describe the freely available ELAN software package, which provides a wide range of signal analysis tools for electrophysiological data including scalp electroencephalography (EEG), magnetoencephalography (MEG), intracranial EEG, and local field potentials (LFPs). The ELAN toolbox is based on 25 years of methodological developments at the Brain Dynamics and Cognition Laboratory in Lyon and was used in many papers, including the very first studies of time-frequency analysis of EEG data exploring evoked and induced oscillatory activities in humans. This paper provides an overview of the concepts and functionalities of ELAN, highlights its specificities, and describes its complementarity and interoperability with other toolboxes.

  6. A software package for predicting design-flood hydrographs in small and ungauged basins

    Directory of Open Access Journals (Sweden)

    Rodolfo Piscopia

    2015-06-01

    Full Text Available In this study, software for estimating design hydrographs in small and ungauged basins is presented. The main aim is to propose a fast and user-friendly empirical tool that practitioners can apply in hydrological studies characterised by a lack of observed data. The software implements the recently developed and tested event-based approach for small and ungauged basins (EBA4SUB), a framework of the same name for estimating the design peak discharge from the same input information needed to apply the rational formula. EBA4SUB is a classical hydrological event-based model in which each step (design hyetograph, net rainfall estimation, and rainfall-runoff transformation) is appropriately adapted for empirical applications without calibration. As a case study, the software is applied in a small watershed while varying the hyetograph shape, rainfall peak position, and return period. The results provide an overview of the software and confirm the secondary role of the design rainfall peak position.

  7. A simplified model of software project dynamics

    OpenAIRE

    Ruiz Carreira, Mercedes; Ramos Román, Isabel; Toro Bonilla, Miguel

    2001-01-01

    The simulation of a dynamic model for software development projects (hereinafter SDPs) helps to investigate the impact of a technological change, of different management policies, and of maturity level of organisations over the whole project. In the beginning of the 1990s, with the appearance of the dynamic model for SDPs by Abdel-Hamid and Madnick [Software Project Dynamics: An Integrated Approach, Prentice-Hall, Englewood Cliffs, NJ, 1991], a significant advance took place in the field of p...

  8. Model Checking Software Systems: A Case Study.

    Science.gov (United States)

    1995-03-10

    gained. We suggest a radically different tack: model checking. The two formal objects compared are a finite state machine model of the software...simply terminates. 3.1.1. State Machine Model Let’s consider a simplified model with just one client, one server, and one file. The top graph

  9. NI-79 RAPID ASSESSMENT OF LESION VOLUMES FOR PATIENTS WITH GLIOMA USING THE SMARTBRUSH SOFTWARE PACKAGE

    Science.gov (United States)

    Vaziri, Sana; Lafontaine, Marisa; Olson, Beck; Crane, Jason C.; Chang, Susan; Lupo, Janine; Nelson, Sarah J.

    2014-01-01

    The increasing interest in enhancing the RANO criteria by using quantitative assessments of changes in lesion size and image intensities has highlighted the need for rapid, easy-to-use tools that provide DICOM-compatible outputs for the evaluation of patients with glioma. To evaluate the performance of the SmartBrush software (Brainlab AG), which provides computer-assisted definitions of regions of interest (ROIs), a cohort of 20 patients with glioma (equal numbers having high and low grade, treated and untreated) was scanned using a 3T whole-body MR system prior to surgical resection. The T2-weighted FLAIR, pre- and post-contrast T1-weighted gradient echo DICOM images were pushed from the scanner to an offline workstation, where analysis of lesion volumes was performed using SmartBrush. Volumes of the T2Ls ranged from 7.9 to 110.2 cm3 and volumes of the CELs from 0.1 to 28.5 cm3, with 19/20 of the subjects having CELs and all 20 having T2Ls. The computer-assisted analysis was performed rapidly and efficiently, with a mean time for defining both lesions of 5.77 minutes per subject (range 3.5 to 7.5). Prior analysis of ROIs with the SLICER package (www.slicer.org) took approximately 30 minutes per subject. SmartBrush provides lesion volumes and cross-sectional diameters as a PDF report, which can be stored in DICOM. The ROIs were also saved as DICOM objects and transferred to other packages for performing histogram analysis of ADC or other functional parameter maps. Ongoing studies that will be reported in this presentation are performing a similar analysis with multiple users in order to compare the relative intra- and inter-operator variations in terms of both the speed of analysis and the ROIs that are identified. Acknowledgements: The authors would like to acknowledge Rowena Thomson and Natalie Wright from Brainlab for helping to set up this study.

  10. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). The requirements are transformed into a specification model and the programs into an implementation model. The elements and structures of the two models are then compared, and the differences between them are obtained. Based on the differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are more efficient and smaller than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.

  11. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
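
    The objective functions named in the abstract have standard definitions. The following sketch computes the Nash-Sutcliffe efficiency and the (2009-form) Kling-Gupta efficiency for an observed/simulated pair; note that visCOS itself is an R package, so this Python transliteration and its function names are ours, not the package's API.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency (Gupta et al., 2009 formulation)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]   # linear correlation
    alpha = sim.std() / obs.std()     # variability ratio
    beta = sim.mean() / obs.mean()    # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(nse(obs, obs), kge(obs, obs))  # a perfect simulation scores 1.0 on both
```

    A constant-offset simulation illustrates how the two criteria penalize bias differently: `nse(obs, obs + 1)` is 0.5, while `kge(obs, obs + 1)` is 2/3 because only the beta term departs from 1.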

  12. Development of problem-oriented software packages for numerical studies and computer-aided design (CAD) of gyrotrons

    Science.gov (United States)

    Damyanova, M.; Sabchevski, S.; Zhelyazkov, I.; Vasileva, E.; Balabanova, E.; Dankov, P.; Malinov, P.

    2016-03-01

    Gyrotrons are the most powerful sources of coherent CW (continuous wave) radiation in the frequency range situated between the long-wavelength edge of the infrared light (far-infrared region) and the microwaves, i.e., in the region of the electromagnetic spectrum which is usually called the THz-gap (or T-gap), since the output power of other devices (e.g., solid-state oscillators) operating in this interval is lower by several orders of magnitude. In recent years, the unique capabilities of the sub-THz and THz gyrotrons have opened the road to many novel and prospective applications in various physical studies and advanced high-power terahertz technologies. In this paper, we present the current status and functionality of the problem-oriented software packages (most notably GYROSIM and GYREOSS) used for numerical studies, computer-aided design (CAD) and optimization of gyrotrons for diverse applications. They consist of a hierarchy of codes specialized to modelling and simulation of different subsystems of the gyrotrons (EOS, resonant cavity, etc.) and are based on adequate physical models, efficient numerical methods and algorithms.

  13. SOFTWARE SOLUTIONS FOR ARDL MODELS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2015-07-01

    Full Text Available VAR-type models can be used only for stationary time series. Causality analyses through econometric models require the series to have the same order of integration. Usually, when the series are constrained to comply with these restrictions (e.g. by differencing), economic interpretation of the outcomes may become difficult. A recent solution for mitigating these problems is the use of ARDL (autoregressive distributed lag) models. We present an implementation of these models in EViews and test the impact of the exchange rate on the consumer price index.
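
    An ARDL model is estimated by ordinary least squares on lagged regressors. As an illustration independent of EViews, the sketch below simulates an ARDL(1,1) process and recovers its coefficients with numpy; all data and parameter values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an ARDL(1,1) process:
#   y_t = c + a*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t
n, c, a, b0, b1 = 500, 0.5, 0.6, 1.2, -0.4
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = c + a * y[t - 1] + b0 * x[t] + b1 * x[t - 1] + 0.1 * rng.normal()

# OLS: regress y_t on a constant, y_{t-1}, x_t and x_{t-1}
X = np.column_stack([np.ones(n - 1), y[:-1], x[1:], x[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(coef)  # close to [0.5, 0.6, 1.2, -0.4]
```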

  14. A Software Reliability Model Using Quantile Function

    Directory of Open Access Journals (Sweden)

    Bijamma Thomas

    2014-01-01

    Full Text Available We study a class of software reliability models using quantile function. Various distributional properties of the class of distributions are studied. We also discuss the reliability characteristics of the class of distributions. Inference procedures on parameters of the model based on L-moments are studied. We apply the proposed model to a real data set.
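
    Sample L-moments, on which the inference procedures above rest, are computed from probability-weighted moments. The sketch below (the function name is ours, not from the paper) returns the first two sample L-moments: l1, the location, and l2, the scale.

```python
import numpy as np

def l_moments(sample):
    """First two sample L-moments via probability-weighted moments.

    Uses the standard unbiased estimators: b0 is the sample mean and
    b1 weights the ascending order statistics x_(i) by (i-1)/(n-1).
    Then l1 = b0 and l2 = 2*b1 - b0.
    """
    x = np.sort(np.asarray(sample, float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum(np.arange(n) / (n - 1) * x) / n
    return b0, 2.0 * b1 - b0

l1, l2 = l_moments([1.0, 2.0, 3.0, 4.0])
print(l1, l2)  # 2.5 and ~0.8333 (= 5/6)
```

    As a sanity check, l2 equals half the mean absolute difference over all distinct pairs; for the sample above the six pairwise differences sum to 10, giving l2 = 10/6/2 = 5/6.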

  15. CAMBIO: software for modelling and simulation of bioprocesses.

    Science.gov (United States)

    Farza, M; Chéruy, A

    1991-07-01

    CAMBIO, a software package devoted to bioprocess modelling, which runs on Apollo computers, is described. This software enables bioengineers to easily and interactively design appropriate mathematical models directly from their perception of the process. CAMBIO provides the user with a set of design symbols and mnemonic icons for interactively building a functional diagram. This diagram has to exhibit the most relevant components with their related interactions through biological and physico-chemical reactions. CAMBIO then automatically generates the dynamical material balance equations of the process in the form of an algebraic-differential system by taking advantage of the knowledge contained in the functional diagram. The model may be used for control design purposes or completed with kinetic expressions with a view to simulation. CAMBIO offers facilities to generate a simulation model (for coding of kinetics, introducing auxiliary variables, etc.). This model is automatically interfaced with specialized simulation software which allows an immediate visualization of the process dynamical behaviour under various operational conditions (possibly involving feedback control strategies). An example of an application dealing with yeast fermentation is given.
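
    The material-balance systems that such a tool generates are of the kind sketched below: a minimal Monod-type fermentation model (biomass growth plus substrate consumption) integrated with forward Euler. All parameter values are illustrative, not taken from the paper.

```python
# Forward-Euler integration of a Monod-type fermentation balance:
#   dX/dt = mu(S) * X         (biomass growth)
#   dS/dt = -mu(S) * X / Y    (substrate consumption)
# with mu(S) = mu_max * S / (Ks + S). Parameter values are illustrative.
mu_max, Ks, Y = 0.4, 0.5, 0.5  # 1/h, g/L, g biomass per g substrate
X, S, dt = 0.1, 10.0, 0.01     # initial biomass, substrate, time step (h)

for _ in range(int(24 / dt)):  # simulate 24 hours
    mu = mu_max * S / (Ks + S)
    dX = mu * X
    X += dX * dt
    S -= (dX / Y) * dt

print(X, S)  # substrate nearly exhausted; biomass ~ X0 + Y*S0 = 5.1
```

    Because every gram of consumed substrate yields Y grams of biomass, the quantity X + Y*S is conserved step by step, which is a useful check on any generated balance model.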

  16. A Censored Nonparametric Software Reliability Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyses the effect of censoring on the estimation of the failure rate and presents a framework for a censored nonparametric software reliability model. The model is based on nonparametric testing of a monotonically decreasing failure rate and on weighted kernel failure rate estimation under the constraint that the failure rate is monotonically decreasing. Not only does the model make few assumptions and impose weak constraints, but the number of residual defects in the software system can also be estimated. The numerical experiment and real data analysis show that the model performs well with censored data.
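
    Estimation under a monotonically decreasing shape constraint is typically solved with the pool-adjacent-violators algorithm (PAVA). The sketch below shows an unweighted PAVA fit for a nonincreasing sequence; it illustrates the shape constraint only, not the paper's weighted kernel estimator.

```python
def pava_decreasing(values):
    """Pool-adjacent-violators fit of a nonincreasing sequence.

    Returns the least-squares nonincreasing approximation to `values`:
    adjacent blocks are merged (averaged) whenever their means would
    otherwise increase, which is the constraint a monotonically
    decreasing failure rate imposes.
    """
    blocks = []  # each block is [sum, count]
    for v in values:
        blocks.append([v, 1])
        # Merge while the last block's mean exceeds the previous one's.
        while len(blocks) > 1 and (blocks[-2][0] / blocks[-2][1]
                                   < blocks[-1][0] / blocks[-1][1]):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit

print(pava_decreasing([5.0, 3.0, 4.0, 1.0]))  # [5.0, 3.5, 3.5, 1.0]
```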

  17. Software Architecture Viewpoint Models: A Short Survey

    Directory of Open Access Journals (Sweden)

    Seyyed Ali Razavi Ebrahimi

    2013-11-01

    Full Text Available A software architecture is a complex entity that cannot be described in a simple one-dimensional fashion. The architecture views used to describe software provide the architect with a means of explaining the architecture to stakeholders. Each view presents different aspects of the system that fulfill functional and non-functional requirements. A view of a system is a representation of the system from the perspective of a viewpoint. Architecture viewpoints in software products provide guidelines for describing the total system and its subsystems uniformly. A viewpoint defines the stakeholders whose concerns are reflected in it, as well as the guidelines, principles, and template models for constructing its views. The results of this study may serve as a roadmap for software developers and architects, helping them select the appropriate viewpoint model based on the stakeholders and concerns that need to be covered by views.

  18. The khmer software package: enabling efficient nucleotide sequence analysis [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Michael R. Crusoe

    2015-09-01

    Full Text Available The khmer package is a freely available software library for working efficiently with fixed length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at https://github.com/dib-lab/khmer/.
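
    For contrast with khmer's probabilistic, memory-bounded counting structure, exact k-mer counting on small inputs is a one-liner over a hash table. The sketch below is a generic illustration, not khmer's API.

```python
from collections import Counter

def count_kmers(seq, k):
    """Exact k-mer counting with a hash table.

    khmer itself uses a probabilistic, fixed-memory counting data
    structure to scale to large sequencing data sets; this exact
    dictionary version shows the idea on small inputs.
    """
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = count_kmers("ACGTACGT", 4)
print(counts["ACGT"])  # 2: the 4-mer ACGT occurs at positions 0 and 4
```

    A sequence of length L contains L - k + 1 overlapping k-mers, so the counts above sum to 5.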

  19. Software engineering the mixed model for genome-wide association studies on large samples.

    Science.gov (United States)

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
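
    One of the key elements reviewed above, kinship estimation from markers, is commonly done with a centered genotype cross-product (a VanRaden-style estimator). The sketch below is a generic illustration on random genotypes, not code from any of the packages evaluated.

```python
import numpy as np

def kinship(G):
    """Marker-based kinship matrix (VanRaden-style estimator).

    G is an individuals-by-markers matrix coded 0/1/2 (copies of one
    allele). Genotypes are centered by twice the allele frequency and
    the cross-product is scaled by the summed expected heterozygosity.
    """
    p = G.mean(axis=0) / 2.0       # allele frequency per marker
    W = G - 2.0 * p                # centered genotypes
    denom = 2.0 * np.sum(p * (1.0 - p))
    return W @ W.T / denom

rng = np.random.default_rng(1)
G = rng.integers(0, 3, size=(4, 1000)).astype(float)  # 4 individuals
K = kinship(G)  # 4x4 symmetric relatedness matrix
```

    In the mixed model y = Xb + Zu + e, this K parameterizes the covariance of the random polygenic effect u, which is what corrects association tests for relatedness.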

  20. Model Oriented Approach for Industrial Software Development

    Directory of Open Access Journals (Sweden)

    P. D. Drobintsev

    2015-01-01

    Full Text Available The article considers the specifics of a model oriented approach to software development based on the usage of Model Driven Architecture (MDA), Model Driven Software Development (MDSD) and Model Driven Development (MDD) technologies. The benefits of using this approach in the software development industry are described. The main emphasis is put on system design, automated code generation for large systems, verification, proof of system properties and reduction of bug density. Drawbacks of the approach are also considered. The approach proposed in the article is specific to industrial software systems development. These systems are characterized by different levels of abstraction used in the modeling and code development phases. The approach allows detailing the model down to the level of the system code while preserving the verified model semantics and providing checking of the whole detailed model. Steps for translating abstract data structures (including transactions, signals and their parameters) into the data structures used in the detailed system implementation are presented. The grammar of a language for specifying rules for transforming abstract model data structures into real system detailed data structures is also described. The results of applying the proposed method in an industrial technology are shown. The article is published in the authors' wording.

  1. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  2. The Modularized Software Package ASKI - Full Waveform Inversion Based on Waveform Sensitivity Kernels Utilizing External Seismic Wave Propagation Codes

    Science.gov (United States)

    Schumacher, F.; Friederich, W.

    2015-12-01

    We present the modularized software package ASKI, which is a flexible and extendable toolbox for seismic full waveform inversion (FWI) as well as sensitivity or resolution analysis operating on the sensitivity matrix. It utilizes established wave propagation codes for solving the forward problem and offers an alternative to the monolithic, inflexible and hard-to-modify codes that have typically been written for solving inverse problems. It is available under the GPL at www.rub.de/aski. The Gauss-Newton FWI method for 3D-heterogeneous elastic earth models is based on waveform sensitivity kernels and can be applied to inverse problems at various spatial scales in both Cartesian and spherical geometries. The kernels are derived in the frequency domain from Born scattering theory as the Fréchet derivatives of linearized full waveform data functionals, quantifying the influence of elastic earth model parameters on the particular waveform data values. As an important innovation, we keep two independent spatial descriptions of the earth model - one for solving the forward problem and one representing the inverted model updates. Thereby we account for the independent needs of spatial model resolution of the forward and inverse problems, respectively. Due to pre-integration of the kernels over the (in general much coarser) inversion grid, storage requirements for the sensitivity kernels are dramatically reduced. ASKI can be flexibly extended to other forward codes by providing it with specific interface routines that contain knowledge about forward code-specific file formats and auxiliary information provided by the new forward code. In order to sustain flexibility, the ASKI tools must communicate via file input/output, so large storage capacities need to be accessible in a convenient way. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full

  3. Powder diffraction pattern fitting and structure refinement by means of the CPSR v.3.1 software package

    Science.gov (United States)

    Andreev, Yu. G.; Lundström, T.; Sorokin, N. I.

    1995-02-01

    An updated version of the CPSR software package for powder pattern fitting and structure refinement offers major advantages over previous versions. Optional use of a new figure-of-merit function, which takes into account the systematic behaviour of residuals, allows users to reduce the effect of local correlations at the full-profile fitting stage, thus providing more reliable estimates for integrated intensities and their deviances. The structure refinement stage in such a case yields accurate values for estimated standard deviations of structural parameters since, in addition, model errors affecting calculated integrated intensities are taken into consideration. Furthermore, the new CPSR version is customized for a variety of constant-wavelength neutron and X-ray diffraction techniques and is equipped with an enhanced menu structure. Graphical on-screen-controlled support allows users to follow the progress of a fitting procedure over any region of a powder pattern. The program performance is illustrated using the neutron diffraction data file for PbSO4 distributed during the Rietveld refinement round robin organized by the IUCr Commission on Powder Diffraction.

  4. Management models in the NZ software industry

    Directory of Open Access Journals (Sweden)

    Holger Spill

    Full Text Available This research interviewed eight innovative New Zealand software companies to find out how they manage new product development. It looked at how management used standard techniques of software development to manage product uncertainty through the theoretical lens of the Cyclic Innovation Model. The study found that while there is considerable variation, the management of innovation was largely determined by the level of complexity. Organizations with complex innovative software products had a more iterative software development style, more flexible internal processes and swifter decision-making. Organizations with less complexity in their products tended to use more formal structured approaches. Overall complexity could be inferred with reference to four key factors within the development environment.

  5. Graphical modelling software in R - status

    DEFF Research Database (Denmark)

    Dethlefsen, Claus; Højsgaard, Søren; Lauritzen, Steffen L.

    Graphical models in their modern form have been around for nearly a quarter of a century. Various computer programs for inference in graphical models have been developed over that period. Some examples of free software programs are BUGS (Thomas 1994), CoCo (Badsberg 2001), Digram (Klein, Keiding...

  6. inventory management, VMI, software agents, MDV model

    Directory of Open Access Journals (Sweden)

    Waldemar Wieczerzycki

    2012-03-01

    Full Text Available Background: As is well known, the implementation of instruments of logistics management is only possible with the use of the latest information technology. So-called agent technology is one of the most promising solutions in this area. Its essence consists in an entirely new way of software distribution on the computer network platform, in which computers exchange among themselves not only data but also software modules, called agents. The first aim is to propose an alternative method for implementing the concept of inventory management by the supplier with the use of intelligent software agents, which are able not only to transfer information but also to make autonomous decisions based on the privileges given to them. The second aim of this research was to propose a new model of a software agent with both high mobility and high intelligence. Methods: After a brief discussion of the nature of agent technology, the most important benefits of using it to build platforms to support business are given. Then the original model of a polymorphic software agent, called Multi-Dimensionally Versioned Software Agent (MDV), is presented, which is oriented to the specificity of IT applications in business. The MDV agent is polymorphic, which allows transmitting through the network only the most relevant parts of its code, and only when necessary. Consequently, the network nodes exchange small amounts of software code, which ensures high mobility of software agents and thus highly efficient operation of IT platforms built on the proposed model. Next, the adaptation of MDV software agents to the implementation of a well-known logistics management instrument - VMI (Vendor Managed Inventory) - is illustrated.
Results: The key benefits of this approach are identified, among which one can distinguish: reduced costs, higher flexibility and efficiency, and new functionality - especially addressed to business negotiation, full automation

  7. User-friendly software for modeling collective spin wave excitations

    Science.gov (United States)

    Hahn, Steven; Peterson, Peter; Fishman, Randy; Ehlers, Georg

    There exists a great need for user-friendly, integrated software that assists in the scientific analysis of collective spin wave excitations measured with inelastic neutron scattering. SpinWaveGenie is a C++ software library that simplifies the modeling of collective spin wave excitations, allowing scientists to analyze neutron scattering data with sophisticated models quickly and efficiently. Furthermore, one can calculate the four-dimensional scattering function S(Q,E) to directly compare and fit calculations to experimental measurements. Its generality has been both enhanced and verified through successful modeling of a wide array of magnetic materials. Recently, we have spent considerable effort transforming SpinWaveGenie from an early prototype to a high quality free open source software package for the scientific community. S.E.H. acknowledges support by the Laboratory Director's fund, ORNL. Work was sponsored by the Division of Scientific User Facilities, Office of Basic Energy Sciences, US Department of Energy, under Contract No. DE-AC05-00OR22725 with UT-Battelle, LLC.

  8. Architecture design in global and model-centric software development

    NARCIS (Netherlands)

    Heijstek, Werner

    2012-01-01

    This doctoral dissertation describes a series of empirical investigations into representation, dissemination and coordination of software architecture design in the context of global software development. A particular focus is placed on model-centric and model-driven software development.

  10. PDBStat: a universal restraint converter and restraint analysis software package for protein NMR

    Energy Technology Data Exchange (ETDEWEB)

    Tejero, Roberto [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States); Snyder, David [William Paterson University, Department of Chemistry (United States); Mao, Binchen; Aramini, James M.; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States)

    2013-08-15

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data.

  11. Documentation package for the RFID temperature monitoring system (of Model 9977 packages at NTS).

    Energy Technology Data Exchange (ETDEWEB)

    Chen, K.; Tsai, H.; Decision and Information Sciences

    2009-02-20

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval to a maximum of five years rests on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory with developing a Radio-Frequency Identification (RFID) temperature monitoring system for use by facility personnel at DAF/NTS. The RFID temperature monitoring system, depicted in the figure below, consists of the MK-1 RFID tags, a reader, and a control computer mounted on a mobile platform; it can operate as a stand-alone system or be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and certified to operate the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS. The documentation package will be used for training and certification. The table of contents is: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The

  12. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    DEFF Research Database (Denmark)

    Ashraf, Haseem; de Hoop, B; Shaker, S B;

    2010-01-01

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms.

  13. The gRbase package for graphical modelling in R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Dethlefsen, Claus

    We have developed a package, called gRbase, consisting of a number of classes and associated methods to support the analysis of data using graphical models. It is developed for the open source language R and is available for several platforms. The package is intended to be widely extendible and flexible so that package developers may implement further types of graphical models using the available methods. gRbase contains methods for representing data and for specifying models using a formal language, and it is linked to dynamicGraph, an interactive graphical user interface for manipulating graphs. We show how these building blocks can be combined and integrated with inference engines in the special case of hierarchical log-linear models (undirected models).

  14. Structural Equation Modeling Diagnostics Using R Package Semdiag and EQS

    Science.gov (United States)

    Yuan, Ke-Hai; Zhang, Zhiyong

    2012-01-01

    Yuan and Hayashi (2010) introduced 2 scatter plots for model and data diagnostics in structural equation modeling (SEM). However, the generation of the plots requires in-depth understanding of their underlying technical details. This article develops and introduces an R package semdiag for easily drawing the 2 plots. With a model specified in EQS…

  15. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  16. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including the ability to handle case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in
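The Monte Carlo workflow the abstract describes (stratified or Latin hypercube sampling of an uncertain input, propagation through a model, summary of the output distribution) can be sketched in a few lines. 'spup' is an R package; the Python below is a hypothetical analogue, not its API, and the quadratic "environmental model" is invented for illustration.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube(n, rng):
    """One stratified uniform draw per equal-probability stratum, shuffled."""
    return (rng.permutation(n) + rng.uniform(1e-9, 1.0 - 1e-9, n)) / n

def propagate(model, mean, sd, n=2000, seed=0):
    """Monte Carlo propagation of one Gaussian-uncertain input through `model`."""
    rng = np.random.default_rng(seed)
    u = latin_hypercube(n, rng)
    nd = NormalDist(mean, sd)
    x = np.array([nd.inv_cdf(ui) for ui in u])  # inverse-CDF transform
    y = model(x)
    return y.mean(), y.std(ddof=1)

# invented 'environmental model': quadratic response to an uncertain input
m, s = propagate(lambda x: 0.5 * x ** 2 + 3.0, mean=2.0, sd=0.5)
```

Because the strata cover the input distribution evenly, the Latin hypercube estimate of the output mean converges noticeably faster than plain random sampling.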

  17. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. 
Selected static and interactive visualization methods that are understandable by non-experts with limited background in

  18. Hidden Semi Markov Models for Multiple Observation Sequences: The mhsmm Package for R

    DEFF Research Database (Denmark)

    O'Connell, Jarad Michael; Højsgaard, Søren

    2011-01-01

    This paper describes the R package mhsmm which implements estimation and prediction methods for hidden Markov and semi-Markov models for multiple observation sequences. Such techniques are of interest when observed data is thought to be dependent on some unobserved (or hidden) state. Hidden Markov models only allow a geometrically distributed sojourn time in a given state, while hidden semi-Markov models extend this by allowing an arbitrary sojourn distribution. We demonstrate the software with simulation examples and an application involving the modelling of the ovarian cycle of dairy cows.
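The distinction drawn above, a geometric sojourn time in an HMM versus an arbitrary sojourn distribution in an HSMM, can be illustrated by simulation. This is a hypothetical Python sketch, not mhsmm's R interface; the uniform sojourn distribution is an arbitrary example.

```python
import random

def hmm_sojourn(p_stay, rng):
    """HMM sojourn: geometric; each step stays in-state with probability p_stay."""
    t = 1
    while rng.random() < p_stay:
        t += 1
    return t

def hsmm_sojourn(draw, rng):
    """HSMM sojourn: drawn from an arbitrary user-supplied distribution."""
    return draw(rng)

rng = random.Random(42)
geo = [hmm_sojourn(0.8, rng) for _ in range(10000)]
uni = [hsmm_sojourn(lambda r: r.randint(3, 7), rng) for _ in range(10000)]
mean_geo = sum(geo) / len(geo)  # expected value 1 / (1 - 0.8) = 5
```

Both examples have mean sojourn 5, but the geometric law puts most mass on very short stays while the uniform law confines durations to {3,...,7}; that shape difference is exactly what the semi-Markov extension buys.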

  19. Model Driven Software Development for Agricultural Robotics

    DEFF Research Database (Denmark)

    Larsen, Morten

    The design and development of agricultural robots consists of mechanical, electrical and software components. All these components must be designed and combined such that the overall goal of the robot is fulfilled. The design and development of these systems require collaboration between … processing, control engineering, etc. This thesis proposes a Model-Driven Software Development based approach to model, analyse and partially generate the software implementation of an agricultural robot. Furthermore, guidelines for modelling the architecture of agricultural robots are provided, assisting with bridging the different engineering disciplines. Timing plays an important role in agricultural robotic applications: synchronisation of robot movement and implement actions is important in order to achieve precision spraying, mechanical weeding, individual feeding, etc. Discovering …

  20. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  1. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and object-oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.

  2. GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.

    Science.gov (United States)

    Nazarian, Alireza; Gezan, Salvador Alejandro

    2016-07-01

    Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven efficient for partitioning the phenotypic variance of complex traits into its components, estimating individuals' genetic merits, and predicting unobserved (or yet-to-be-observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package to facilitate the process of using genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses on complex traits. It provides users with a collection of applications which help them on a set of tasks, from performing quality control on data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices are then used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a Windows 64-bit executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/.
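A minimal sketch of the central object the abstract mentions, a genomic relationship matrix for G-BLUP, here following VanRaden's first method as one common construction. This is illustrative, not necessarily GenoMatrix's implementation, and the genotype matrix is simulated.

```python
import numpy as np

def grm(M):
    """VanRaden method-1 GRM from an (individuals x markers) 0/1/2 matrix."""
    p = M.mean(axis=0) / 2.0             # estimated allele frequencies
    Z = M - 2.0 * p                      # centre genotypes by twice the frequency
    denom = 2.0 * np.sum(p * (1.0 - p))  # expected-variance scaling
    return Z @ Z.T / denom

rng = np.random.default_rng(1)
M = rng.integers(0, 3, size=(6, 500)).astype(float)  # simulated genotypes
G = grm(M)  # symmetric matrix of realised genomic relationships
```

In a G-BLUP analysis the inverse of `G` (or of a pedigree-based `A`) enters the mixed-model equations, which is why the package emphasises constructing the matrices and obtaining their inverses.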

  3. Mass Transfer Model for a Breached Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    C. Hsu; J. McClure

    2004-07-26

    The degradation of waste packages, which are used for the disposal of spent nuclear fuel in the repository, can result in configurations that may increase the probability of criticality. A mass transfer model is developed for a breached waste package to account for the entrainment of insoluble particles. In combination with radionuclide decay, soluble advection, and colloidal transport, a complete mass balance of nuclides in the waste package becomes available. The entrainment equations are derived from dimensionless parameters such as the drag coefficient and Reynolds number, based on the assumption that insoluble particles are subject to buoyant, gravitational, and drag forces only. Particle size distributions are used to calculate the entrainment concentration, along with a geochemistry model abstraction to calculate the soluble concentration and a colloid model abstraction to calculate colloid concentration and radionuclide sorption. Results are compared with the base-case geochemistry model, which considers only soluble advection loss.
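The force balance the abstract describes (gravity, buoyancy and drag on an insoluble particle) fixes the upward flow speed at which a particle is just entrained. The sketch below solves that balance for a sphere, assuming the Schiller-Naumann drag correlation and invented particle properties; the report's actual correlations and parameter values may differ.

```python
import math

def threshold_velocity(d, rho_p, rho_f, mu, g=9.81):
    """Upward flow speed (m/s) at which drag balances the net weight of a sphere."""
    v = 1e-4
    for _ in range(200):
        re = max(rho_f * v * d / mu, 1e-12)           # particle Reynolds number
        cd = 24.0 / re * (1.0 + 0.15 * re ** 0.687)   # Schiller-Naumann drag law
        v_new = math.sqrt(4.0 * g * d * (rho_p - rho_f) / (3.0 * cd * rho_f))
        if abs(v_new - v) < 1e-12:
            break                                      # fixed point reached
        v = v_new
    return v

# 50-micron heavy-oxide particle in water (illustrative property values)
v = threshold_velocity(d=50e-6, rho_p=10960.0, rho_f=1000.0, mu=1.0e-3)
```

Because the drag coefficient itself depends on velocity, the balance is solved by fixed-point iteration; in the near-Stokes regime the answer lands close to the classical settling velocity g d² Δρ / (18 μ).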

  4. SAPHIRE models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Schroeder, J.A.; Russell, K.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
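The cutset generation and retention in item (3) rests on a classic fault-tree computation: expanding AND/OR gates into sets of basic events and keeping only the minimal sets. A toy sketch of that idea (not SAPHIRE's algorithm or data model), with an invented two-train example sharing a common valve:

```python
def cutsets(gate):
    """Minimal cut sets of a fault tree given as nested ('AND'|'OR', child...) tuples."""
    op, kids = gate[0], gate[1:]
    child_sets = [cutsets(k) if isinstance(k, tuple) else [{k}] for k in kids]
    if op == "OR":                   # any child's cut set fails the gate
        out = [s for cs in child_sets for s in cs]
    else:                            # AND: one cut set from each child, unioned
        out = [set()]
        for cs in child_sets:
            out = [a | b for a in out for b in cs]
    return [s for s in out if not any(t < s for t in out)]  # keep minimal sets

# two redundant trains that share a common valve
tree = ("AND", ("OR", "pumpA", "valveA"), ("OR", "pumpB", "valveA"))
mcs = cutsets(tree)  # the shared valve surfaces as a single-event cut set
```

The minimality filter is what exposes common-cause weak points: the shared valve alone fails the system, while the pumps only fail it together.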

  5. COALVAL: A prefeasibility software package for evaluating coal properties using Lotus 1-2-3, Release 2.2. Documentation and user's guide. Information circular/1993

    Energy Technology Data Exchange (ETDEWEB)

    Plis, M.N.; Rohrbacher, T.J.; Teeters, D.D.

    1993-01-01

    The U.S. Bureau of Mines report presents the documentation for COALVAL, a coal property evaluation software package developed on Lotus 1-2-3, version 2.2, spreadsheets. The software is compatible with version 3.1 as well, and may provisionally be run on the earlier 2.01 version. COALVAL is a menu-driven program that produces a prefeasibility-level cost analysis of mine-planned coal resources. The package contains cost models for each of five coal mining methods commonly employed in Appalachia: auger, contour strip, mountain top removal, continuous miner, and longwall. Other models, such as a dragline cost model, will be incorporated as the Bureau's Coal Recoverability Program matures. COALVAL allows mine operators, evaluators, consultants, and Government entities to input resource data and the various production, operating, and cost variables that pertain to their property. The program can evaluate up to 25 seams, each to be mined with up to five different mining methods, within a given area. Summary spreadsheets listing the cost per clean ton to mine the resources, f.o.b. the tipple, are produced for each property, seam, and mining method/seam combination.

  6. Memoised Garbage Collection for Software Model Checking

    NARCIS (Netherlands)

    Nguyen, V.Y.; Ruys, T.C.; Kowalewski, S.; Philippou, A.

    Virtual machine based software model checkers like JPF and MoonWalker spend up to half of their verification time on garbage collection. This is no surprise, as after nearly each transition the heap has to be cleaned of garbage. To improve this, this paper presents the Memoised Garbage Collection

  7. PmagPy: Software package for paleomagnetic data analysis and a bridge to the Magnetics Information Consortium (MagIC) Database

    Science.gov (United States)

    Tauxe, L.; Shaar, R.; Jonestrask, L.; Swanson-Hysell, N. L.; Minnett, R.; Koppers, A. A. P.; Constable, C. G.; Jarboe, N.; Gaastra, K.; Fairchild, L.

    2016-06-01

    The Magnetics Information Consortium (MagIC) database provides an archive with a flexible data model for paleomagnetic and rock magnetic data. The PmagPy software package is a cross-platform and open-source set of tools written in Python for the analysis of paleomagnetic data that serves as one interface to MagIC, accommodating various levels of user expertise. PmagPy facilitates thorough documentation of sampling, measurements, data sets, visualization, and interpretation of paleomagnetic and rock magnetic experimental data. Although not the only route into the MagIC database, PmagPy makes preparation of newly published data sets for contribution to MagIC as a byproduct of normal data analysis and allows manipulation as well as reanalysis of data sets downloaded from MagIC with a single software package. The graphical user interface (GUI), Pmag GUI enables use of much of PmagPy's functionality, but the full capabilities of PmagPy extend well beyond that. Over 400 programs and functions can be called from the command line interface mode, or from within the interactive Jupyter notebooks. Use of PmagPy within a notebook allows for documentation of the workflow from the laboratory to the production of each published figure or data table, making research results fully reproducible. The PmagPy design and its development using GitHub accommodates extensions to its capabilities through development of new tools by the user community. Here we describe the PmagPy software package and illustrate the power of data discovery and reuse through a reanalysis of published paleointensity data which illustrates how the effectiveness of selection criteria can be tested.
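One staple computation in this domain is the Fisher mean of a set of paleomagnetic directions. The sketch below is standalone, hypothetical code for illustration, not PmagPy's actual functions (its own routines and naming differ); the four example directions are invented.

```python
import math

def fisher_mean(dirs):
    """Fisher mean of (declination, inclination) pairs given in degrees."""
    x = sum(math.cos(math.radians(i)) * math.cos(math.radians(d)) for d, i in dirs)
    y = sum(math.cos(math.radians(i)) * math.sin(math.radians(d)) for d, i in dirs)
    z = sum(math.sin(math.radians(i)) for d, i in dirs)
    r = math.sqrt(x * x + y * y + z * z)      # resultant vector length
    dec = math.degrees(math.atan2(y, x)) % 360.0
    inc = math.degrees(math.asin(z / r))
    k = (len(dirs) - 1) / (len(dirs) - r)     # Fisher precision parameter
    return dec, inc, k

# four tightly clustered example directions straddling north
dirs = [(350.0, 55.0), (5.0, 60.0), (358.0, 57.0), (2.0, 62.0)]
dec, inc, k = fisher_mean(dirs)
```

Averaging the unit vectors rather than the raw angles is essential here: declinations straddling 360° would otherwise average to a spurious southward direction.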

  8. Development and Evaluation of an Open-Source Software Package “CGITA” for Quantifying Tumor Heterogeneity with Molecular Images

    Directory of Open Access Journals (Sweden)

    Yu-Hua Dean Fang

    2014-01-01

    Background. The quantification of tumor heterogeneity with molecular images, by analyzing the local or global variation in the spatial arrangements of pixel intensity with texture analysis, possesses great clinical potential for treatment planning and prognosis. To address the lack of available software for computing tumor heterogeneity in the public domain, we developed a software package, namely, the Chang-Gung Image Texture Analysis (CGITA) toolbox, and provide it to the research community as a free, open-source project. Methods. With a user-friendly graphical interface, CGITA provides users with an easy way to compute more than seventy heterogeneity indices. To test and demonstrate the usefulness of CGITA, we used a small cohort of eighteen locally advanced oral cavity (ORC) cancer patients treated with definitive radiotherapies. Results. In our case study of ORC data, we found that more than ten of the currently implemented heterogeneity indices outperformed SUVmean for outcome prediction in the ROC analysis, with a higher area under the curve (AUC). Heterogeneity indices provided an area under the curve of up to 0.9, compared with 0.6 and 0.52 for SUVmean and TLG, respectively. Conclusions. CGITA is a free and open-source software package to quantify tumor heterogeneity from molecular images. CGITA is available for free for academic use at http://code.google.com/p/cgita.

  9. Development and evaluation of an open-source software package "CGITA" for quantifying tumor heterogeneity with molecular images.

    Science.gov (United States)

    Fang, Yu-Hua Dean; Lin, Chien-Yu; Shih, Meng-Jung; Wang, Hung-Ming; Ho, Tsung-Ying; Liao, Chun-Ta; Yen, Tzu-Chen

    2014-01-01

    The quantification of tumor heterogeneity with molecular images, by analyzing the local or global variation in the spatial arrangements of pixel intensity with texture analysis, possesses great clinical potential for treatment planning and prognosis. To address the lack of available software for computing tumor heterogeneity in the public domain, we developed a software package, namely, the Chang-Gung Image Texture Analysis (CGITA) toolbox, and provide it to the research community as a free, open-source project. With a user-friendly graphical interface, CGITA provides users with an easy way to compute more than seventy heterogeneity indices. To test and demonstrate the usefulness of CGITA, we used a small cohort of eighteen locally advanced oral cavity (ORC) cancer patients treated with definitive radiotherapies. In our case study of ORC data, we found that more than ten of the currently implemented heterogeneity indices outperformed SUVmean for outcome prediction in the ROC analysis, with a higher area under the curve (AUC). Heterogeneity indices provided an area under the curve of up to 0.9, compared with 0.6 and 0.52 for SUVmean and TLG, respectively. CGITA is a free and open-source software package to quantify tumor heterogeneity from molecular images. CGITA is available for free for academic use at http://code.google.com/p/cgita.
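One common family of heterogeneity indices of the kind CGITA implements is based on the grey-level co-occurrence matrix (GLCM). The sketch below computes GLCM entropy for horizontal neighbours; the quantisation scheme and index choice are illustrative, not CGITA's code, and the two toy "tumour" images are synthetic.

```python
import numpy as np

def glcm_entropy(img, levels=8):
    """Quantise an image and return the entropy of its horizontal-neighbour GLCM."""
    q = np.floor((img - img.min()) / (np.ptp(img) + 1e-12) * levels)
    q = np.clip(q, 0, levels - 1).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1.0                 # count grey-level pairs
    p = glcm / glcm.sum()
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

flat = np.ones((16, 16))                  # homogeneous region: entropy 0
rng = np.random.default_rng(0)
noisy = rng.random((16, 16))              # heterogeneous region: high entropy
```

A perfectly uniform lesion concentrates all co-occurrences in one cell (zero entropy), while a textured one spreads them out, which is why such indices can separate outcomes that SUVmean alone misses.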

  10. Developing Project Duration Models in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Pierre Bourque; Serge Oligny; Alain Abran; Bertrand Fournier

    2007-01-01

    Based on the empirical analysis of data contained in the International Software Benchmarking Standards Group (ISBSG) repository, this paper presents software engineering project duration models based on project effort. Duration models are built for the entire dataset and for subsets of projects developed for personal computer, mid-range and mainframe platforms. Duration models are also constructed for projects requiring fewer than 400 person-hours of effort and for projects requiring more than 400 person-hours of effort. The usefulness of adding the maximum number of assigned resources as a second independent variable to explain duration is also analyzed. The opportunity to build duration models directly from project functional size in function points is investigated as well.
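The kind of duration-from-effort model described above can be sketched as a log-log least-squares fit. The project data below are invented for illustration and are not from the ISBSG repository.

```python
import numpy as np

# hypothetical (effort in person-hours, duration in months) project pairs
effort = np.array([120.0, 400.0, 950.0, 2400.0, 5600.0, 12000.0])
duration = np.array([2.0, 3.5, 5.0, 8.0, 12.0, 18.0])

# fit log(duration) = a + b * log(effort) by least squares
b, a = np.polyfit(np.log(effort), np.log(duration), 1)

def predict_months(e):
    """Predicted duration (months) for a project of e person-hours of effort."""
    return float(np.exp(a) * e ** b)
```

A slope b well below 1 captures the familiar observation that doubling effort does not double calendar duration; segmenting the fit by platform or by the 400 person-hour threshold, as the paper does, amounts to fitting separate (a, b) pairs per subset.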

  11. Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V&V guideline packages and procedures. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, forming three Classes). A V&V Guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the Guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The Guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  12. Evaluation of a software package for automated quality assessment of contrast detail images--comparison with subjective visual assessment.

    Science.gov (United States)

    Pascoal, A; Lawinski, C P; Honey, I; Blake, P

    2005-12-07

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA(detector), which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  13. Evaluation of a software package for automated quality assessment of contrast detail images-comparison with subjective visual assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pascoal, A [Medical Engineering and Physics, King's College London, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]; Lawinski, C P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]; Honey, I [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]; Blake, P [KCARE - King's Centre for Assessment of Radiological Equipment, King's College Hospital, Faraday Building, Denmark Hill, London SE5 8RX (United Kingdom)]

    2005-12-07

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA(detector), which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  14. Evaluation of a software package for automated quality assessment of contrast detail images—comparison with subjective visual assessment

    Science.gov (United States)

    Pascoal, A.; Lawinski, C. P.; Honey, I.; Blake, P.

    2005-12-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA(detector), which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  15. Plans for performance and model improvements in the LISE++ software

    Science.gov (United States)

    Kuchera, M. P.; Tarasov, O. B.; Bazin, D.; Sherrill, B. M.; Tarasova, K. V.

    2016-06-01

    The LISE++ software for fragment separator simulations is undergoing a major update. LISE++ is the standard software used at in-flight separator facilities for predicting beam intensity and purity. The code simulates nuclear physics experiments in which fragments are produced and then selected with a fragment separator. A set of modifications to improve the functionality of the code is discussed in this work. These modifications include migration to a modern graphics framework and updated compilers to aid the performance and sustainability of the code. To accommodate the diversity of our users' computer platform preferences, we are extending the software from Windows to a cross-platform application. The calculations of beam transport and isotope production are becoming more computationally intense at the new large-scale facilities. Planned new features include new types of optimization (for example, optimization of ion optics), improvements in reaction models, and new event-generator options. In addition, an interface between LISE++ and control systems is planned. Computational improvements as well as the schedule for updating this large package are discussed.

  16. A Kinetic Model for Predicting the Relative Humidity in Modified Atmosphere Packaging and Its Application in Lentinula edodes Packages

    Directory of Open Access Journals (Sweden)

    Li-xin Lu

    2013-01-01

    Adjusting and controlling the relative humidity (RH) inside a package is crucial for ensuring the quality of modified atmosphere packaging (MAP) of fresh produce. In this paper, an improved kinetic model for predicting the RH in MAP was developed. The model was based on heat exchange and gas mass transport across the package, gas heat convection inside the package, and mass and heat balances accounting for the respiration and transpiration behavior of the fresh produce. The model was then applied to predict the RH in MAP of fresh Lentinula edodes (a Chinese mushroom). The model equations were solved numerically using the Adams-Moulton method to predict the RH in model packages. In general, the model predictions agreed well with the experimental data, except that the predictions were slightly high in the initial period. The effect of the initial gas composition on the RH in the packages was notable: in MAP with lower oxygen and higher carbon dioxide concentrations, the rate of RH increase was reduced, and the RH inside the packages approached saturation more slowly during storage. The influence of the initial gas composition on the temperature inside the package was less notable.
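    The Adams-Moulton scheme named above belongs to a standard family of implicit multistep integrators, usually run in predictor-corrector form. The sketch below is illustrative only: the relaxation ODE, rate constant `k` and saturation value `rh_sat` are invented stand-ins, not the paper's actual MAP model.

    ```python
    # Predictor-corrector sketch: 2nd-order Adams-Bashforth predictor with an
    # Adams-Moulton (trapezoidal) corrector. The toy ODE (RH relaxing toward
    # saturation) is purely illustrative, NOT the paper's MAP model.

    def adams_moulton(f, y0, t0, t1, n):
        """Integrate dy/dt = f(t, y) from t0 to t1 in n steps."""
        h = (t1 - t0) / n
        t, y = t0, y0
        f_prev = f(t, y)
        # Bootstrap the multistep scheme with one Heun (RK2) step.
        y = y + h / 2.0 * (f_prev + f(t + h, y + h * f_prev))
        t += h
        for _ in range(n - 1):
            f_cur = f(t, y)
            y_pred = y + h / 2.0 * (3.0 * f_cur - f_prev)   # AB2 predictor
            y = y + h / 2.0 * (f_cur + f(t + h, y_pred))    # AM2 corrector
            f_prev = f_cur
            t += h
        return y

    # Hypothetical relaxation of RH toward saturation: dRH/dt = k * (RH_sat - RH)
    k, rh_sat = 0.5, 100.0
    rh_end = adams_moulton(lambda t, rh: k * (rh_sat - rh), 60.0, 0.0, 10.0, 1000)
    ```

    For this linear test problem the exact answer is RH(10) = 100 - 40·exp(-5) ≈ 99.73, so the sketch can be checked against a closed form before applying the same scheme to a coupled mass/heat balance.
    
    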

  17. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics ... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation ... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has ...

  18. MODEL 9975 SHIPPING PACKAGE FABRICATION PROBLEMS AND SOLUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    May, C; Allen Smith, A

    2008-05-07

    The Model 9975 Shipping Package is the latest in a series (9965, 9968, etc.) of radioactive material shipping packages that have been the mainstay for shipping radioactive materials for several years. The double containment vessels are relatively simple designs using pipe and pipe cap in conjunction with the Chalfont closure to provide a leak-tight vessel. The fabrication appears simple in nature, but the history of fabrication tells us there are pitfalls in the different fabrication methods and sequences. This paper will review the problems that have arisen during fabrication and precautions that should be taken to meet specifications and tolerances. The problems and precautions can also be applied to the Models 9977 and 9978 Shipping Packages.

  19. Expanded Content Envelope For The Model 9977 Packaging

    Energy Technology Data Exchange (ETDEWEB)

    Abramczyk, G. A.; Loftin, B. M.; Nathan, S. J.; Bellamy, J. S.

    2013-07-30

    An Addendum was written to the Model 9977 Safety Analysis Report for Packaging (SARP) adding a new content, DOE-STD-3013 stabilized plutonium dioxide materials, to the authorized Model 9977 contents. The new Plutonium Oxide Content (PuO2) Envelope will support Department of Energy shipment of materials between Los Alamos National Laboratory and Savannah River Site facilities. The new content extended the current content envelope boundaries for radioactive material mass and for decay heat load, and required a revision of the 9977 Certificate of Compliance prior to shipment. The Addendum documented how the new contents/configurations do not compromise the safety basis presented in the 9977 SARP Revision 2. The changes from the certified package baseline and the changes to the package required to safely transport this material are discussed.

  20. Free tools to model software

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo Óscar Yovany Baquero Moreno

    2010-11-01

    An observation on free software and its implications for software development with 4G tools by organizations or individuals without astronomical capital and without the monopolistic mentality of dominating the market with costly products that make their vendors multimillionaires while offering no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it when it does not meet our expectations.

  1. Software systems for modeling articulated figures

    Science.gov (United States)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet insure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  2. VR Lab ISS Graphics Models Data Package

    Science.gov (United States)

    Paddock, Eddie; Homan, Dave; Bell, Brad; Miralles, Evely; Hoblit, Jeff

    2016-01-01

    All the ISS models are saved in the AC3D model format, a text-based format that can be loaded into Blender and exported from there to other formats, including FBX. The models are saved at two levels of detail, one labeled "LOWRES" and the other labeled "HIRES". Two ".str" files (HIRES_scene_load.str and LOWRES_scene_load.str) give the hierarchical relationship of the different nodes and the models associated with each node for the "HIRES" and "LOWRES" model sets, respectively. All the images used for texturing are stored in the Windows ".bmp" format for easy importing.

  3. ORGANIZATIONAL LEARNING AND VENDOR SUPPORT QUALITY BY THE USAGE OF APPLICATION SOFTWARE PACKAGES: A STUDY OF ASIAN ENTREPRENEURS

    Institute of Scientific and Technical Information of China (English)

    Nelson Oly NDUBISI; Omprakash K.GUPTA; Samia MASSOUD

    2003-01-01

    In this paper we study how organizational learning impacts organizational behavior, and how vendor support quality enhances product adoption and usage behavior. These constructs were verified using Application Software Packages (ASP): prewritten, precoded, commercially available sets of programs that eliminate the need for individuals or organizations to write their own software for certain functions. The relationships between ASP usage, usage outcomes and use processes were also investigated. Two hundred and ninety-five Chinese, Indian, and Malay entrepreneurships were studied. It was found that usage outcome strongly determines usage, while use process has only an indirect relationship (via outcome) with usage. The impacts of organizational learning and vendor service quality on usage, usage outcome, and use process were robust. Theoretical and practical implications of the research are discussed.

  4. The consequences of a new software package for the quantification of gated-SPECT myocardial perfusion studies

    Energy Technology Data Exchange (ETDEWEB)

    Veen, Berlinda J. van der; Dibbets-Schneider, Petra; Stokkel, Marcel P.M. [Leiden University Medical Center, Department of Nuclear Medicine, Leiden (Netherlands); Scholte, Arthur J. [Leiden University Medical Center, Department of Cardiology, Leiden (Netherlands)

    2010-09-15

    Semiquantitative analysis of myocardial perfusion scintigraphy (MPS) has reduced inter- and intraobserver variability, and enables researchers to compare parameters in the same patient over time, or between groups of patients. Several software packages are available that are designed to process MPS data and quantify parameters. In this study the performances of two systems, quantitative gated SPECT (QGS) and 4D-MSPECT, in the processing of clinical patient data and phantom data were compared. The clinical MPS data of 148 consecutive patients were analysed using QGS and 4D-MSPECT to determine the end-diastolic volume, end-systolic volume and left ventricular ejection fraction. Patients were divided into groups based on gender, body mass index, heart size, stressor type and defect type. The AGATE dynamic heart phantom was used to provide reference values for the left ventricular ejection fraction. Although the correlations were excellent (correlation coefficients 0.886 to 0.980) for all parameters, significant differences (p < 0.001) were found between the systems. Bland-Altman plots indicated that 4D-MSPECT provided overall higher values of all parameters than QGS. These differences between the systems were not significant in patients with a small heart (end-diastolic volume <70 ml). Other clinical factors had no direct influence on the relationship. Additionally, the phantom data indicated good linear responses of both systems. The discrepancies between these software packages were clinically relevant, and influenced by heart size. The possibility of such discrepancies should be taken into account when a new quantitative software system is introduced, or when multiple software systems are used in the same institution.
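    The Bland-Altman analysis used above to compare the two packages reduces to a bias (mean difference) and 95% limits of agreement. A minimal sketch follows, with made-up ejection-fraction values rather than the study's data:

    ```python
    # Minimal Bland-Altman sketch for comparing two quantification systems
    # (e.g. ejection fractions from QGS vs 4D-MSPECT). The numbers below are
    # invented for illustration; they are not data from the study.
    import statistics

    def bland_altman(a, b):
        """Return bias (mean difference a-b) and 95% limits of agreement."""
        diffs = [x - y for x, y in zip(a, b)]
        bias = statistics.mean(diffs)
        sd = statistics.stdev(diffs)          # sample SD of the differences
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    qgs = [55.0, 60.0, 48.0, 72.0, 65.0]      # hypothetical LVEF (%)
    msp = [58.0, 63.0, 50.0, 76.0, 70.0]
    bias, (lo, hi) = bland_altman(msp, qgs)   # positive bias: 4D-MSPECT higher
    ```

    A systematic offset between systems shows up as a non-zero bias, while wide limits of agreement flag clinically relevant scatter even when the correlation coefficient looks excellent.
    
    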

  5. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores.

    Science.gov (United States)

    Chikkagoudar, Satish; Wang, Kai; Li, Mingyao

    2011-05-26

    Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) have hundreds of cores and have recently been used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
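    The fragment-partitioning strategy described above can be sketched as follows. This is an illustrative enumeration only: the `pairs_*` helpers and the thread pool are hypothetical stand-ins, and GENIE's actual statistical interaction test (and its GPU path) is not reproduced here.

    ```python
    # Sketch of the partitioning scheme from the abstract: split the SNPs into
    # non-overlapping fragments, then enumerate interactions within each
    # fragment and between fragment pairs in parallel. The real package scores
    # each pair statistically; here each "job" merely lists the pairs.
    from concurrent.futures import ThreadPoolExecutor
    from itertools import combinations

    def fragments(snps, size):
        """Partition the SNP list into non-overlapping fragments."""
        return [snps[i:i + size] for i in range(0, len(snps), size)]

    def pairs_within(frag):
        """All unordered SNP pairs inside one fragment."""
        return list(combinations(frag, 2))

    def pairs_between(frag_a, frag_b):
        """All SNP pairs spanning two different fragments."""
        return [(a, b) for a in frag_a for b in frag_b]

    def all_interaction_pairs(snps, size, workers=4):
        frags = fragments(snps, size)
        jobs = [(pairs_within, (f,)) for f in frags]
        jobs += [(pairs_between, (fa, fb)) for fa, fb in combinations(frags, 2)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futures = [pool.submit(fn, *args) for fn, args in jobs]
            return [pair for fut in futures for pair in fut.result()]

    pairs = all_interaction_pairs([f"snp{i}" for i in range(10)], size=4)
    # Every unordered pair of 10 SNPs is enumerated exactly once: C(10, 2) = 45.
    ```

    The within-fragment and between-fragment jobs are independent, which is what makes the scheme embarrassingly parallel across CPU cores or GPU blocks.
    
    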

  6. High power electronics package: from modeling to implementation

    NARCIS (Netherlands)

    Yuan, C.A.; Kregting, R.; Ye, H.; Driel, W. van; Gielen, A.W.J.; Zhang, G.Q.

    2011-01-01

    Power electronics, such as high power RF components and high power LEDs, requires the combination of robust and reliable package structures, materials, and processes to guarantee their functional performance and lifetime. We started with the thermal and thermal-mechanical modeling of such component

  7. Artificial Intelligence Software Engineering (AISE) model

    Science.gov (United States)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  9. Linear latent variable models: the lava-package

    DEFF Research Database (Denmark)

    Holst, Klaus Kähler; Budtz-Jørgensen, Esben

    2013-01-01

    An R package for specifying and estimating linear latent variable models is presented. The philosophy of the implementation is to separate the model specification from the actual data, which leads to a dynamic and easy way of modeling complex hierarchical structures. Several advanced features are implemented, including robust standard errors for clustered correlated data, multigroup analyses, non-linear parameter constraints, inference with incomplete data, maximum likelihood estimation with censored and binary observations, and instrumental variable estimators. In addition an extensive simulation ...

  10. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model-exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and software implementation details, as well as the types of problems that can be solved using the DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  11. SOFTWARE DEVELOPMENT MODEL FOR ETHNOBILINGUAL DICTIONARIES

    Directory of Open Access Journals (Sweden)

    Melchora Morales-Sánchez

    2010-09-01

    An integral software development model for a dictionary that stores and retrieves textual and visual material and, most importantly, incorporates audio of the oral language, taking into account both the characterization of the indigenous cultural reality and the technical aspects of software construction. The model consists of the following phases: context description, lexicographic design, computer and multimedia design, and construction and testing of the application. There is no doubt about the influence of the contact of the Spanish language with the variety of languages spoken throughout Latin America, producing the most diverse and extensive communications. As a result, communities are interested in preserving their native language so that people identify with their own roots and transmit this legacy to the next generations. The model is designed to develop dictionary software under constraints that are common in the indigenous reality: a low budget, operation on computers with limited resources, and human resources with minimal training. It is exemplified with the development of a dictionary of Spanish and the Chatino spoken in the town of Santos Reyes Nopala, Oaxaca, in the coastal region of Mexico.

  12. 'nparACT' package for R: A free software tool for the non-parametric analysis of actigraphy data.

    Science.gov (United States)

    Blume, Christine; Santhi, Nayantara; Schabus, Manuel

    2016-01-01

    For many studies, participants' sleep-wake patterns are monitored and recorded prior to, during and following an experimental or clinical intervention using actigraphy, i.e. the recording of data generated by movements. Often, these data are merely inspected visually without computation of descriptive parameters, in part due to the lack of user-friendly software. To address this deficit, we developed a package for R [6] that allows computing several non-parametric measures from actigraphy data. Specifically, it computes the interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) of activity, and gives the start times and average activity values of M10 (i.e. the ten hours with maximal activity) and L5 (i.e. the five hours with least activity). Two functions compute these 'classical' parameters and handle either single or multiple files. Two other functions additionally allow computing an L-value (i.e. the least activity value) for a user-defined time span, termed the 'Lflex' value. A plotting option is included in all functions. The package can be downloaded from the Comprehensive R Archive Network (CRAN). • The package 'nparACT' for R serves the non-parametric analysis of actigraphy data. • Computed parameters include interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA), as well as start times and average activity during the 10 h with maximal and the 5 h with minimal activity (i.e. M10 and L5).
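    IS and IV have standard non-parametric definitions: IS compares the variance of the average 24-hour profile with the overall variance, and IV relates the mean squared hour-to-hour difference to the overall variance. The sketch below implements those textbook formulas in Python for illustration; it is not nparACT's R code, and the input pattern is invented.

    ```python
    # Interdaily stability (IS) and intradaily variability (IV) from hourly
    # activity counts, using the standard non-parametric actigraphy formulas
    # (an independent sketch, not the nparACT implementation).

    def is_iv(hourly, period=24):
        """hourly: activity counts sampled once per hour over whole days."""
        n = len(hourly)
        mean = sum(hourly) / n
        var = sum((x - mean) ** 2 for x in hourly)   # N * overall variance
        # IS: variance of the average 24-h profile relative to overall variance.
        profile = [0.0] * period
        for i, x in enumerate(hourly):
            profile[i % period] += x
        days = n // period
        profile = [s / days for s in profile]
        is_ = (n * sum((m - mean) ** 2 for m in profile)) / (period * var)
        # IV: mean squared successive difference relative to overall variance.
        ssd = sum((hourly[i] - hourly[i - 1]) ** 2 for i in range(1, n))
        iv = (n * ssd) / ((n - 1) * var)
        return is_, iv

    # A perfectly repeating daily pattern gives IS = 1.
    pattern = [0] * 8 + [10] * 16                    # 8 h rest, 16 h active
    is_val, iv_val = is_iv(pattern * 3)              # three identical days
    ```

    IS near 1 indicates a stable day-to-day rhythm, while higher IV values flag fragmented rest-activity patterns.
    
    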

  13. Applying reliability models to the maintenance of Space Shuttle software

    Science.gov (United States)

    Schneidewind, Norman F.

    1992-01-01

    Software reliability models provide the software manager with a powerful tool for predicting, controlling, and assessing the reliability of software during maintenance. We show how a reliability model can be effectively employed for reliability prediction and the development of maintenance strategies using the Space Shuttle Primary Avionics Software Subsystem as an example.

  14. PmagPy: Software Package for Paleomagnetic Data Analysis and Gateway to the Magnetics Information Consortium (MagIC) Database

    Science.gov (United States)

    Jonestrask, L.; Tauxe, L.; Shaar, R.; Jarboe, N.; Minnett, R.; Koppers, A. A. P.

    2014-12-01

    There are many data types and methods of analysis in rock and paleomagnetic investigations. The MagIC database (http://earthref.org/MAGIC) was designed to accommodate the vast majority of data used in such investigations. Yet getting data from the laboratory into the database, and visualizing and re-analyzing data downloaded from the database, makes special demands on data formatting. There are several recently published programming packages that deal with single types of data: demagnetization experiments (e.g., Lurcock et al., 2012), paleointensity experiments (e.g., Leonhardt et al., 2004), and FORC diagrams (e.g., Harrison et al., 2008). However, there is a need for a unified set of open source, cross-platform software that deals with the great variety of data types in a consistent way and facilitates importing data into the MagIC format, analyzing them and uploading them into the MagIC database. The PmagPy software package (http://earthref.org/PmagPy/cookbook/) comprises such a comprehensive set of tools. It facilitates conversion of many laboratory formats into the common MagIC format and allows interpretation of demagnetization and Thellier-type experimental data. With some 175 programs and over 250 functions, it can be used to create a wide variety of plots and allows manipulation of downloaded data sets as well as preparation of new contributions for uploading to the MagIC database.

  15. The R Package bgmm : Mixture Modeling with Uncertain Knowledge

    Directory of Open Access Journals (Sweden)

    Przemys law Biecek

    2012-04-01

    Classical supervised learning enjoys the luxury of accessing the true known labels for the observations in a modeled dataset. Real life, however, poses an abundance of problems where the labels are only partially defined, i.e., are uncertain and given only for a subset of observations. Such partial labels can occur regardless of the knowledge source. For example, an experimental assessment of labels may have limited capacity and is prone to measurement errors. Expert knowledge, likewise, is often restricted to a specialized area and is thus unlikely to provide trustworthy labels for all observations in the dataset. Partially supervised mixture modeling is able to process such sparse and imprecise input. Here, we present an R package called bgmm, which implements two partially supervised mixture modeling methods: soft-label and belief-based modeling. For completeness, we equipped the package also with the functionality of unsupervised, semi- and fully supervised mixture modeling. On real data we present the usage of bgmm for basic model fitting in all modeling variants. The package can also be applied to selection of the best-fitting model from a set of models with different component numbers or constraints on their structures. This functionality is presented on an artificial dataset, which can be simulated in bgmm from a distribution defined by a given model.

  16. Particle Data Management Software for 3D Particle Tracking Velocimetry and Related Applications – The Flowtracks Package

    Directory of Open Access Journals (Sweden)

    Yosef Meller

    2016-06-01

    The Particle Tracking Velocimetry (PTV) community employs several formats of particle information, such as position and velocity as a function of time (i.e. trajectory data), as a result of diverging needs unmet by existing formats, and a number of different, mostly home-grown, codes for handling the data. Flowtracks is a Python package that provides a single code base for accessing different formats as a database, i.e. storing data and programmatically manipulating them using format-agnostic data structures. Furthermore, it offers an HDF5-based format that is fast and extensible, obviating the need for other formats. The package may be obtained from https://github.com/OpenPTV/postptv and used as-is by many fluid-dynamics labs, or, with minor extensions adhering to a common interface, by researchers from other fields, such as biology and population tracking.

  17. An analysis of distribution transformer failure using the statistical package for the social sciences (SPSS) software

    Directory of Open Access Journals (Sweden)

    María Gabriela Mago Ramos

    2012-08-01

    A methodology was developed for analysing faults in distribution transformers using the statistical package for the social sciences (SPSS). It consisted of organising and creating a database of failed equipment, incorporating the data into the processing programme and converting all the information into numerical variables to be processed, thereby obtaining descriptive statistics and enabling factor and discriminant analysis. The research was based on information provided by companies in areas served by Corpoelec (Valencia, Venezuela) and Codensa (Bogotá, Colombia).

  18. Saphire models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1997-02-01

    The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.
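    Cutset generation of the kind retained by SAPHIRE can be illustrated with a toy MOCUS-style top-down expansion. The fault tree, gate names and basic events below are invented for illustration and are unrelated to the plant-specific ASP models.

    ```python
    # Minimal cut sets for a small fault tree via top-down (MOCUS-style)
    # expansion: OR gates union their inputs' cut sets, AND gates take the
    # cross-product, and non-minimal sets are absorbed.

    # Gate definitions: name -> (type, inputs); names not in the dict are
    # basic events. This tree is invented for the example.
    gates = {
        "TOP": ("OR",  ["G1", "E3"]),
        "G1":  ("AND", ["E1", "G2"]),
        "G2":  ("OR",  ["E2", "E3"]),
    }

    def cut_sets(node):
        """Return the minimal cut sets (as frozensets) for a node."""
        if node not in gates:                   # basic event
            return [frozenset([node])]
        kind, inputs = gates[node]
        if kind == "OR":                        # union of the inputs' cut sets
            sets = [cs for child in inputs for cs in cut_sets(child)]
        else:                                   # AND: cross-product of cut sets
            sets = [frozenset()]
            for child in inputs:
                sets = [a | b for a in sets for b in cut_sets(child)]
        # Absorption: drop any set that strictly contains another.
        return [s for s in sets if not any(t < s for t in sets)]

    top = cut_sets("TOP")
    # Minimal cut sets here: {E3} and {E1, E2}; {E1, E3} is absorbed by {E3}.
    ```

    Retaining all cutsets, as the abstract describes, is what allows logic to be modified and results requantified later without rebuilding the models.
    
    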

  19. BRIDGE: A Model for Modern Software Development Process to Cater the Present Software Crisis

    CERN Document Server

    Mandal, Ardhendu

    2011-01-01

    As hardware components become cheaper and more powerful day by day, the services expected from modern software are growing enormously. Developing such software has become extremely challenging: not only the complexity, but also delivering such software within time and budget constraints, has become the real challenge, with quality concerns and maintainability adding further flavour to it. Meanwhile, the requirements of clients change so frequently that managing these changes has become extremely tough, and more often than not clients are unhappy with the end product. Large, complex software projects are notoriously late to market, often exhibit quality problems, and don't always deliver on promised functionality. None of the existing models is adequate to address the present software crisis; hence, a better modern software development process model is badly needed. This paper suggests a new software development process model, BRIDGE, to tackle pr...

  20. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation...

  1. Performance of Different Analytical Software Packages in Quantification of DNA Methylation by Pyrosequencing

    Science.gov (United States)

    Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna

    2016-01-01

    Background Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software which allow analysis in AQ or CpG mode, respectively, both widely employed for DNA methylation analysis. Objective The aim of the study was to assess the performance for DNA methylation analysis at CpG sites of the two pyrosequencing software which allow analysis in AQ or CpG mode, respectively. Despite CpG mode having been specifically generated for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two software for this type of analysis is not available, the focus of this paper was to evaluate if the two modes currently used for CpG methylation assessment by pyrosequencing may give overlapping results. Methods We compared the performance of the two software in quantifying DNA methylation in the promoter of selected genes (GSTP1, MGMT, LINE-1) by testing two case series which include DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. Results We found discrepancy in the two pyrosequencing software-based quality assignment of DNA methylation assays. Compared to the software for analysis in the AQ mode, less permissive criteria are supported by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns the operators about potential unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. Conclusion The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.
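    The quantity both software modes estimate at each CpG site is conventionally the ratio of C to C+T signal after bisulfite conversion (unmethylated cytosine reads as thymine). The sketch below shows that generic textbook computation with hypothetical peak heights; it is not the vendor's AQ- or CpG-mode implementation.

    ```python
    # Generic per-CpG methylation estimate from pyrogram peak heights.
    # After bisulfite conversion, unmethylated C is read as T, so
    # % methylation ~ C / (C + T). Peak values below are invented.

    def percent_methylation(peak_c, peak_t):
        """Estimate % methylation at one CpG from C and T peak heights."""
        total = peak_c + peak_t
        if total == 0:
            raise ValueError("no signal at this position")
        return 100.0 * peak_c / total

    sites = [(30.0, 70.0), (55.0, 45.0), (12.0, 28.0)]      # hypothetical peaks
    profile = [percent_methylation(c, t) for c, t in sites]  # per-CpG %
    ```

    The difference between AQ and CpG mode discussed above lies not in this ratio but in the quality checks (e.g. bisulfite-conversion controls) applied before a value like this is accepted.
    
    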

  2. Performance of Different Analytical Software Packages in Quantification of DNA Methylation by Pyrosequencing.

    Directory of Open Access Journals (Sweden)

    Chiara Grasso

    Full Text Available Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications that aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software, which allow analysis in AQ or CpG mode, respectively; both are widely employed for DNA methylation analysis. The aim of the study was to assess the performance for DNA methylation analysis at CpG sites of the two pyrosequencing software packages, which allow analysis in AQ or CpG mode, respectively. Although CpG mode was specifically designed for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two packages for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two software packages in quantifying DNA methylation in the promoters of selected genes (GSTP1, MGMT, LINE-1) by testing two case series comprising DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies in the quality assignment of DNA methylation assays between the two pyrosequencing software packages. Compared to the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns the operators about potentially unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.
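
    Both modes ultimately quantify methylation at each CpG site from the relative intensities of the incorporation peaks for the methylated (C) and unmethylated (T) alleles of bisulfite-converted DNA. As a rough illustration of that basic quantity (not the vendor's algorithms, and with invented peak heights), a minimal sketch:

```python
# Hedged sketch: per-CpG methylation from pyrosequencing peak heights.
# In bisulfite-converted DNA, methylated cytosines read as C and
# unmethylated ones as T, so % methylation at a CpG is C / (C + T) * 100.
# The peak heights below are illustrative numbers, not real instrument output.

def percent_methylation(c_peak: float, t_peak: float) -> float:
    """Percent methylation at one CpG from C and T peak intensities."""
    total = c_peak + t_peak
    if total == 0:
        raise ValueError("no signal at this CpG position")
    return 100.0 * c_peak / total

# Three CpG sites in one assay, as (C peak, T peak) pairs:
sites = [(72.0, 28.0), (55.0, 45.0), (90.0, 10.0)]
per_site = [percent_methylation(c, t) for c, t in sites]
mean_methylation = sum(per_site) / len(per_site)
print(per_site)          # [72.0, 55.0, 90.0]
print(mean_methylation)  # 72.33...
```

    The quality criteria that differ between the two modes operate on top of this basic ratio, which is why the two packages can return the same percentages while disagreeing on whether the assay passed.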

  3. DYNAMIC SOFTWARE AVAILABILITY MODEL WITH REJUVENATION

    National Research Council Canada - National Science Library

    Dohi, Tadashi; Okamura, Hiroyuki

    2016-01-01

    In this paper we consider an operational software system with multi-stage degradation levels due to software aging, and derive the optimal dynamic software rejuvenation policy maximizing the steady...

  4. Counting radon tracks in Makrofol detectors with the 'image reduction and analysis facility' (IRAF) software package

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, F. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain)]. E-mail: fimerall@ull.es; Gonzalez-Manrique, S. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Karlsson, L. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Hernandez-Armas, J. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Aparicio, A. [Instituto de Astrofisica de Canarias, 38200 La Laguna, Tenerife (Spain); Departamento de Astrofisica, Universidad de La Laguna. Avenida. Astrofisico Francisco Sanchez s/n, 38071 La Laguna, Tenerife (Spain)

    2007-03-15

    Makrofol detectors are commonly used for long-term radon (²²²Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires counting the nuclear tracks produced by alpha particles on the detecting material. The 'image reduction and analysis facility' (IRAF) software package is commonly used in astronomical applications; it allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity between counting stars in a dark sky and counting tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. The proposed semi-automatic method and its performance, in comparison to ocular counting, are described in detail here. In addition, the calibration factor for this procedure, 2.97 ± 0.07 kBq m⁻³ h track⁻¹ cm², has been calculated based on the results obtained from exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m⁻³ and, in two of them, above 400 Bq m⁻³. Further studies should be performed at those schools following the European Union recommendations about radon concentrations in
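
    Using the calibration factor reported above, converting a counted track density into a mean radon concentration is a one-line computation. The exposure time and track count below are invented for illustration:

```python
# Hedged sketch: track density -> radon concentration, using the
# calibration factor reported in the abstract,
# 2.97 kBq·m⁻³·h per (track·cm⁻²). Exposure and count are illustrative.

CAL_FACTOR_BQ = 2.97e3  # Bq·m⁻³·h per track·cm⁻²

def radon_concentration(tracks_per_cm2: float, exposure_hours: float) -> float:
    """Mean radon concentration (Bq/m³) over the exposure period."""
    return CAL_FACTOR_BQ * tracks_per_cm2 / exposure_hours

# A detector exposed for ~90 days (2160 h) showing 100 tracks/cm²:
conc = radon_concentration(100.0, 2160.0)
print(conc)  # 137.5 Bq/m³ -- below the 200 Bq/m³ action level mentioned above
```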

  5. MATCH - A Software Package for Robust Profile Matching Using S-Plus

    Directory of Open Access Journals (Sweden)

    Douglas P. Wiens

    2004-04-01

    Full Text Available This manual details the implementation of the profile matching techniques introduced in Robust Estimation of Air-Borne Particulate Matter (Wiens, Florence and Hiltz, Environmetrics, 2001), included as an appendix. The program consists of a collection of functions written in S. It runs in S-Plus, including the student version. A graphical user interface is supplied for easy use by a user with only a passing familiarity with S-Plus. A description of the software is given, together with an extensive example of an analysis of a data set using the software. The software is available at http://www.stat.ualberta.ca/~wiens/publist.htm where it is linked to the listing for Wiens, Florence and Hiltz (2001).

  6. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO)...

  7. A Common Platform for Graphical Models in R: The gRbase Package

    Directory of Open Access Journals (Sweden)

    Claus Dethlefsen

    2005-12-01

    Full Text Available The gRbase package is intended to set the framework for computer packages for data analysis using graphical models. The gRbase package is developed for the open source language R and is available for several platforms. The package is intended to be widely extendible and flexible so that package developers may implement further types of graphical models using the available methods. The gRbase package consists of a set of S version 3 classes and associated methods for representing data and models. The package is linked to the dynamicGraph package (Badsberg 2005), an interactive graphical user interface for manipulating graphs. In this paper, we show how these building blocks can be combined and integrated with inference engines in the special case of hierarchical loglinear models. We also illustrate how to extend the package to deal with other types of graphical models, in this case the graphical Gaussian models.

  8. Next Generation Lightweight Mirror Modeling Software

    Science.gov (United States)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep-core low-temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper introduces model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software deals with any current mirror manufacturing technique and single substrates or multiple arrays of substrates, and it can merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS, which makes integration of these models into large telescope or satellite models easier.

  10. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing software

  11. Novel Software Reliability Estimation Model for Altering Paradigms of Software Engineering

    Directory of Open Access Journals (Sweden)

    Ritika Wason

    2012-05-01

    Full Text Available A number of different software engineering paradigms, like Component-Based Software Engineering (CBSE), Autonomic Computing, Service-Oriented Computing (SOC) and Fault-Tolerant Computing, are currently being researched. These paradigms denote a shift from the currently mainstream object-oriented paradigm and are altering the way we view, design, develop and exercise software. Though these paradigms indicate a major shift in the way we design and code software, we still rely on traditional reliability models for estimating the reliability of any of the above systems. This paper analyzes the underlying characteristics of these paradigms and proposes a novel Finite Automata Based Reliability model as a suitable model for estimating the reliability of modern, complex, distributed and critical software applications. We further outline the basic framework for an intelligent, automata-based reliability model that can be used for accurate estimation of the system reliability of software systems at any point in the software life cycle.

  12. Automated Environment Generation for Software Model Checking

    Science.gov (United States)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  13. A Framework for Modelling Software Requirements

    Directory of Open Access Journals (Sweden)

    Dhirendra Pandey

    2011-05-01

    Full Text Available Requirement engineering plays an important role in producing quality software products. In recent years, several requirement frameworks have been designed to provide an end-to-end solution for the system development life cycle. Textual requirements specifications are difficult to learn, design, understand, review, and maintain, whereas pictorial modelling is widely recognized as an effective requirement analysis tool. In this paper, we present a requirement modelling framework together with an analysis of modern requirements modelling techniques. We also discuss various domains of requirement engineering with the help of modelling elements such as semantic maps of business concepts, lifecycles of business objects, business processes, business rules, system context diagrams, use cases and their scenarios, constraints, and user interface prototypes. The proposed framework is illustrated with the case study of an inventory management system.

  14. Program SPACECAP: software for estimating animal density using spatially explicit capture-recapture models

    Science.gov (United States)

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas

    2012-01-01

    1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this need, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.
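
    Spatially explicit capture-recapture models typically link detection probability to the distance between a trap and an animal's activity centre, often through a half-normal function. The following sketch shows only that model form, with invented parameter values; it is not SPACECAP's Bayesian implementation:

```python
import math

# Hedged sketch of the half-normal detection function commonly used in
# spatially explicit capture-recapture (SCR) models: the chance that a
# trap detects an animal decays with distance from its activity centre.
# p0 (baseline detection) and sigma (spatial scale) are illustrative.

def detection_prob(distance_m: float, p0: float, sigma_m: float) -> float:
    """P(detection) at a trap `distance_m` from the activity centre."""
    return p0 * math.exp(-distance_m ** 2 / (2.0 * sigma_m ** 2))

# Baseline detection 0.3 at the centre, spatial scale sigma = 500 m:
for d in (0.0, 500.0, 1500.0):
    print(d, round(detection_prob(d, p0=0.3, sigma_m=500.0), 4))
```

    Fitting an SCR model amounts to estimating p0, sigma and animal density from the spatial pattern of captures, which is the inference SPACECAP wraps in a graphical interface.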

  15. THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE

    Science.gov (United States)

    The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...

  16. Area of ischemia assessed by physicians and software packages from myocardial perfusion scintigrams

    DEFF Research Database (Denmark)

    Edenbrandt, L.; Hoglund, P.; Frantz, S.

    2014-01-01

    Background: The European Society of Cardiology recommends that patients with > 10% area of ischemia should receive revascularization. We investigated inter-observer variability for the extent of ischemic defects reported by different physicians and by different software tools, and if inter-observ...

  17. Educational Administrative Software Packages: Alternatives to In-House Developed Systems.

    Science.gov (United States)

    Harris, Edward V.

    1985-01-01

    Historically, educational institutions have largely relied on in-house development of administrative software. However, the costs of skilled programmers and rapidly advancing technology are making in-house development too expensive. These and other factors are addressed and changes needed for future educational administrative computing support are…

  18. Development of PowerMap: a software package for statistical power calculation in neuroimaging studies.

    Science.gov (United States)

    Joyce, Karen E; Hayasaka, Satoru

    2012-10-01

    Although there are a number of statistical software tools for voxel-based massively univariate analysis of neuroimaging data, such as fMRI (functional MRI), PET (positron emission tomography), and VBM (voxel-based morphometry), very few software tools exist for power and sample size calculation for neuroimaging studies. Unlike typical biomedical studies, outcomes from neuroimaging studies are 3D images of correlated voxels, requiring a correction for massive multiple comparisons. Thus, a specialized power calculation tool is needed for planning neuroimaging studies. To facilitate this process, we developed a software tool specifically designed for neuroimaging data. The software tool, called PowerMap, implements theoretical power calculation algorithms based on non-central random field theory. It can also calculate power for statistical analyses with FDR (false discovery rate) corrections. This GUI (graphical user interface)-based tool enables neuroimaging researchers without advanced knowledge in imaging statistics to calculate power and sample size in the form of 3D images. In this paper, we provide an overview of the statistical framework behind the PowerMap tool. Three worked examples are also provided: a regression analysis, an ANOVA (analysis of variance), and a two-sample T-test, in order to demonstrate the study planning process with PowerMap. We envision that PowerMap will be a great aid for future neuroimaging research.
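
    PowerMap's random-field-based corrections sit on top of the standard single-test power calculation. A minimal sketch of that building block, using the common normal approximation for a two-sided two-sample test (illustrative only; this is not PowerMap's voxel-wise algorithm):

```python
from statistics import NormalDist

# Hedged sketch: approximate power of a two-sided two-sample test via
# the normal approximation. PowerMap itself works with non-central
# random field theory and corrects for multiple comparisons across
# voxels; this shows only the per-test building block.

def two_sample_power(effect_size: float, n_per_group: int,
                     alpha: float = 0.05) -> float:
    """Approximate power for a two-sided two-sample test (Cohen's d)."""
    z = NormalDist()
    z_crit = z.inv_cdf(1.0 - alpha / 2.0)
    noncentrality = effect_size * (n_per_group / 2.0) ** 0.5
    return 1.0 - z.cdf(z_crit - noncentrality)

# A "large" effect (d = 0.8) with 25 subjects per group:
print(round(two_sample_power(0.8, 25), 3))
```

    Increasing the per-group sample size raises the power, which is exactly the trade-off a planning tool like PowerMap lets researchers explore voxel by voxel.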

  19. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably.

    NARCIS (Netherlands)

    Ashraf, H.; Hoop, B. de; Shaker, S.B.; Dirksen, A.; Bach, K.S.; Hansen, H.; Prokop, M.; Pedersen, J.H.

    2010-01-01

    OBJECTIVE: We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. METHODS: In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were

  1. Flexible adaptive paradigms for fMRI using a novel software package 'Brain Analysis in Real-Time' (BART).

    Directory of Open Access Journals (Sweden)

    Lydia Hellrung

    Full Text Available In this work we present a new open source software package offering a unified framework for the real-time adaptation of fMRI stimulation procedures. The software provides a straightforward setup and a highly flexible approach to adapting fMRI paradigms while the experiment is running. The general framework comprises the inclusion of parameters reflecting the subject's compliance, such as directing gaze to visually presented stimuli, and physiological fluctuations, like blood pressure or pulse. Additionally, this approach makes it possible to investigate complex scientific questions, for example the influence of EEG rhythms or of the fMRI signal results themselves. To prove the concept of this approach, we used our software in a usability example for an fMRI experiment where the presentation of emotional pictures was dependent on the subject's gaze position, which can have a significant impact on the results. So far, if this is taken into account during fMRI data analysis, it is commonly done by the post-hoc removal of erroneous trials. Here, we propose an a priori adaptation of the paradigm during the experiment's runtime. Our fMRI findings clearly show the benefits of an adapted paradigm in terms of statistical power and higher effect sizes in emotion-related brain regions. This can be of special interest for all experiments with low statistical power due to a limited number of subjects, a limited amount of time, costs or available data to analyze, as is the case with real-time fMRI.

  2. An Assessment between Software Development Life Cycle Models of Software Engineering

    Directory of Open Access Journals (Sweden)

    Er. KESHAV VERMA

    2013-03-01

    Full Text Available This research deals with an essential and important subject in the digital world. It concerns the software management processes that examine the role of development models in software development, collectively known as the software development life cycle. It presents five of the development models, namely waterfall, iteration, V-shaped, spiral and extreme programming. These models have both advantages and disadvantages. The main objective of this research is therefore to present the different models of software development and to compare them, illustrating the features and defects of every model.

  3. Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components

    Science.gov (United States)

    Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio

    1997-01-01

    In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.

  4. airGRteaching: an R-package designed for teaching hydrology with lumped hydrological models

    Science.gov (United States)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Andréassian, Vazken; Brigode, Pierre

    2017-04-01

    discharges, which are updated immediately (a calibration only needs a couple of seconds or less, and a simulation is almost immediate). In addition, time series of internal variables, live visualisation of the evolution of internal variables, and performance statistics are provided. This interface allows for hands-on exercises that can include, for instance, the analysis by students of: - the effects of each parameter and model component on simulated discharge - the effects of objective functions based on high-flow- or low-flow-focused criteria on simulated discharge - the seasonality of the model components. References: Winston Chang, Joe Cheng, JJ Allaire, Yihui Xie and Jonathan McPherson (2016). shiny: Web Application Framework for R. R package version 0.13.2. https://CRAN.R-project.org/package=shiny Coron L., Thirel G., Perrin C., Delaigue O., Andréassian V., airGR: a suite of lumped hydrological models in an R-package, Environmental Modelling and Software, 2017, submitted. Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en. Olivier Delaigue and Laurent Coron (2016). airGRteaching: Tools to simplify the use of the airGR hydrological package by students. R package version 0.0.1. https://webgr.irstea.fr/airGR/?lang=en R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.

  5. The Decoding Toolbox (TDT): A versatile software package for multivariate analyses of functional imaging data

    Directory of Open Access Journals (Sweden)

    Martin Nikolai Hebart

    2015-01-01

    Full Text Available The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT), which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns.
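
    The cross-validation schemes that TDT automates follow the usual leave-one-run-out logic: train a classifier on all runs but one, test on the held-out run, and rotate. A toy sketch of that scheme, with a nearest-class-mean classifier standing in for TDT's machine-learning back ends (the "voxel patterns" below are invented):

```python
# Hedged sketch: leave-one-run-out cross-validated decoding with a
# minimal nearest-class-mean classifier. Not TDT's implementation;
# the 3-voxel patterns and condition labels are toy data.

def class_means(samples):
    """Mean pattern per class from (label, pattern) pairs."""
    sums, counts = {}, {}
    for label, pattern in samples:
        acc = sums.setdefault(label, [0.0] * len(pattern))
        for i, v in enumerate(pattern):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def predict(means, pattern):
    """Assign the class whose mean pattern is nearest (squared Euclidean)."""
    def dist(m):
        return sum((a - b) ** 2 for a, b in zip(m, pattern))
    return min(means, key=lambda lab: dist(means[lab]))

# Two runs, two conditions ("face", "house"), 3-voxel patterns:
runs = [
    [("face", [1.0, 0.2, 0.1]), ("house", [0.1, 0.9, 1.1])],
    [("face", [0.9, 0.3, 0.2]), ("house", [0.2, 1.0, 0.9])],
]

correct = total = 0
for test_idx, test_run in enumerate(runs):      # leave one run out
    train = [s for i, r in enumerate(runs) if i != test_idx for s in r]
    means = class_means(train)
    for label, pattern in test_run:
        correct += predict(means, pattern) == label
        total += 1
print(correct / total)  # cross-validated decoding accuracy on this toy set
```

    Searchlight analysis repeats exactly this loop for a small sphere of voxels centred on each brain location in turn.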

  6. Extracting Three Dimensional Surface Model of Human Kidney from the Visible Human Data Set using Free Software

    CERN Document Server

    P, Kirana Kumara

    2013-01-01

    A three dimensional digital model of a representative human kidney is needed for a surgical simulator capable of simulating laparoscopic surgery involving the kidney. Buying a three dimensional computer model of a representative human kidney, or reconstructing a human kidney from an image sequence using commercial software, both involve (sometimes significant amounts of) money. In this paper, the author has shown that one can obtain a three dimensional surface model of a human kidney by making use of images from the Visible Human Data Set and a few free software packages (ImageJ, ITK-SNAP, and MeshLab in particular). Images from the Visible Human Data Set, and the software packages used here, cost nothing. Hence, the practice of extracting the geometry of a representative human kidney for free, as illustrated in the present work, could be an alternative to the use of expensive commercial software or to the purchase of a digital model.

  7. Model-driven evolution of software architectures

    OpenAIRE

    Graaf, B.S.

    2007-01-01

    Software evolves continuously. As a consequence, software systems tend to become increasingly complex and, as such, more difficult to change. A software system's complexity is for a large part determined by its structure, or architecture. In this thesis we investigate how to reduce the risks and costs associated with the evolution of software architectures. Automation and abstraction are two basic software engineering techniques to deal with complexity. In this thesis we investigate the appli...

  8. Learning Software Component Model for Online Tutoring

    Directory of Open Access Journals (Sweden)

    K. Duraiswamy

    2012-01-01

    Full Text Available Problem statement: Web services are interface elements which allow applications to render functional services to requesting clients using open standard protocols. A lecture method that combines both social association and urban processing in course design and delivery is termed interface learning, and many interface learning services are offered online. To make an online tutoring scheme more effective, a previous study used web services and application programs, such as instant messaging, based on the environments in which students reside. The downside is that it is difficult to maintain the service request queues online, and the services and data storage processes are inefficient. Approach: To overcome these issues, a Learning Software Component Model (LSCM) framework is proposed in the present study to build a component model based on the communication services available on the network. In addition, the proposed software component, modeled with Learning Object (LO) aspects, integrates the related sub-hierarchical components with the main component object framework. Based on LSCM, training schedules are identified efficiently. Results: The proposed LSCM framework is evaluated experimentally to show the performance improvement over the previous online tutoring scheme based on web services in terms of delivery report, maintenance of tutoring sessions and reliability. Conclusion: Compared to existing online tutoring through web services, the proposed LSCM framework performs 75% better in providing learning services to the providers.

  9. PLUTO - a software package using the 'maximum likelihood method' to fit plutonium in urine data to an excretion function

    Energy Technology Data Exchange (ETDEWEB)

    Riddell, A.E.; Britcher, A.R. (British Nuclear Fuels plc, Sellafield (United Kingdom))

    1994-01-01

    The PLUTO software package was developed at Sellafield to make optimum use of the analysis data from plutonium in urine samples in arriving at the best estimate of intake/uptake. The program prompts the assessor to enter the assessment parameters required to fit the data to the excretion function using the maximum likelihood method. A critical appraisal is given of the relative strengths and weaknesses of this assessment package. (author).
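
The maximum-likelihood fitting idea described above can be sketched briefly. Under the common assumption of lognormally distributed measurement errors, the intake estimate that best fits urine data to an excretion function is the geometric mean of the per-sample intake estimates. The excretion fractions and measurements below are purely illustrative values, not PLUTO's actual model.

```python
import math

def ml_intake(measurements, excretion_fraction):
    """Best-estimate intake under lognormally distributed measurement
    errors: the log-likelihood is maximised by the geometric mean of the
    per-sample intake estimates m_i / e(t_i)."""
    logs = [math.log(m / e) for m, e in zip(measurements, excretion_fraction)]
    return math.exp(sum(logs) / len(logs))

# Hypothetical data: urine activity (Bq/day) and the modelled fraction of
# a unit intake excreted on each sampling day (values are illustrative).
m = [0.8, 0.5, 0.3]
e = [8e-4, 5e-4, 3e-4]
print(round(ml_intake(m, e)))  # 1000
```

Here every sample happens to imply the same intake, so the geometric mean reproduces it exactly; with real, scattered data the estimate balances the samples on the log scale.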

  10. SedWorks: A 3-D visualisation software package to help students link surface processes with depositional product

    Science.gov (United States)

    Jones, M. A.; Edwards, A.; Boulton, P.

    2010-12-01

    Helping students to develop a cognitive and intuitive feel for the different temporal and spatial scales of processes through which the rock record is assembled is a primary goal of geoscience teaching. SedWorks is a 3-D virtual geoscience world that integrates both quantitative modelling and field-based studies into one interactive package. The program aims to help students acquire scientific content, cultivate critical thinking skills, and hone their problem solving ability, while also providing them with the opportunity to practice the activities undertaken by professional earth scientists. SedWorks is built upon a game development platform used for constructing interactive 3-D applications. Initially the software has been developed for teaching the sedimentology component of a Geoscience degree and consists of a series of continents or land masses each possessing sedimentary environments which the students visit on virtual field trips. The students are able to interact with the software to collect virtual field data from both the modern environment and the stratigraphic record, and to formulate hypotheses based on their observations which they can test through virtual physical experimentation within the program. The program is modular in design in order to enhance its adaptability and to allow scientific content to be updated so that the knowledge and skills acquired are at the cutting edge. We will present an example module in which students undertake a virtual field study of a 2-km long stretch of a river to observe how sediment is transported and deposited. On entering the field area students are able to observe different bedforms in different parts of the river as they move up- and down-stream, as well as in and out of the river. As they explore, students discover ‘hot spots’ at which particular tools become available to them. This includes tools for measuring the physical parameters of the flow and sediment bed (e.g. velocity, depth, grain size, bed

  11. Selection of software for mechanical engineering undergraduates

    Energy Technology Data Exchange (ETDEWEB)

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S., E-mail: ablicblau@swin.edu.au [Swinburne University of Technology, Faculty of Science Engineering and Technology, PO Box 218 Hawthorn, Victoria, Australia, 3122 (Australia)

    2016-07-12

    A major problem with undergraduate mechanical engineering courses is students' limited exposure to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as suitable for undergraduate work in mechanical engineering, e.g. solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  12. Selection of software for mechanical engineering undergraduates

    Science.gov (United States)

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S.

    2016-07-01

    A major problem with undergraduate mechanical engineering courses is students' limited exposure to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as suitable for undergraduate work in mechanical engineering, e.g. solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  13. Perfusion CT measurements in healthy cervical spinal cord: feasibility and repeatability of the study as well as interchangeability of the perfusion estimates using two commercially available software packages

    Energy Technology Data Exchange (ETDEWEB)

    Bisdas, Sotirios [Johann Wolfgang University Hospital, Department of Radiology, Frankfurt (Germany); Medical University of South Carolina, Department of Radiology, Charleston, SC (United States); Johann Wolfgang Goethe University Hospital, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany); Rumboldt, Zoran; Deveikis, John; Spampinato, Maria Vittoria [Medical University of South Carolina, Department of Radiology, Charleston, SC (United States); Surlan, Katarina [Clinical Centre Ljubljana, Department of Clinical Radiology, Ljubljana (Slovenia); Koh, Tong San [Nanyang Technological University, School of Electrical and Electronic Engineering, Singapore (Singapore)

    2008-10-15

    Our purpose was to examine the feasibility and reproducibility of perfusion CT studies in the cervical spinal cord and the interchangeability of the values obtained by two post-processing methods. The perfusion CT studies of 40 patients with neck tumours were post-processed using two software packages (Software-1: deconvolution-based analysis with adiabatic tissue homogeneity approach and Software-2: maximum-slope-model with Patlak analysis). Eight patients were examined twice for assessing the reproducibility of the technique. Two neuroradiologists separately post-processed the images with two arterial input functions (AIFs): (1) the internal carotid artery (ICA) and (2) the vertebral artery (VA). Maps of blood flow (F) in ml/min/100 g, blood volume (V) in ml/100 g, mean transit time (MTT) in seconds (s) and permeability (PS) in ml/min/100 g were generated. The mean F, V, MTT and PS (Software-1) with VA-AIF and ICA-AIF were 8.93, 1.12, 16.3, 1.88 and 8.57, 1.19, 16.85 and 1.94, respectively. The reproducibility of the techniques was satisfactory, while the V and MTT values (in Software-1) and the F and V values (in Software-2) were dependent on the site of the AIF (p≥0.03 and p=0.02, respectively). The interobserver agreement was very good. The significant differences in measurements for a single patient (%) using Software-1/Software-2 were ±120%/110%, 90%/80%, 180% and 250%/130% for F, V, MTT and PS, respectively. Only F and PS values in the healthy tissue seemed to be interchangeable. Our results were in essential agreement with those derived by invasive measurements in animals. The cervical spine perfusion CT studies are feasible and reproducible. The present knowledge has to be validated with studies in spinal cord tumours in order to decide the usefulness of the perfusion CT in this field. (orig.)

  14. A Decomposition Software Package for the Decomposition of Long-Term Multi-Channel Electromyographic Signals

    Science.gov (United States)

    2007-11-02

    Luca, "A procedure for decomposing the myoelectric signal into its constituent action potentials," IEEE Trans. Biomed. Eng., vol. BME-29, pp. 149... Abstract: The analysis of intramuscular EMG signals is based on the decomposition of the signals into basic units. Existing decomposition software only supports short registration periods or single-channel recordings of signals of constant muscle effort. In this paper, we present the

  15. A Bisimulation-based Hierarchical Framework for Software Development Models

    Directory of Open Access Journals (Sweden)

    Ping Liang

    2013-08-01

    Full Text Available Software development models have matured since the emergence of software engineering: the waterfall model, V-model, spiral model, etc. To ensure the successful implementation of those models, various metrics for software products and development processes have been developed alongside them, such as CMMI, software metrics, and process re-engineering. The quality of software products and processes can thus be kept as consistent as possible, and the abstract integrity of a software product can be achieved. In reality, however, the maintenance cost of software products remains high, and grows even higher as software evolves, owing to inconsistencies introduced by changes and to inherent errors in the products. It is better to build a robust software product that can sustain as many changes as possible. Therefore, this paper proposes a process-algebra-based hierarchical framework that extracts an abstract equivalent of the deliverable at the end of each phase of a software product from its software development models. The process algebra equivalent of the deliverable is developed hierarchically along with the development of the software product, applying bisimulation to test-run the deliverables of the phases and thereby guarantee the consistency and integrity of the software development and product in a simple mathematical way. An algorithm is also given to carry out the assessment of the phase deliverable in process algebra.

  16. Evaluating Educational Software Authoring Environments Using a Model Based on Software Engineering and Instructional Design Principles.

    Science.gov (United States)

    Collis, Betty A.; Gore, Marilyn

    1987-01-01

    This study suggests a new model for the evaluation of educational software authoring systems and applies this model to a particular authoring system, CSR Trainer 4000. The model used is based on an integrated set of software engineering and instructional design principles. (Author/LRW)

  17. The software package AIRY 7.0: new efficient deconvolution methods for post-adaptive optics data

    Science.gov (United States)

    La Camera, Andrea; Carbillet, Marcel; Prato, Marco; Boccacci, Patrizia; Bertero, Mario

    2016-07-01

    The Software Package AIRY (an acronym for Astronomical Image Restoration in interferometrY) is a complete tool for the simulation and deconvolution of astronomical images. The data can be a post-adaptive-optics image from a single-dish telescope or a set of multiple images from a Fizeau interferometer. Written in IDL and freely downloadable, AIRY is a package of the CAOS Problem-Solving Environment. It is made of different modules, each performing a specific task, e.g. simulation, deconvolution, and analysis of the data. In this paper we present the latest version of AIRY, which contains a new optimized method for the deconvolution problem based on the scaled-gradient projection (SGP) algorithm, extended with different regularization functions. Moreover, a new module based on our multi-component method has been added to AIRY. Finally, we provide a few example projects describing our multi-step method, recently developed for the deblurring of high-dynamic-range images. With AIRY v7.0, users have a powerful tool for simulating observations and for reconstructing their real data.
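
The gradient-projection idea behind SGP-style deconvolution can be illustrated with a much simpler relative: a plain projected-gradient iteration for nonnegative least-squares deconvolution. This is a toy sketch on a tiny 1-D problem, not the SGP algorithm as implemented in AIRY (which adds variable scaling, adaptive step lengths and regularization).

```python
import numpy as np

# Toy 1-D deconvolution: b = A @ x_true, A built from a known PSF.
# A plain projected-gradient loop (fixed step), not the actual SGP code.
psf = np.array([0.2, 0.6, 0.2])
n = 8
A = np.zeros((n, n))
for i in range(n):
    for k, w in zip((-1, 0, 1), psf):
        A[i, (i + k) % n] = w            # circulant blurring matrix
x_true = np.zeros(n); x_true[2] = 1.0; x_true[5] = 0.5
b = A @ x_true                           # "observed" blurred signal

x = np.ones(n) / n                       # positive starting point
step = 0.9 / np.linalg.norm(A.T @ A, 2)  # step below 1/L for convergence
for _ in range(2000):
    grad = A.T @ (A @ x - b)             # gradient of 0.5*||Ax-b||^2
    x = np.maximum(x - step * grad, 0)   # project onto x >= 0

print(np.allclose(x, x_true, atol=1e-3))  # True
```

The projection step is what enforces the physical nonnegativity of the restored image; SGP accelerates exactly this kind of iteration.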

  18. “DETECTION ARTIFACTS” SOFTWARE PACKAGE: FUNCTIONAL CAPABILITIES AND PROSPECTS OF USING (ON THE EXAMPLE OF GEOARCHEOLOGICAL RESEARCH)

    Directory of Open Access Journals (Sweden)

    Ye. P. Krupochkin

    2017-01-01

    Full Text Available Mathematical and scientific methods are highly significant in modern geoarcheological study. They contribute to the development of new computer technologies and to their implementation in geoarcheological research, in particular the decoding and photogrammetric processing of space images. The article focuses on the “Detection Artifacts” software package, designed for thematic aerospace image decoding and aimed at automating the search for various archeological sites, both natural and artificially created. The main attention is paid to the decoding of archeological sites using methods of morphological analysis and indicative decoding. The package's work is based on two groups of computer image-processing methods: (1) image enhancement, carried out with the help of spatial frequency filtration, and (2) morphometric analysis. The methods of spatial frequency filtration can be used to solve two problems: minimizing information noise and enhancing edges. To achieve the best results with spatial frequency filtration it is necessary to have all the information relevant to the objects being searched for. Searching for archeological sites is not only a photogrammetric task; in fact, this problem can be solved in the sphere of photogrammetry with the application of aerospace and computer methods. The authors stress this idea in order to avoid terminological ambiguity and confusion when describing the essence of the methods and processes. It should be noted that the work with the images must be executed in a strict sequence. First and foremost, photogrammetric processing (atmospheric correction, geometric adjustment, conversion and geo-targeting) should be implemented; only after that can one proceed to decoding the information. When creating the software package a modular structure was applied, which benefited the tasks being solved and corresponded to the conception of searching for archaeological objects.
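
The spatial-frequency filtration mentioned in the abstract can be sketched in a few lines: suppressing the low-frequency part of an image's Fourier spectrum leaves the sharp transitions that edge enhancement is after. This is a minimal illustration of the principle, not the package's actual filter.

```python
import numpy as np

# Edge enhancement by spatial-frequency filtering (illustrative only):
# zero out low frequencies of an image's FFT to keep sharp transitions.
img = np.zeros((16, 16))
img[:, 8:] = 1.0                      # a synthetic vertical edge

F = np.fft.fftshift(np.fft.fft2(img))
yy, xx = np.mgrid[-8:8, -8:8]
highpass = np.hypot(xx, yy) > 2       # remove a small low-frequency disc
edges = np.real(np.fft.ifft2(np.fft.ifftshift(F * highpass)))

# After filtering, energy concentrates near the edge column
col_energy = (edges ** 2).sum(axis=0)
print(bool(col_energy[7] > col_energy[3]))  # True
```

Low-pass masks (the complement of `highpass`) serve the other stated task, suppressing information noise.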

  19. SAHM:VisTrails (Software for Assisted Habitat Modeling for VisTrails): training course

    Science.gov (United States)

    Holcombe, Tracy

    2014-01-01

    VisTrails is an open-source management and scientific workflow system designed to integrate the best of both scientific workflow and scientific visualization systems. Developers can extend the functionality of the VisTrails system by creating custom modules for bundled VisTrails packages. The Invasive Species Science Branch of the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) and the U.S. Department of the Interior’s North Central Climate Science Center have teamed up to develop and implement such a module—the Software for Assisted Habitat Modeling (SAHM). SAHM expedites habitat modeling and helps maintain a record of the various input data, the steps before and after processing, and the modeling options incorporated in the construction of an ecological response model. There are four main advantages to using the SAHM:VisTrails combined package for species distribution modeling: (1) formalization and tractable recording of the entire modeling process; (2) easier collaboration through a common modeling framework; (3) a user-friendly graphical interface to manage file input, model runs, and output; and (4) extensibility to incorporate future and additional modeling routines and tools. In order to meet increased interest in the SAHM:VisTrails package, the FORT offers a training course twice a year. The course includes a combination of lecture, hands-on work, and discussion. Please join us and other ecological modelers to learn the capabilities of the SAHM:VisTrails package.

  20. Aspect-Oriented Software Quality Model: The AOSQ Model

    Directory of Open Access Journals (Sweden)

    Pankaj Kumar

    2012-04-01

    Full Text Available Nowadays, software development has become more complex and dynamic; software is expected to be more flexible, scalable and reusable. Under the umbrella of aspects, Aspect-Oriented Software Development (AOSD) is a relatively modern programming paradigm for improving modularity in software development. Aspect-Oriented Programming (AOP) languages implement crosscutting concerns through the introduction of a new construct, the Aspect, which, like a Class, is defined as a modular unit of crosscutting behavior that affects multiple classes, packaged into reusable modules. Several quality models to measure the quality of software are available in the literature. However, continued software development and the acceptance of a new environment (i.e. AOP) give rise to the issue of evolvability. After the evolution of a system, we have to find out: how extensible does the new system need to be? What is its configurable status? Are the design patterns stable under the new environment and technology? How sustainable is the new system? The objective of this paper is to propose a new quality model for AOSD that integrates some new quality attributes into the AOSQUAMO model, which is itself based on the ISO/IEC 9126 quality model; the result is called the Aspect-Oriented Software Quality (AOSQ) model. The Analytic Hierarchy Process (AHP) is used to evaluate the improved hierarchical quality model for AOSD.

  1. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...... as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...

  2. SOFTWARE PACKAGE FOR SOLVING THE PROBLEMS OF ANALYSIS AND SYNTHESIS OF NETWORKED CONTROL SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. E. Emelyanov

    2015-01-01

    Full Text Available Summary: Modern control systems exchange data packets through network channels; such systems are called networked control systems. One promising direction in the development of networked control systems is the use of common computer networks in the control loop for the exchange of information between elements of the system. This construction of control systems leads to new problems, so the design and study of such systems needs to combine methods from different scientific fields, first of all control theory and communication theory. However, not every developer has full knowledge of both areas to the same extent. To solve the engineering problems and ensure the required quality of operation, methods have been developed for the analysis and synthesis of networked control systems with data transmission over a channel with competing access methods. These techniques allow calculating the probability-time characteristics of the stochastic process of data transmission over a channel with competing access methods, building the transients of the considered control systems, calculating their qualitative characteristics, determining the stability conditions of networked control systems, and tuning the parameters of the digital controllers to optimize the respective criterion. These techniques are the basis for the development of the software. The proposed software system allows the analysis and synthesis of the network through which the information data exchange takes place, as well as the study of the networked system under a variety of control laws. The complex is structured according to the principles of modularity, hierarchy and nesting of modules within each other. An easy-to-use interface allows the software to be used without special training.

  3. Two-Dimensional Gel Electrophoresis Image Analysis via Dedicated Software Packages.

    Science.gov (United States)

    Maurer, Martin H

    2016-01-01

    Analyzing two-dimensional gel electrophoretic images is supported by a number of freely and commercially available software packages. Although each program has its own specifics, all of them follow certain standardized algorithms. The general steps are: (1) detecting and separating individual spots, (2) subtracting background, (3) creating a reference gel, (4) matching the spots to the reference gel, (5) modifying the reference gel, (6) normalizing the gel measurements for comparison, (7) calibrating for isoelectric point and molecular weight markers, (8) constructing a database containing the measurement results, and (9) comparing data by statistical and bioinformatic methods.
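
The normalization step (6) can be sketched in a few lines: expressing each spot volume as a fraction of the total spot volume on its gel makes gels with different overall staining intensity comparable. The spot volumes below are hypothetical, and real packages offer several normalization schemes besides this total-volume one.

```python
import numpy as np

# Step (6) normalization, sketched: divide each spot volume by the
# total spot volume of its gel so that staining intensity cancels out.
gel_a = np.array([120.0, 300.0, 80.0])   # hypothetical spot volumes
gel_b = np.array([240.0, 600.0, 160.0])  # same pattern, stained 2x darker

norm_a = gel_a / gel_a.sum()
norm_b = gel_b / gel_b.sum()
print(np.allclose(norm_a, norm_b))  # True
```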

  4. Mixed Frequency Data Sampling Regression Models: The R Package midasr

    Directory of Open Access Journals (Sweden)

    Eric Ghysels

    2016-08-01

    Full Text Available When modeling economic relationships it is increasingly common to encounter data sampled at different frequencies. We introduce the R package midasr, which enables estimating regression models with variables sampled at different frequencies within the MIDAS regression framework put forward by Ghysels, Santa-Clara, and Valkanov (2002). In this article we define a general autoregressive MIDAS regression model with multiple variables of different frequencies and show how it can be specified using the familiar R formula interface and estimated using various optimization methods chosen by the researcher. We discuss how to check the validity of the estimated model both in terms of numerical convergence and statistical adequacy of a chosen regression specification, how to perform model selection based on an information criterion, how to assess the forecasting accuracy of the MIDAS regression model, and how to obtain a forecast aggregation of different MIDAS regression models. We illustrate the capabilities of the package with a simulated MIDAS regression model and give two empirical examples of the application of MIDAS regression.
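
A defining ingredient of MIDAS regressions is tying many high-frequency lag coefficients to a few parameters through a weight function; one standard choice is the normalized exponential Almon lag. The sketch below illustrates that functional in isolation (midasr offers several such weight functions, and the parameter values here are arbitrary).

```python
import math

def exp_almon_weights(theta1, theta2, n_lags):
    """Normalized exponential Almon lag weights, w_j ∝ exp(θ1·j + θ2·j²),
    a standard way MIDAS regressions reduce many high-frequency lag
    coefficients to a couple of parameters."""
    raw = [math.exp(theta1 * j + theta2 * j * j) for j in range(n_lags)]
    total = sum(raw)
    return [r / total for r in raw]

w = exp_almon_weights(0.1, -0.05, 12)
print(abs(sum(w) - 1.0) < 1e-12, w[0] > w[-1])  # True True
```

With a negative quadratic term the weights decay smoothly over the lags, which is the usual shape imposed on the high-frequency regressor.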

  5. Package models and the information crisis of prebiotic evolution.

    Science.gov (United States)

    Silvestre, Daniel A M M; Fontanari, José F

    2008-05-21

    The coexistence between different types of templates has been the choice solution to the information crisis of prebiotic evolution, triggered by the finding that a single RNA-like template cannot carry enough information to code for any useful replicase. In principle, confining d distinct templates of length L in a package or protocell, whose survival depends on the coexistence of the templates it holds, could resolve this crisis provided that d is made sufficiently large. Here we review the prototypical package model of Niesert et al. [1981. Origin of life between Scylla and Charybdis. J. Mol. Evol. 17, 348-353], which guarantees the greatest possible region of viability of the protocell population, and show that this model, and hence the entire package approach, does not resolve the information crisis. In particular, we show that the total information stored in a viable protocell (Ld) tends to a constant value that depends only on the spontaneous error rate per nucleotide of the template replication mechanism. As a result, an increase of d must be followed by a decrease of L, so that the net information gain is null.
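
The intuition behind the bound on the total information Ld can be made concrete with a one-line calculation: if each nucleotide is copied with error rate u, the probability that every template in a protocell replicates error-free depends on L and d only through their product. This is a simplified illustration of why Ld is the relevant quantity, not the paper's full population model.

```python
# Per-nucleotide error rate u (illustrative value). The chance that all
# d templates of length L copy without error is (1-u)^(L*d): a function
# of the product L*d alone, so trading d against L gains no information.
u = 0.01
p = lambda L, d: (1 - u) ** (L * d)
print(p(100, 10) == p(50, 20))  # True
```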

  6. RECERTIFICATION OF THE MODEL 9977 RADIOACTIVE MATERIAL PACKAGING

    Energy Technology Data Exchange (ETDEWEB)

    Abramczyk, G.; Bellamy, S.; Loftin, B.; Nathan, S.

    2013-06-05

    The Model 9977 Packaging was initially issued a Certificate of Compliance (CoC) by the Department of Energy’s Office of Environmental Management (DOE-EM) for the transportation of radioactive material (RAM) in the Fall of 2007. This first CoC was for a single radioactive material and two packing configurations. In the five years since that time, seven Addendums have been written to the Safety Analysis Report for Packaging (SARP) and five Letter Amendments have been written that have authorized either new RAM contents or packing configurations, or both. This paper will discuss the process of updating the 9977 SARP to include all the contents and configurations, including the addition of a new content, and its submittal for recertification.

  7. dglars: An R Package to Estimate Sparse Generalized Linear Models

    Directory of Open Access Journals (Sweden)

    Luigi Augugliaro

    2014-09-01

    Full Text Available dglars is a publicly available R package that implements the method proposed in Augugliaro, Mineo, and Wit (2013), developed to study the sparse structure of a generalized linear model. This method, called dgLARS, is based on a differential geometrical extension of the least angle regression method proposed in Efron, Hastie, Johnstone, and Tibshirani (2004). The core of the dglars package consists of two algorithms implemented in Fortran 90 to efficiently compute the solution curve: a predictor-corrector algorithm, proposed in Augugliaro et al. (2013), and a cyclic coordinate descent algorithm, proposed in Augugliaro, Mineo, and Wit (2012). The latter algorithm, as shown here, is significantly faster than the predictor-corrector algorithm. For comparison purposes, we have implemented both algorithms.

  8. Model-driven evolution of software architectures

    NARCIS (Netherlands)

    Graaf, B.S.

    2007-01-01

    Software evolves continuously. As a consequence, software systems tend to become increasingly complex and, as such, more difficult to change. A software system's complexity is for a large part determined by its structure, or architecture. In this thesis we investigate how to reduce the risks and costs associated with the evolution of software architectures.

  9. Informed-Proteomics: Open Source Software Package for Top-down Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jung Kap; Piehowski, Paul D.; Wilkins, Christopher S.; Zhou, Mowei; Mendoza, Joshua A.; Fujimoto, Grant M.; Gibbons, Bryson C.; Shaw, Jared B.; Shen, Yufeng; Shukla, Anil K.; Moore, Ronald J.; Liu, Tao; Petyuk, Vladislav A.; Tolic, Nikola; Pasa Tolic, Ljiljana; Smith, Richard D.; Payne, Samuel H.; Kim, Sangtae

    2017-08-07

    Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and in sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open-source software suite for top-down proteomics analysis consisting of an LC-MS feature-finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.

  10. Hierarchy-Based Team Software Process Simulation Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    According to the characteristics of the Team Software Process (TSP), a hierarchy-based model is adopted that combines a discrete-event model with a system dynamics model. This model represents TSP in the form of two levels: the inner level embodies the continuity of the software process, while the outer level embodies the software development process by phases. The structure and principle of the model are explained in detail, and then a formal description of the model is offered. Finally, an example is presented to demonstrate the simulation process and results. This model can simulate the team software process from various angles and can supervise and predict the software process. It can also make the management of software development more scientific and improve the quality of software.

  11. CARBayes: An R Package for Bayesian Spatial Modeling with Conditional Autoregressive Priors

    Directory of Open Access Journals (Sweden)

    Duncan Lee

    2013-11-01

    Full Text Available Conditional autoregressive models are commonly used to represent spatial autocorrelation in data relating to a set of non-overlapping areal units, which arise in a wide variety of applications including agriculture, education, epidemiology and image analysis. Such models are typically specified in a hierarchical Bayesian framework, with inference based on Markov chain Monte Carlo (MCMC) simulation. The most widely used software to fit such models is WinBUGS or OpenBUGS, but in this paper we introduce the R package CARBayes. The main advantage of CARBayes compared with the BUGS software is its ease of use, because: (1) the spatial adjacency information is easy to specify as a binary neighbourhood matrix; and (2) given the neighbourhood matrix, the models can be implemented by a single function call in R. This paper outlines the general class of Bayesian hierarchical models that can be implemented in the CARBayes software, describes their implementation via MCMC simulation techniques, and illustrates their use with two worked examples in the fields of house price analysis and disease mapping.
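
The binary neighbourhood matrix mentioned in point (1) is simple to construct from a list of shared borders. The sketch below (in Python rather than R, with a hypothetical four-area map) shows the kind of symmetric 0/1 matrix such packages expect.

```python
import numpy as np

# Binary neighbourhood matrix W for CAR models: W[i, j] = 1 when areal
# units i and j share a border. Hypothetical map of 4 areas:
borders = [(0, 1), (1, 2), (2, 3), (0, 3)]
n = 4
W = np.zeros((n, n), dtype=int)
for i, j in borders:
    W[i, j] = W[j, i] = 1            # adjacency is mutual, so W is symmetric

print(W.sum(axis=1).tolist())        # [2, 2, 2, 2] neighbours per area
```

The row sums form the diagonal matrix that, together with W, defines the precision structure of common CAR priors.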

  12. Fitting Additive Binomial Regression Models with the R Package blm

    Directory of Open Access Journals (Sweden)

    Stephanie Kovalchik

    2013-09-01

    Full Text Available The R package blm provides functions for fitting a family of additive regression models to binary data. The included models are the binomial linear model, in which all covariates have additive effects, and the linear-expit (lexpit) model, which allows some covariates to have additive effects and other covariates to have logistic effects. Additive binomial regression is a model of event probability, and the coefficients of linear terms estimate covariate-adjusted risk differences. Thus, in contrast to logistic regression, additive binomial regression puts the focus on absolute risk and risk differences. In this paper, we give an overview of the methodology we have developed to fit the binomial linear and lexpit models to binary outcomes from cohort and population-based case-control studies. We illustrate the blm package's methods for additive model estimation, diagnostics, and inference with risk-association analyses of a bladder cancer nested case-control study in the NIH-AARP Diet and Health Study.
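
The contrast between the additive (risk-difference) scale and the multiplicative (odds) scale of logistic regression can be shown with a hypothetical 2x2 table; the counts below are made up for illustration.

```python
# Hypothetical cohort: 30/100 events among exposed, 10/100 among unexposed.
p_exposed, p_unexposed = 30 / 100, 10 / 100

risk_difference = p_exposed - p_unexposed      # additive-scale effect
odds_ratio = (p_exposed / (1 - p_exposed)) / (p_unexposed / (1 - p_unexposed))

print(round(risk_difference, 2), round(odds_ratio, 2))  # 0.2 3.86
```

Additive binomial models report the first quantity directly, adjusted for covariates, whereas logistic regression coefficients live on the odds-ratio scale.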

  13. A QFD-based decision making model for computer-aided design software selection

    Directory of Open Access Journals (Sweden)

    Kanika Prasad

    2016-03-01

    Full Text Available With the progress in technology and innovation in product development, the contribution of computer-aided design (CAD) software to the design and manufacture of parts/products is growing significantly. Selection of an appropriate CAD software is not a trifling task, as it involves analyzing the appositeness of the available software packages to the unique requirements of the organization. The existence of a large number of CAD software vendors, discordance among different hardware and software systems, and the decision makers' dearth of technical knowledge and experience further complicate the selection procedure. Moreover, there are very few published research papers related to CAD software selection, and the majority of them have either employed criteria weights computed from the subjective judgements of the end users or failed to incorporate the voice of the customer in the decision-making process. Quality function deployment (QFD) is a well-known technique for determining the relative importance of customer-defined criteria for the selection of any product or service. Therefore, this paper deals with the design and development of a QFD-based decision-making model in Visual BASIC 6.0 for the selection of CAD software for manufacturing organizations. To demonstrate the applicability and potential of the developed model in the form of a software prototype, two illustrative examples are also provided.
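
The QFD step of turning customer importances into selection-criteria weights can be sketched numerically. The 9/3/1 relationship scale is the conventional QFD one, but all the numbers below are made up, and this is only the weighting core, not the paper's full decision model.

```python
import numpy as np

# QFD in miniature: customer-requirement importances combined with a
# requirement-vs-technical-criterion relationship matrix give relative
# weights for the software-selection criteria.
importance = np.array([5, 3, 2])          # three customer requirements
relationship = np.array([[9, 3, 1],       # rows: requirements
                         [1, 9, 3],       # cols: technical criteria
                         [3, 1, 9]])      # 9 = strong, 3 = medium, 1 = weak

raw = importance @ relationship
weights = raw / raw.sum()
print(raw.tolist())  # [54, 44, 32]
```

Candidate CAD packages would then be scored against each criterion and ranked by their weight-adjusted totals.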

  14. Transfer of computer software technology through workshops: The case of fish bioenergetics modeling

    Science.gov (United States)

    Johnson, B.L.

    1992-01-01

    A three-part program is proposed to promote the availability and use of computer software packages among fishery managers and researchers. The approach consists of journal articles that announce new technologies, technical reports that serve as user's guides, and hands-on workshops that provide direct instruction to new users. Workshops, which allow experienced users to directly instruct novices in software operation and application, are important but often neglected. The author's experience with organizing and conducting bioenergetics modeling workshops suggests the optimal workshop would take 2 days and have 10-15 participants, one computer for every two users, and one instructor for every 5-6 people.

  15. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating the accuracy of proprietary baseline energy modeling software in predicting energy use over a period of interest, such as a month or a year. The procedures are designed according to the methodology used for public-domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while stakeholders are still able to assess its performance.
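Prediction quality for baseline energy models is commonly summarized with CV(RMSE) and NMBE. A minimal sketch of those two metrics, with hypothetical monthly data (the report's actual test procedure and metric definitions may differ):

```python
# Prediction-quality metrics commonly used for baseline energy models:
# CV(RMSE) and NMBE, computed against hypothetical monthly energy data.
import math

def cvrmse(actual, predicted):
    # Root-mean-square error, normalized by the mean of the actual data.
    n = len(actual)
    mean_actual = sum(actual) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return rmse / mean_actual

def nmbe(actual, predicted):
    # Mean bias, normalized by the mean of the actual data.
    n = len(actual)
    mean_actual = sum(actual) / n
    bias = sum(a - p for a, p in zip(actual, predicted)) / n
    return bias / mean_actual

actual = [100.0, 110.0, 95.0, 105.0]     # hypothetical metered use
predicted = [98.0, 112.0, 96.0, 103.0]   # hypothetical model output

print(f"CV(RMSE): {cvrmse(actual, predicted):.4f}")
print(f"NMBE:     {nmbe(actual, predicted):.4f}")
```

Because both metrics look only at predictions against metered data, they fit the report's goal of evaluating software without inspecting its internal algorithms.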

  16. Programski paket za realizaciju procene profesionalnog rizika na radnom mestu / Software package for risk assessment in the workplace

    Directory of Open Access Journals (Sweden)

    Zoran Novaković

    2008-04-01

    Full Text Available The new national Occupational Safety and Health Law introduces a number of new obligations for employers, among which the activities related to drafting a risk assessment act for all workplaces stand out in importance and complexity. A project to develop a software package for conducting risk assessment in the workplace and in the work environment was initiated, based on needs arising from significant legislative changes in the domain of occupational safety and health. This should serve as a basis for further upgrading toward an integrated software solution for managing occupational safety and health.

  17. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    Directory of Open Access Journals (Sweden)

    Philipp Thomas

    Full Text Available The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system GiNaC with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network

  18. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.

  19. The GEMPAK Barnes interactive objective map analysis scheme [General Meteorological Software Package]

    Science.gov (United States)

    Koch, S. E.; Kocin, P. J.; Desjardins, M.

    1983-01-01

    The analysis scheme and meteorological applications of the GEMPAK data analysis and display software system developed by NASA are described. The program was devised to permit objective, versatile, and practical analysis of satellite meteorological data using a minicomputer and a display system with graphics capability. A data area can be selected within the data file for the globe, and data-sparse regions can be avoided. Distances between observations and the nearest observation points are calculated in order to avoid errors when determining synoptic weather conditions. The Barnes (1973) successive correction method is employed to restore the amplitude of small yet resolvable wavelengths suppressed in an initial filtering pass. The rms deviation is then calculated in relation to available measured data. Examples are provided of treatment of VISSR data from the GOES satellite and a study of the impact of incorrect cloud height data on synoptic weather field analysis.
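The successive-correction idea mentioned above can be sketched compactly: a first Gaussian-weighted pass produces a smooth field, and a second pass with a sharper weight adds back the residuals at the observation points. The following is a minimal 1D Python illustration with hypothetical observations and tuning constants, not the GEMPAK code:

```python
# Sketch of a Barnes (1973) successive-correction analysis in 1D:
# a first Gaussian-weighted pass, then a correction pass with a
# reduced length scale (gamma < 1) that restores small-scale amplitude.
import math

def barnes_pass(grid_x, obs_x, obs_val, kappa):
    analysis = []
    for x in grid_x:
        weights = [math.exp(-((x - xo) ** 2) / kappa) for xo in obs_x]
        total = sum(weights)
        analysis.append(sum(w * v for w, v in zip(weights, obs_val)) / total)
    return analysis

def barnes_analysis(grid_x, obs_x, obs_val, kappa=1.0, gamma=0.3):
    # Pass 1: smooth first-guess field on the grid.
    first = barnes_pass(grid_x, obs_x, obs_val, kappa)
    # Residuals at the observation points, using the same weighting.
    at_obs = barnes_pass(obs_x, obs_x, obs_val, kappa)
    residuals = [v - g for v, g in zip(obs_val, at_obs)]
    # Pass 2: add back residuals with a sharper weight.
    correction = barnes_pass(grid_x, obs_x, residuals, gamma * kappa)
    return [f + c for f, c in zip(first, correction)]

obs_x = [0.0, 1.0, 2.0, 3.0]       # hypothetical station positions
obs_val = [1.0, 3.0, 2.0, 4.0]     # hypothetical observed values
grid_x = [0.5, 1.5, 2.5]
result = barnes_analysis(grid_x, obs_x, obs_val)
print([round(v, 3) for v in result])
```

The choice of gamma controls how much of the initially suppressed small-wavelength amplitude is restored, which is the tuning decision the abstract alludes to.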

  20. Estimating Population Abundance Using Sightability Models: R SightabilityModel Package

    Directory of Open Access Journals (Sweden)

    John R. Fieberg

    2012-11-01

    Full Text Available Sightability models are binary logistic-regression models used to estimate and adjust for visibility bias in wildlife-population surveys (Steinhorst and Samuel 1989). Estimation proceeds in 2 stages: (1) Sightability trials are conducted with marked individuals, and logistic regression is used to estimate the probability of detection as a function of available covariates (e.g., visual obstruction, group size). (2) The fitted model is used to adjust counts (from future surveys) for animals that were not observed. A modified Horvitz-Thompson estimator is used to estimate abundance: counts of observed animal groups are divided by their inclusion probabilities (determined by plot-level sampling probabilities and the detection probabilities estimated in stage 1). We provide a brief historical account of the approach, clarifying and documenting suggested modifications to the variance estimators originally proposed by Steinhorst and Samuel (1989). We then introduce a new R package, SightabilityModel, for estimating abundance using this technique. Lastly, we illustrate the software with a series of examples using data collected from moose (Alces alces) in northeastern Minnesota and mountain goats (Oreamnos americanus) in Washington State.
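The two-stage estimator described above can be sketched in a few lines. The Python fragment below (hypothetical logistic coefficients and survey data; not the SightabilityModel R API) applies a fitted detection model to observed groups and forms the Horvitz-Thompson-style sum:

```python
# Sketch of the two-stage sightability correction: a fitted logistic
# detection model (hypothetical coefficients) gives each observed group
# a detection probability, and a Horvitz-Thompson-style sum divides each
# group's count by its inclusion and detection probabilities.
import math

def detection_prob(visual_obstruction, b0=1.5, b1=-3.0):
    # Stage 1 result: logistic sightability model with hypothetical
    # coefficients; detection drops as visual obstruction rises.
    eta = b0 + b1 * visual_obstruction
    return 1.0 / (1.0 + math.exp(-eta))

def abundance_estimate(groups, plot_sampling_prob):
    # Stage 2: modified Horvitz-Thompson estimator over observed groups.
    total = 0.0
    for count, obstruction in groups:
        p_detect = detection_prob(obstruction)
        total += count / (plot_sampling_prob * p_detect)
    return total

# Observed groups: (animals counted, visual obstruction in [0, 1]).
groups = [(3, 0.2), (5, 0.6), (2, 0.8)]
estimate = abundance_estimate(groups, plot_sampling_prob=0.5)
print(f"estimated abundance: {estimate:.1f}")
```

Because each observed count is inflated by both the plot sampling probability and the detection probability, the estimate necessarily exceeds the raw count of 10 animals.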

  1. Radiobiological modeling with MarCell software

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, J.S.; Jones, T.D. [Oak Ridge National Lab., TN (United States). Health Sciences Research Div.

    1999-01-01

    A nonlinear system of differential equations that models the bone marrow cellular kinetics associated with radiation injury, molecular repair, and compensatory cell proliferation has been extensively documented. Recently, that model has been implemented as MarCell, a user-friendly MS-DOS computer program that allows users with little knowledge of the original model to evaluate complex radiation exposure scenarios. The software allows modeling with the following radiations: tritium beta, 100 kVp X, 250 kVp X, 22 MV X, {sup 60}Co, {sup 137}Cs, 2 MeV electrons, TRIGA neutrons, D-T neutrons, and 3 blends of mixed-field fission radiations. The possible cell lineages are stem, stroma, and leukemia/lymphoma, and the available species include mouse, rat, dog, sheep, swine, burro, and man. An attractive mathematical feature is that any protracted protocol can be expressed as an equivalent prompt dose for either the source used or for a reference, such as 250 kVp X rays or {sup 60}Co. Output from MarCell includes: risk of 30-day mortality; risk of cancer and leukemia based either on cytopenia or compensatory cell proliferation; cell survival plots as a function of time or dose; and 4-week recovery kinetics following treatment. In this article, the program's applicability and ease of use are demonstrated by evaluating a medical total body irradiation protocol and a nuclear fallout scenario.

  2. gems: An R Package for Simulating from Disease Progression Models

    Directory of Open Access Journals (Sweden)

    Nello Blaser

    2015-03-01

    Full Text Available Mathematical models of disease progression predict disease outcomes and are useful epidemiological tools for planners and evaluators of health interventions. The R package gems is a tool that simulates disease progression in patients and predicts the effect of different interventions on patient outcome. Disease progression is represented by a series of events (e.g., diagnosis, treatment, and death), displayed in a directed acyclic graph. The vertices correspond to disease states and the directed edges represent events. The package gems allows simulations based on a generalized multistate model that can be described by a directed acyclic graph with continuous transition-specific hazard functions. The user can specify an arbitrary hazard function and its parameters. The model includes parameter uncertainty, does not need to be a Markov model, and may take the history of previous events into account. Applications are not limited to the medical field and extend to other areas where multistate simulation is of interest. We provide a technical explanation of the multistate models used by gems, explain the functions of gems and their arguments, and show a sample application.
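The multistate simulation idea can be sketched with the simplest special case: constant (exponential) transition-specific hazards on a directed acyclic graph of states. The Python fragment below uses hypothetical states and rates and is only a conceptual illustration (gems itself is an R package and supports arbitrary, history-dependent hazards):

```python
# Minimal sketch of a multistate progression simulation over a directed
# acyclic graph, with exponential (constant-hazard) transition times;
# the states, edges, and rates here are hypothetical.
import random

# Each state maps to its outgoing edges: (next state, hazard rate).
transitions = {
    "healthy":   [("diagnosed", 0.10), ("death", 0.01)],
    "diagnosed": [("treated", 0.50), ("death", 0.05)],
    "treated":   [("death", 0.02)],
    "death":     [],  # absorbing state
}

def simulate_patient(rng, start="healthy", horizon=100.0):
    """Follow one patient through competing exponential transitions."""
    state, t, history = start, 0.0, [(0.0, start)]
    while transitions[state]:
        # Draw a competing time for each outgoing edge; take the earliest.
        draws = [(rng.expovariate(rate), nxt) for nxt, rate in transitions[state]]
        dt, nxt = min(draws)
        if t + dt > horizon:
            break  # censored at the simulation horizon
        t += dt
        state = nxt
        history.append((t, state))
    return history

rng = random.Random(42)
paths = [simulate_patient(rng) for _ in range(1000)]
died = sum(1 for h in paths if h[-1][1] == "death")
print(f"died within horizon: {died}/1000")
```

Repeating the simulation under modified rates (e.g., a higher treatment rate) is how such a model predicts the effect of an intervention on outcomes.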

  3. A Thermodynamic Model for Genome Packaging in Hepatitis B Virus.

    Science.gov (United States)

    Kim, Jehoon; Wu, Jianzhong

    2015-10-20

    Understanding the fundamentals of genome packaging in viral capsids is important for finding effective antiviral strategies and for utilizing benign viral particles for gene therapy. While the structure of encapsidated genomic materials has been routinely characterized with experimental techniques such as cryo-electron microscopy and x-ray diffraction, much less is known about the molecular driving forces underlying genome assembly in an intracellular environment and its in vivo interactions with the capsid proteins. Here we study the thermodynamic basis of the pregenomic RNA encapsidation in human Hepatitis B virus in vivo using a coarse-grained molecular model that captures the essential components of nonspecific intermolecular interactions. The thermodynamic model is used to examine how the electrostatic interaction between the packaged RNA and the highly charged C-terminal domains (CTD) of capsid proteins regulate the nucleocapsid formation. The theoretical model predicts optimal RNA content in Hepatitis B virus nucleocapsids with different CTD lengths in good agreement with mutagenesis measurements, confirming the predominant role of electrostatic interactions and molecular excluded-volume effects in genome packaging. We find that the amount of encapsidated RNA is not linearly correlated with the net charge of CTD tails as suggested by earlier theoretical studies. Our thermodynamic analysis of the nucleocapsid structure and stability indicates that ∼10% of the CTD residues are free from complexation with RNA, resulting in partially exposed CTD tails. The thermodynamic model also predicts the free energy of complex formation between macromolecules, which corroborates experimental results for the impact of CTD truncation on the nucleocapsid stability.

  4. Hardware and software package for search, detection and first aid means delivery in rough terrain on basis of a three rotor unmanned aerial vehicle

    Directory of Open Access Journals (Sweden)

    Sergii FIRSOV

    2014-06-01

    Full Text Available Unmanned aerial vehicles are used to solve dangerous tasks, and the search for and detection of injured people in rough terrain is one of them. Thus, vertical take-off unmanned aerial vehicles are of special interest. A hardware and software package for solving this task is proposed in the article.

  5. Singularity of Some Software Reliability Models and Parameter Estimation Method

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    According to the principle “The failure data is the basis of software reliability analysis”, we built a software reliability expert system (SRES) by adopting artificial intelligence technology. By reasoning out conclusions from the fitting results for the failure data of a software project, the SRES can recommend to users “the most suitable model” as a software reliability measurement model. We believe that the SRES can effectively overcome inconsistencies in the application of software reliability models. We report investigation results on the singularity and parameter estimation methods of the experimental models in the SRES.

  6. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the t

  7. Evaluation of sequence alignments and oligonucleotide probes with respect to three-dimensional structure of ribosomal RNA using ARB software package

    Directory of Open Access Journals (Sweden)

    Meier Harald

    2006-05-01

    Full Text Available Abstract Background Availability of high-resolution RNA crystal structures for the 30S and 50S ribosomal subunits and the subsequent validation of comparative secondary structure models have prompted biologists to use the three-dimensional structure of ribosomal RNA (rRNA) for evaluating sequence alignments of rRNA genes. Furthermore, the secondary and tertiary structural features of rRNA are highly useful and successfully employed in designing rRNA-targeted oligonucleotide probes intended for in situ hybridization experiments. RNA3D, a program to combine sequence alignment information with the three-dimensional structure of rRNA, was developed. Integration into the ARB software package, which is used extensively by the scientific community for phylogenetic analysis and molecular probe designing, has substantially extended the functionality of the ARB software suite with a 3D environment. Results The three-dimensional structure of rRNA is visualized in an OpenGL 3D environment with the ability to change the display and overlay information onto the molecule dynamically. Phylogenetic information derived from the multiple sequence alignments can be overlaid onto the molecule structure in real time. Superimposition of both statistical and non-statistical sequence-associated information onto the rRNA 3D structure can be done using a customizable color scheme, which is also applied to a textual sequence alignment for reference. Oligonucleotide probes designed by ARB probe design tools can be mapped onto the 3D structure along with the probe accessibility models for evaluation with respect to secondary and tertiary structural conformations of rRNA. Conclusion Visualization of the three-dimensional structure of rRNA in an intuitive display provides biologists with greater possibilities to carry out structure-based phylogenetic analysis. Coupled with secondary structure models of rRNA, the RNA3D program aids in validating the sequence alignments of rRNA genes and evaluating

  8. Mathematical modeling of materially nonlinear problems in structural analyses, Part II: Application in contemporary software

    Directory of Open Access Journals (Sweden)

    Bonić Zoran

    2010-01-01

    Full Text Available The paper presents the application of nonlinear material models in the software package Ansys. The theoretical development of the models is presented in Part I of this paper (Mathematical modeling of materially nonlinear problems in structural analyses - theoretical foundations); here, the incremental-iterative procedure used by this package for solving materially nonlinear problems is described, and an example of modeling a spread footing using the bilinear kinematic and Drucker-Prager models is given. A comparative analysis of the results obtained by this modeling and the experimental research of the author was made. The load level that corresponds to plastic deformation was noted, and the development of deformations with increasing load, as well as the distribution of dilatation in the footing, was observed. Comparison of the calculated and measured values of reinforcement dilatation shows their very good agreement.

  9. DYNSTALL: Subroutine package with a dynamic stall model

    Energy Technology Data Exchange (ETDEWEB)

    Bjoerck, Anders [Aeronautical Research Inst. of Sweden, Bromma (Sweden)

    2001-03-01

    A subroutine package, called DYNSTALL, for the calculation of 2D unsteady airfoil aerodynamics is described. The subroutines are written in FORTRAN. DYNSTALL is basically an implementation of the Beddoes-Leishman dynamic stall model, a semi-empirical model for dynamic stall. It also includes models for attached-flow unsteady aerodynamics, and is thus complete in the sense that it treats attached flow as well as separated flow. Semi-empirical means that the model relies on empirically determined constants; semi because the constants appear in equations with some physical interpretation. The package requires the input of 2D airfoil aerodynamic data via tables as a function of angle of attack. The method is intended for use in an aeroelastic code with the aerodynamics solved by the blade-element method. DYNSTALL was written to work for any 2D angle of attack relative to the airfoil, e.g. flow from the rear of an airfoil.

  10. Development of integrated software project planning model

    OpenAIRE

    Manalif, Ekananta; Capretz, Luiz Fernando; Ho, Danny

    2012-01-01

    As the most uncertain and complex type of project compared to other types, a software development project depends heavily on the outcome of the software project planning phase, which helps project managers by predicting the project demands with respect to budgeting, scheduling, and the allocation of resources. The two main activities in software project planning are effort estimation and risk assessment, which have to be executed together because the accuracy of the effort estimation is ...

  11. FullSWOF: A free software package for the simulation of shallow water flows

    CERN Document Server

    Delestre, Olivier; James, Francois; Lucas, Carine; Laguerre, Christian; Cordier, Stephane

    2014-01-01

    Numerical simulations of flows are required for numerous applications and are usually carried out using shallow water equations. We describe the FullSWOF software, which is based on up-to-date finite volume methods and well-balanced schemes to solve this kind of equation. It consists of a set of open source C++ codes, freely available to the community, easy to use, and open for further development. Several features make FullSWOF particularly suitable for applications in hydrology: small water heights and wet-dry transitions are robustly handled, rainfall and infiltration are incorporated, and data from grid-based digital topographies can be used directly. A detailed mathematical description is given here, and the capabilities of FullSWOF are illustrated based on analytic solutions and datasets of real cases. The codes, available in 1D and 2D versions, have been validated on a large set of benchmark cases, which are available together with the download information and documentation at http://www.univ-orleans....

  12. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

    The purpose of this project is to develop a low cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volumes and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. 1. Rationale and objectives of the project: Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade

  13. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... structures, supporting actor involvement in the ecosystem, and (v) proper orchestration and governance of the ecosystem to promote and support the changes and the health of the ecosystem. Our work contributes to Net4Care, a platform to serve as the common platform in the software ecosystem under...

  14. SPOTting Model Parameters Using a Ready-Made Python Package.

    Science.gov (United States)

    Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz

    2015-01-01

    The choice of a specific parameter estimation method is often driven more by its availability than by its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms, 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel from the workstation to large computation clusters using the Message Passing Interface (MPI). We tested SPOTPY in five different case studies: to parameterize the Rosenbrock, Griewank and Ackley functions; a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function; and a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having one package at hand that includes a number of well-performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function.
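The sampling-based calibration idea behind such tools can be sketched on the Rosenbrock function, one of the test cases named above. This is a conceptual illustration using plain random search, not the SPOTPY API or any of its eight algorithms:

```python
# Conceptual sketch of sampling-based parameter search on the Rosenbrock
# function (global minimum 0 at x = y = 1). Plain random search stands in
# for the more sophisticated samplers a calibration package would offer.
import random

def rosenbrock(x, y):
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def random_search(objective, bounds, n_samples=20000, seed=1):
    rng = random.Random(seed)
    best_params, best_value = None, float("inf")
    for _ in range(n_samples):
        # Draw each parameter uniformly from its prior range.
        params = [rng.uniform(lo, hi) for lo, hi in bounds]
        value = objective(*params)
        if value < best_value:
            best_params, best_value = params, value
    return best_params, best_value

bounds = [(-2.0, 2.0), (-2.0, 2.0)]  # prior range per parameter
best_params, best_value = random_search(rosenbrock, bounds)
print(f"best value {best_value:.4f} at {best_params}")
```

The model-independent structure the abstract highlights corresponds to the fact that `random_search` only ever calls `objective`: swapping in a different model or objective function requires no change to the sampler.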

  16. Technical Review Report for the Model 9975-96 Package Safety Analysis Report for Packaging (S-SARP-G-00003, Revision 0, January 2008)

    Energy Technology Data Exchange (ETDEWEB)

    West, M

    2009-05-22

    This Technical Review Report (TRR) documents the review, performed by the Lawrence Livermore National Laboratory (LLNL) Staff, at the request of the U.S. Department of Energy (DOE), on the Safety Analysis Report for Packaging, Model 9975, Revision 0, dated January 2008 (S-SARP-G-00003, the SARP). The review includes an evaluation of the SARP, with respect to the requirements specified in 10 CFR 71, and in International Atomic Energy Agency (IAEA) Safety Standards Series No. TS-R-1. The Model 9975-96 Package is a 35-gallon drum package design that has evolved from a family of packages designed by DOE contractors at the Savannah River Site. Earlier package designs, i.e., the Model 9965, the Model 9966, the Model 9967, and the Model 9968 Packagings, were originally designed and certified in the early 1980s. In the 1990s, updated package designs that incorporated design features consistent with the then newer safety requirements were proposed. The updated package designs at the time were the Model 9972, the Model 9973, the Model 9974, and the Model 9975 Packagings, respectively. The Model 9975 Package was certified by the Packaging Certification Program, under the Office of Safety Management and Operations. The safety analysis of the Model 9975-85 Packaging is documented in the Safety Analysis Report for Packaging, Model 9975, B(M)F-85, Revision 0, dated December 2003. The Model 9975-85 Package is certified by DOE Certificate of Compliance (CoC) package identification number, USA/9975/B(M)F-85, for the transportation of Type B quantities of uranium metal/oxide, {sup 238}Pu heat sources, plutonium/uranium metals, plutonium/uranium oxides, plutonium composites, plutonium/tantalum composites, {sup 238}Pu oxide/beryllium metal.

  17. A Machine Learning based Efficient Software Reusability Prediction Model for Java Based Object Oriented Software

    Directory of Open Access Journals (Sweden)

    Surbhi Maggo

    2014-01-01

    Full Text Available Software reuse refers to the development of new software systems with the likelihood of completely or partially using existing components or resources, with or without modification. Reusability is the measure of the ease with which previously acquired concepts and objects can be used in new contexts. It is a promising strategy for improvements in software quality, productivity and maintainability, as it provides for cost-effective, reliable (with the consideration that prior testing and use has eliminated bugs) and accelerated (reduced time to market) development of software products. In this paper we present an efficient automation model for the identification and evaluation of reusable software components to measure the reusability levels (high, medium or low) of Java-based (object-oriented) software systems. The presented model uses a metric framework for the functional analysis of the object-oriented software components that targets essential attributes of reusability analysis, also taking the Maintainability Index into consideration to account for partial reuse. Further, the machine learning algorithm LMNN is explored to establish relationships between the functional attributes. The model works at the functional level rather than at the structural level. The system is implemented as a tool in Java, and the performance of the automation tool developed is recorded using criteria like precision, recall, accuracy and error rate. The results gathered indicate that the model can be effectively used as an efficient, accurate, fast and economic model for the identification of procedure-based reusable components from the existing inventory of software resources.
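The evaluation criteria named above (precision, recall, accuracy, error rate) all derive from a confusion matrix. A minimal sketch, with a hypothetical confusion matrix for a binary "reusable vs. not reusable" classification (the paper's actual figures are not reproduced here):

```python
# Evaluation criteria for a binary classifier, computed from a
# hypothetical confusion matrix (tp/fp/fn/tn counts are illustrative).
def classification_metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "precision": tp / (tp + fp),        # of components flagged reusable, how many are
        "recall": tp / (tp + fn),           # of truly reusable components, how many were found
        "accuracy": (tp + tn) / total,      # overall fraction classified correctly
        "error_rate": (fp + fn) / total,    # overall fraction misclassified
    }

# Hypothetical counts: 40 true positives, 5 false positives,
# 10 false negatives, 45 true negatives.
metrics = classification_metrics(tp=40, fp=5, fn=10, tn=45)
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")
```

Note that accuracy and error rate always sum to 1, so reporting both is redundant but conventional.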

  18. Traceability for Model Driven, Software Product Line Engineering

    NARCIS (Netherlands)

    Anquetil, N.; Grammel, B.; Galvao Lourenco da Silva, I.; Noppen, J.A.R.; Shakil Khan, S.; Arboleda, H.; Rashid, A.; Garcia, A.

    2008-01-01

    Traceability is an important challenge for software organizations. This is true for traditional software development and even more so in new approaches that introduce more variety of artefacts such as Model Driven development or Software Product Lines. In this paper we look at some aspect of the int

  19. The production-distribution problem with order acceptance and package delivery: models and algorithm

    Directory of Open Access Journals (Sweden)

    Khalili Majid

    2016-01-01

    Full Text Available Production planning and distribution are among the most important decisions in the supply chain. Classically, in this problem it is assumed that all orders have to be produced and separately delivered, while in practice an order may be rejected if the cost that it brings to the supply chain exceeds its revenue. Moreover, orders can be delivered in batches to reduce the related costs. This paper considers the production planning and distribution problem with order acceptance and package delivery to maximize profit. First, a new mathematical model based on mixed integer linear programming is developed. Using commercial optimization software, the model can optimally solve small and even medium sized instances. For large instances, a solution method based on imperialist competitive algorithms is also proposed. Using numerical experiments, the proposed model and algorithm are evaluated.
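The trade-off the model captures can be sketched with a tiny brute-force search: accept the subset of orders whose pooled margin, after sharing a single batch-delivery cost, is largest. The paper's MILP and imperialist competitive algorithm handle realistic sizes; all figures below are hypothetical:

```python
from itertools import combinations

# Hypothetical orders: (revenue, production_cost). A fixed delivery cost is
# paid once per batch, so accepted orders share it -- the package-delivery
# effect described above.
orders = [(10.0, 4.0), (8.0, 7.5), (5.0, 1.0), (6.0, 5.9)]
DELIVERY_COST = 3.0  # one batch shipment for all accepted orders

def profit(accepted):
    if not accepted:
        return 0.0
    margin = sum(r - c for r, c in accepted)
    return margin - DELIVERY_COST

# Exhaustive search over all acceptance decisions (fine for tiny instances).
best = max(
    (subset for n in range(len(orders) + 1)
            for subset in combinations(orders, n)),
    key=profit,
)
print(sorted(best), round(profit(best), 2))
```

Here every order has a positive margin, so with a shared delivery cost it is optimal to accept all four; rejecting an order becomes optimal as soon as its margin turns negative.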

  20. VAR, SVAR and SVEC Models: Implementation Within R Package vars

    Directory of Open Access Journals (Sweden)

    Bernhard Pfaff

    2008-02-01

    Full Text Available The structure of the package vars and its implementation of vector autoregressive, structural vector autoregressive and structural vector error correction models are explained in this paper. In addition to the three cornerstone functions VAR(), SVAR() and SVEC() for estimating such models, functions for diagnostic testing, estimation of restricted models, prediction, causality analysis, impulse response analysis and forecast error variance decomposition are provided as well. It is further possible to convert vector error correction models into their level VAR representation. The different methods and functions are elucidated by employing a macroeconomic data set for Canada. However, the focus of this paper is on the implementation rather than the usage of the tools at hand.
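vars is an R package; as a language-neutral illustration of what its VAR() function estimates, the following sketch fits a bivariate VAR(1) by ordinary least squares on simulated data (a toy analogue, not the package's code):

```python
import random

random.seed(1)

# Simulate a bivariate VAR(1): y_t = A @ y_{t-1} + noise.
A = [[0.5, 0.1],
     [0.2, 0.4]]
y = [[0.0, 0.0]]
for _ in range(5000):
    prev = y[-1]
    y.append([
        A[0][0] * prev[0] + A[0][1] * prev[1] + random.gauss(0, 1),
        A[1][0] * prev[0] + A[1][1] * prev[1] + random.gauss(0, 1),
    ])

# OLS: A_hat = (Y'X)(X'X)^{-1}, with X the lagged values.
X = y[:-1]
Y = y[1:]
def xtx(i, j): return sum(x[i] * x[j] for x in X)
def ytx(i, j): return sum(a[i] * b[j] for a, b in zip(Y, X))
det = xtx(0, 0) * xtx(1, 1) - xtx(0, 1) * xtx(1, 0)
inv = [[xtx(1, 1) / det, -xtx(0, 1) / det],
       [-xtx(1, 0) / det, xtx(0, 0) / det]]
A_hat = [[sum(ytx(i, k) * inv[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
print([[round(v, 2) for v in row] for row in A_hat])
```

With 5000 observations the estimated coefficient matrix recovers A closely; the R package adds deterministic terms, lag selection and inference on top of this core regression.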

  1. USER STORY SOFTWARE ESTIMATION: A SIMPLIFICATION OF SOFTWARE ESTIMATION MODEL WITH DISTRIBUTED EXTREME PROGRAMMING ESTIMATION TECHNIQUE

    OpenAIRE

    Ridi Ferdiana; Paulus Insap Santoso; Lukito Edi Nugroho; Ahmad Ashari

    2011-01-01

    Software estimation is an area of software engineering concerned with the identification, classification and measurement of features of software that affect the cost of developing and sustaining computer programs [19]. Measuring software through estimation serves to gauge the complexity of the software, estimate the required human resources, and gain better visibility into execution and the process model. There are many software estimation models that work sufficiently well in certain conditions or s...

  2. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    Science.gov (United States)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness-of-fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results
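The BBP's validation mode computes several goodness-of-fit measures; as a minimal, hypothetical stand-in (not one of the platform's actual metrics), a normalized RMS misfit between observed and simulated waveforms can be sketched as:

```python
import math

def rms_misfit(obs, sim):
    """Normalized RMS misfit: 0 means a perfect match."""
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum(o ** 2 for o in obs)
    return math.sqrt(num / den)

# Toy "observed" record and two candidate simulations.
t = [i * 0.01 for i in range(500)]
obs = [math.sin(2 * math.pi * 2.0 * x) * math.exp(-x) for x in t]
good = [0.9 * v for v in obs]                       # slightly low amplitude
bad = [math.sin(2 * math.pi * 3.0 * x) for x in t]  # wrong frequency

print(round(rms_misfit(obs, good), 3))  # 0.1
print(round(rms_misfit(obs, bad), 3))
```

The platform's real measures are richer (e.g., period-dependent spectral comparisons), but they serve the same purpose: ranking simulation methods by how closely they reproduce observed records.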

  3. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model...... and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method...

  4. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO......) that deals with this problem. We present the main concepts and rationales behind this notation and discuss a prototype and run-time environment that executes these models, and provides an API so that other parts of the software can be easily integrated. The core concepts of the ECNO seem to be stabilizing...

  5. A Comparison Between Five Models Of Software Engineering

    Directory of Open Access Journals (Sweden)

    Nabil Mohammed Ali Munassar

    2010-09-01

    Full Text Available This research deals with a vital and important issue in the computer world. It is concerned with the software management processes that examine the area of software development through development models, which are known as the software development life cycle. It presents five development models, namely waterfall, iteration, V-shaped, spiral and extreme programming. These models have advantages as well as disadvantages. Therefore, the main objective of this research is to present the different models of software development and make a comparison between them to show the features and defects of each model.

  6. The purely functional software deployment model

    NARCIS (Netherlands)

    Dolstra, E.

    2006-01-01

    Software deployment is the set of activities related to getting software components to work on the machines of end users. It includes activities such as installation, upgrading, uninstallation, and so on. Many tools have been developed to support deployment, but they all have serious limitations wi

  7. Model-driven and software product line engineering

    CERN Document Server

    Royer, Jean-Claude

    2013-01-01

    Many approaches to creating Software Product Lines have emerged that are based on Model-Driven Engineering. This book introduces both Software Product Lines and Model-Driven Engineering, which have separate success stories in industry, and focuses on the practical combination of them. It describes the challenges and benefits of merging these two software development trends and provides the reader with a novel approach and practical mechanisms to improve software development productivity.The book is aimed at engineers and students who wish to understand and apply software product lines

  8. Continuous Time Structural Equation Modeling with R Package ctsem

    Directory of Open Access Journals (Sweden)

    Charles C. Driver

    2017-04-01

    Full Text Available We introduce ctsem, an R package for continuous time structural equation modeling of panel (N > 1) and time series (N = 1) data, using full information maximum likelihood. Most dynamic models (e.g., cross-lagged panel models) in the social and behavioural sciences are discrete time models. An assumption of discrete time models is that time intervals between measurements are equal, and that all subjects were assessed at the same intervals. Violations of this assumption are often ignored due to the difficulty of accounting for varying time intervals; therefore parameter estimates can be biased and the time course of effects becomes ambiguous. By using stochastic differential equations to estimate an underlying continuous process, continuous time models allow for any pattern of measurement occasions. By interfacing to OpenMx, ctsem combines the flexible specification of structural equation models with the enhanced data gathering opportunities and improved estimation of continuous time models. ctsem can estimate relationships over time for multiple latent processes, measured by multiple noisy indicators with varying time intervals between observations. Within and between effects are estimated simultaneously by modeling both observed covariates and unobserved heterogeneity. Exogenous shocks with different shapes, group differences, higher order diffusion effects and oscillating processes can all be simply modeled. We first introduce and define continuous time models, then show how to specify and estimate a range of continuous time models using ctsem.
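ctsem is an R package; the core idea it rests on can be shown in a few lines for the scalar case: the discrete-time autoregressive effect implied by a continuous drift over an interval dt is exp(drift * dt), so unequal intervals are handled naturally (the drift value below is arbitrary, chosen for illustration):

```python
import math

# Continuous-time first-order process: dx = drift * x dt + noise, drift < 0.
# The implied discrete-time autoregressive effect over an interval dt is
# exp(drift * dt), so irregular measurement intervals pose no problem.
drift = -0.5

def ar_effect(dt):
    return math.exp(drift * dt)

for dt in (0.5, 1.0, 2.0):
    print(dt, round(ar_effect(dt), 3))

# A discrete-time model fitted at dt = 1 cannot be reused at dt = 2 by
# keeping the same coefficient; the continuous model squares it instead.
assert math.isclose(ar_effect(2.0), ar_effect(1.0) ** 2)
```

In the multivariate case the scalar exponential becomes a matrix exponential of the drift matrix, which is what allows ctsem to estimate cross-effects between latent processes under arbitrary measurement schedules.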

  9. Calculation of chemical equilibrium between aqueous solution and minerals: the EQ3/6 software package. [In FORTRAN extended 4. 6 for CDC6600 and 7600

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.

    1979-02-01

    The newly developed EQ3/6 software package computes equilibrium models of aqueous geochemical systems. The package contains two principal programs: EQ3 performs distribution-of-species calculations for natural water compositions; EQ6 uses the results of EQ3 to predict the consequences of heating and cooling aqueous solutions and of irreversible reaction in rock--water systems. The programs are valuable for studying such phenomena as the formation of ore bodies, scaling and plugging in geothermal development, and the long-term disposal of nuclear waste. EQ3 and EQ6 are compared with such well-known geochemical codes as SOLMNEQ, WATEQ, REDEQL, MINEQL, and PATHI. The data base allows calculations in the temperature interval 0 to 350 °C, at either 1 atm-steam saturation pressures or a constant 500 bars. The activity coefficient approximations for aqueous solutes limit modeling to solutions of ionic strength less than about one molal. The mathematical derivations and numerical techniques used in EQ6 are presented in detail. The program uses the Newton--Raphson method to solve the governing equations of chemical equilibrium for a system of specified elemental composition at fixed temperature and pressure. Convergence is aided by optimizing starting estimates and by under-relaxation techniques. The minerals present in the stable phase assemblage are found by several empirical methods. Reaction path models may be generated by using this approach in conjunction with finite differences. This method is analogous to applying high-order predictor--corrector methods to integrate a corresponding set of ordinary differential equations, but avoids propagation of error (drift). 8 figures, 9 tables.
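The Newton--Raphson iteration at the heart of EQ6 can be sketched for a single mass-action equation; the real code solves full multicomponent systems, and the textbook acid-dissociation example below is only an illustration of the numerical method:

```python
import math

# Newton-Raphson solution of one equilibrium: dissociation of a weak acid,
# HA <-> H+ + A-, with Ka = [H+][A-]/[HA].
KA = 1.8e-5   # equilibrium constant (acetic acid at 25 degC)
C = 0.1       # total molality of acid

def f(x):
    """Residual of the mass-action equation, with x = [H+] = [A-]."""
    return x * x / (C - x) - KA

def fprime(x):
    return (2 * x * (C - x) + x * x) / (C - x) ** 2

x = 1e-4  # starting estimate; EQ6 optimizes these to aid convergence
for _ in range(50):
    step = f(x) / fprime(x)
    x -= step
    if abs(step) < 1e-15:
        break

print(round(-math.log10(x), 2))  # pH of the 0.1 molal solution
```

The convergence aids mentioned in the abstract (optimized starting estimates, under-relaxation) matter precisely because, in the multicomponent case, a poor starting point can send a raw Newton step outside the physically meaningful region.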

  10. Models and metrics for software management and engineering

    Science.gov (United States)

    Basili, V. R.

    1988-01-01

    This paper attempts to characterize and present a state-of-the-art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resource allocation and estimation, changes and errors, size, complexity and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.

  11. Model Averaging Software for Dichotomous Dose Response Risk Estimation

    Directory of Open Access Journals (Sweden)

    Matthew W. Wheeler

    2008-02-01

    Full Text Available Model averaging has been shown to be a useful method for incorporating model uncertainty in quantitative risk estimation. In certain circumstances this technique is computationally complex, requiring sophisticated software to carry out the computation. We introduce software that implements model averaging for risk assessment based upon dichotomous dose-response data. This software, which we call Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD), fits the quantal response models that are also used in the US Environmental Protection Agency benchmark dose software suite, and generates a model-averaged dose-response model from which benchmark dose and benchmark dose lower bound estimates are obtained. The software fulfills a need for risk assessors, allowing them to go beyond a single model in risk assessments based on quantal data by focusing on a set of models that describes the experimental data.
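The averaging step itself is simple once the candidate models are fitted; a common scheme (used here as an illustration, with entirely hypothetical AIC values and predicted risks) weights each model's prediction by its Akaike weight:

```python
import math

# Akaike-weight averaging of two hypothetical fitted dose-response models.
# AIC values and predicted extra risks at a given dose are illustrative.
models = {
    "log-logistic": {"aic": 102.3, "risk": 0.081},
    "log-probit":   {"aic": 104.1, "risk": 0.064},
}

min_aic = min(m["aic"] for m in models.values())
raw = {k: math.exp(-0.5 * (m["aic"] - min_aic)) for k, m in models.items()}
total = sum(raw.values())
weights = {k: v / total for k, v in raw.items()}

averaged_risk = sum(weights[k] * models[k]["risk"] for k in models)
print({k: round(w, 3) for k, w in weights.items()})
print(round(averaged_risk, 4))
```

The averaged risk curve, rather than any single model's curve, is then inverted to obtain the benchmark dose, which is how averaging propagates model uncertainty into the final estimate.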

  12. SPOTting model parameters using a ready-made Python package

    Science.gov (United States)

    Houska, Tobias; Kraft, Philipp; Breuer, Lutz

    2015-04-01

    The selection and parameterization of reliable process descriptions in ecological modelling is driven by several uncertainties. The procedure is highly dependent on various criteria, like the algorithm used, the likelihood function selected and the definition of the prior parameter distributions. A wide variety of tools have been developed in the past decades to optimize parameters. Some of the tools are closed source. Due to this, the choice of a specific parameter estimation method is sometimes more dependent on its availability than on its performance. A toolbox with a large set of methods can support users in deciding about the most suitable method. Further, it enables users to test and compare different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes along with a selected set of algorithms for parameter optimization and uncertainty analyses (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (Bias, (log-) Nash-Sutcliffe model efficiency, Correlation Coefficient, Coefficient of Determination, Covariance, (Decomposed-, Relative-, Root-) Mean Squared Error, Mean Absolute Error, Agreement Index) and prior distributions (Binomial, Chi-Square, Dirichlet, Exponential, Laplace, (log-, multivariate-) Normal, Pareto, Poisson, Cauchy, Uniform, Weibull) to sample from. The model-independent structure makes it suitable for analyzing a wide range of applications. We apply all algorithms of the SPOT package in three different case studies. Firstly, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Secondly, we study the Griewank function, which has a challenging response surface for
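The simplest of the listed algorithms, plain Monte Carlo sampling, can be sketched on the Rosenbrock test function mentioned in the first case study (this is a toy stand-in, not SPOT's own interface):

```python
import random

random.seed(42)

def rosenbrock(x, y):
    """Classic banana-shaped test function; global minimum 0 at (1, 1)."""
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

# Plain Monte Carlo sampling of the parameter space: draw uniformly from
# the prior bounds and keep the best objective value seen.
best = min(
    ((random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(100000)),
    key=lambda p: rosenbrock(*p),
)
print(best, rosenbrock(*best))
```

The narrow curved valley of the Rosenbrock function is exactly why blind sampling is inefficient here and why gradient-free but guided schemes such as SCE-UA or MCMC, also wrapped by SPOT, converge far faster.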

  13. Application of the Finite Elemental Analysis to Modeling Temperature Change of the Vaccine in an Insulated Packaging Container during Transport.

    Science.gov (United States)

    Ge, Changfeng; Cheng, Yujie; Shen, Yan

    2013-01-01

    This study demonstrated an attempt to predict the temperature of a perishable product such as vaccine inside an insulated packaging container during transport through finite element analysis (FEA) modeling. In order to use standard FEA software for simulation, an equivalent heat conduction coefficient is proposed and calculated to describe the heat transfer of the air trapped inside the insulated packaging container. The three-dimensional insulated packaging container is regarded as a combination of six panels, and the heat flow at each side panel is a one-dimensional diffusion process. Transient thermal analysis was applied to simulate the heat transition process from the ambient environment to the inside of the container. Field measurements were carried out to collect the temperature during transport, and the collected data were compared to the FEA simulation results. Insulated packaging containers are used to transport temperature-sensitive products such as vaccine and other pharmaceutical products. The container is usually made of an extruded polystyrene foam filled with gel packs. World Health Organization guidelines recommend that all vaccines except oral polio vaccine be distributed in an environment where the temperature ranges between +2 and +8 °C. The primary areas of concern in designing the packaging for vaccine are how much foam thickness and how many gel packs should be used in order to keep the temperature in the desired range, and how to prevent the vaccine from exposure to freezing temperatures. This study uses numerical simulation to predict temperature change within an insulated packaging container in the vaccine cold chain. It is our hope that this simulation will provide the vaccine industry with an alternative engineering tool to validate vaccine packaging and project thermal equilibrium within the insulated packaging container.
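The per-panel, one-dimensional diffusion treatment described above can be sketched with an explicit finite-difference scheme; the material values and fixed-temperature boundary conditions below are simplifying assumptions, not the paper's parameters:

```python
# One-dimensional explicit finite-difference model of heat diffusing
# through a single insulating panel. Values are roughly polystyrene foam.
alpha = 1.0e-6        # thermal diffusivity, m^2/s (assumed)
L = 0.05              # panel thickness, m
n = 21                # grid nodes
dx = L / (n - 1)
dt = 0.4 * dx * dx / alpha   # stable explicit step (r = 0.4 <= 0.5)

T = [5.0] * n         # panel initially at 5 degC (cold-chain target)
T[0] = 35.0           # outer face exposed to summer ambient
T[-1] = 5.0           # inner face held by the gel packs

r = alpha * dt / (dx * dx)
for _ in range(2000):
    inner = [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
             for i in range(1, n - 1)]
    T = [T[0]] + inner + [T[-1]]

mid = T[n // 2]
print(round(mid, 1))  # midpoint approaches the steady-state 20.0 degC
```

A real analysis replaces the fixed inner-face temperature with the gel packs' latent-heat behaviour, which is what eventually lets the interior warm past +8 °C and determines the shipper's hold time.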

  14. Composing, Analyzing and Validating Software Models

    Science.gov (United States)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda items that were carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  15. OUTSHORE Maturity Model: Assistance for Software Offshore Outsourcing Decisions

    Science.gov (United States)

    Mäkiö, Juho; Betz, Stafanie; Oberweis, Andreas

    Offshore outsourcing software development (OOSD) is increasingly being used by the software industry. OOSD is a specific variant of geographically distributed software development (GDSD). Compared to the traditional, in-house mode of software development, GDSD is riskier and jeopardizes the attainment of the expected results. Although the failure of an offshore outsourcing software project may be caused by a variety of factors, one major complication is geographical distance. Consequently, we argue that risk avoidance in outshore software development should be undertaken well in advance of the development launch. This could be done by testing the offshore outsourcing relevance of each software project and then of the offshore outsourcing company involved. With this in mind we have developed the OUTSHORE Maturity Model (OMM).

  16. Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models

    Science.gov (United States)

    Chu, A.

    2014-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of the ETAS models described in Ogata (1998). To find the Maximum-Likelihood Estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and of the temporal and spatial parameters that govern triggering effects, by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Although other computer programs exist for similar data modeling purposes, using the EM algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial shapes that are very long and narrow cause difficulties in optimization convergence, and flat or multi-modal log-likelihood functions raise similar issues. My program uses a robust method that presets a parameter to overcome this non-convergence issue. In addition to model fitting, the software is equipped with useful tools for examining model fitting results, for example visualization of the estimated conditional intensity and estimation of the expected number of triggered aftershocks. A simulation generator is also provided, with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has the potential to be hosted online. Java is used for the software's core computing part, and an optional interface to the statistical package R is provided.
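The quantity being fitted, the conditional intensity, is easy to write down in the purely temporal case; the sketch below uses a constant background rate plus modified-Omori triggering, with entirely illustrative parameter values (the software described above additionally handles the spatial terms):

```python
# Temporal conditional intensity of a simplified ETAS-style model:
# lambda(t) = mu + sum over past events of K / (t - t_i + c)**p,
# i.e., background seismicity plus Omori-law aftershock triggering.
# Parameter values are illustrative, not fitted.
MU, K, C, P = 0.02, 0.05, 0.01, 1.1

def intensity(t, past_events):
    triggered = sum(K / (t - ti + C) ** P for ti in past_events if ti < t)
    return MU + triggered

events = [1.0, 1.2, 5.0]  # occurrence times, in days
print(intensity(1.21, events))  # just after the 1.2-day event: elevated
print(intensity(4.0, events))   # quiescent period: near background
```

The EM approach of Veen and Schoenberg treats which-event-triggered-which as missing data, which is what makes maximizing the likelihood of this intensity function numerically stable.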

  17. Technical Review Report for the Model 9978-96 Package Safety Analysis Report for Packaging (S-SARP-G-00002, Revision 1, March 2009)

    Energy Technology Data Exchange (ETDEWEB)

    West, M

    2009-03-06

    This Technical Review Report (TRR) documents the review, performed by Lawrence Livermore National Laboratory (LLNL) Staff, at the request of the Department of Energy (DOE), on the 'Safety Analysis Report for Packaging (SARP), Model 9978 B(M)F-96', Revision 1, March 2009 (S-SARP-G-00002). The Model 9978 Package complies with 10 CFR 71, and with 'Regulations for the Safe Transport of Radioactive Material-1996 Edition (As Amended, 2000)-Safety Requirements', International Atomic Energy Agency (IAEA) Safety Standards Series No. TS-R-1. The Model 9978 Packaging is designed, analyzed, fabricated, and tested in accordance with Section III of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code (ASME B&PVC). The review presented in this TRR was performed using the methods outlined in Revision 3 of the DOE's 'Packaging Review Guide (PRG) for Reviewing Safety Analysis Reports for Packages'. The format of the SARP follows that specified in Revision 2 of the Nuclear Regulatory Commission's Regulatory Guide 7.9, i.e., 'Standard Format and Content of Part 71 Applications for Approval of Packages for Radioactive Material'. Although the two documents are similar in their content, they are not identical. Formatting differences have been noted in this TRR, where appropriate. The Model 9978 Packaging is a single containment package, using a 5-inch containment vessel (5CV). It uses a nominal 35-gallon drum package design. In comparison, the Model 9977 Packaging uses a 6-inch containment vessel (6CV). The Model 9977 and Model 9978 Packagings were developed concurrently, and they were referred to as the General Purpose Fissile Material Package, Version 1 (GPFP). Both packagings use General Plastics FR-3716 polyurethane foam as insulation and as impact limiters. The 5CV is used as the Primary Containment Vessel (PCV) in the Model 9975-96 Packaging. The Model 9975-96 Packaging also has the 6CV as its Secondary

  18. STXMPy: a new software package for automated region of interest selection and statistical analysis of XANES data

    Directory of Open Access Journals (Sweden)

    Grunze Michael

    2010-06-01

    Full Text Available Abstract Background Soft X-ray spectromicroscopy-based absorption near-edge structure analysis is a spectroscopic technique useful for investigating sample composition at nanoscale resolution. While the technique holds great promise for the analysis of biological samples, current methodologies are challenged by a lack of automatic analysis software, e.g., for the selection of regions of interest and statistical comparisons of sample variability. Results We have implemented a set of functions and scripts in Python to provide semiautomatic treatment of data obtained using scanning transmission X-ray microscopy. The toolkit includes a novel line-by-line absorption conversion and data filtering that automatically identifies image components with significant absorption. Results are provided to the user by direct graphical output to the screen and by output images and data files, including the average and standard deviation of the X-ray absorption spectrum. The application of STXMPy to the analysis of biological tissues is illustrated using isolated mouse melanosomes as a sample. Conclusion The STXMPy package allows both interactive and automated batch processing of scanning transmission X-ray microscopy data. It is open source, cross-platform, and offers rapid script development using the interpreted Python language.

  19. An algebraic approach to modeling in software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Loegel, G.J. [Superconducting Super Collider Lab., Dallas, TX (United States)]|[Michigan Univ., Ann Arbor, MI (United States); Ravishankar, C.V. [Michigan Univ., Ann Arbor, MI (United States)

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors are often caused because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  20. Use of advanced modeling techniques to optimize thermal packaging designs.

    Science.gov (United States)

    Formato, Richard M; Potami, Raffaele; Ahmed, Iftekhar

    2010-01-01

    Through a detailed case study the authors demonstrate, for the first time, the capability of using advanced modeling techniques to correctly simulate the transient temperature response of a convective flow-based thermal shipper design. The objective of this case study was to demonstrate that simulation could be utilized to design a 2-inch-wall polyurethane (PUR) shipper to hold its product box temperature between 2 and 8 °C over the prescribed 96-h summer profile (product box is the portion of the shipper that is occupied by the payload). Results obtained from numerical simulation are in excellent agreement with empirical chamber data (within ±1 °C at all times), and geometrical locations of simulation maximum and minimum temperature match well with the corresponding chamber temperature measurements. Furthermore, a control simulation test case was run (results taken from identical product box locations) to compare the coupled conduction-convection model with a conduction-only model, which to date has been the state-of-the-art method. For the conduction-only simulation, all fluid elements were replaced with "solid" elements of identical size and assigned thermal properties of air. While results from the coupled thermal/fluid model closely correlated with the empirical data (±1 °C), the conduction-only model was unable to correctly capture the payload temperature trends, showing a sizeable error compared to empirical values (ΔT > 6 °C). A modeling technique capable of correctly capturing the thermal behavior of passively refrigerated shippers can be used to quickly evaluate and optimize new packaging designs. Such a capability provides a means to reduce the cost and required design time of shippers while simultaneously improving their performance. Another advantage comes from using thermal modeling (assuming a validated model is available) to predict the temperature distribution in a shipper that is exposed to ambient temperatures which were not bracketed

  1. glmulti: An R Package for Easy Automated Model Selection with (Generalized) Linear Models

    Directory of Open Access Journals (Sweden)

    Vincent Calcagno

    2010-10-01

    Full Text Available We introduce glmulti, an R package for automated model selection and multi-model inference with glm and related functions. From a list of explanatory variables, the provided function glmulti builds all possible unique models involving these variables and, optionally, their pairwise interactions. Restrictions can be specified for candidate models, by excluding specific terms, enforcing marginality, or controlling model complexity. Models are fitted with standard R functions like glm. The n best models and their support (e.g., (Q)AIC, (Q)AICc, or BIC) are returned, allowing model selection and multi-model inference through standard R functions. The package is optimized for large candidate sets by avoiding memory limitations, facilitating parallelization and providing, in addition to exhaustive screening, a compiled genetic algorithm method. This article briefly presents the statistical framework and introduces the package, with applications to simulated and real data.
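glmulti is an R package; the exhaustive-screening idea it implements, enumerating candidate models and ranking them by an information criterion, can be sketched in a few lines for ordinary least squares (a toy analogue, not the package's code):

```python
import math, random
from itertools import combinations

random.seed(0)

# Simulated data: y depends on x1 but not on x2.
n = 200
x1 = [random.uniform(0, 1) for _ in range(n)]
x2 = [random.uniform(0, 1) for _ in range(n)]
y = [2.0 + 3.0 * a + random.gauss(0, 0.5) for a in x1]

def ols_rss(cols):
    """Least squares with intercept via normal equations (Gauss-Jordan)."""
    X = [[1.0] + [c[i] for c in cols] for i in range(n)]
    p = len(X[0])
    # Augmented system X'X b = X'y.
    M = [[sum(r[i] * r[j] for r in X) for j in range(p)]
         + [sum(X[t][i] * y[t] for t in range(n))] for i in range(p)]
    for i in range(p):
        piv = M[i][i]
        M[i] = [v / piv for v in M[i]]
        for j in range(p):
            if j != i:
                M[j] = [a - M[j][i] * b for a, b in zip(M[j], M[i])]
    beta = [row[-1] for row in M]
    fit = [sum(b * v for b, v in zip(beta, row)) for row in X]
    return sum((yi - fi) ** 2 for yi, fi in zip(y, fit))

# Enumerate every subset of candidate predictors and score it by AIC.
candidates = {"x1": x1, "x2": x2}
results = {}
for k in range(len(candidates) + 1):
    for names in combinations(sorted(candidates), k):
        rss = ols_rss([candidates[m] for m in names])
        aic = n * math.log(rss / n) + 2 * (k + 2)  # + intercept, sigma
        results[names or ("intercept only",)] = aic

best = min(results, key=results.get)
print(best)
```

With two candidate variables there are only four models; glmulti's genetic algorithm exists because this count doubles with every added variable and exhaustive screening quickly becomes infeasible.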

  2. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the

  4. The extraction of physical quantities of the processor using the LabVIEW software package

    Directory of Open Access Journals (Sweden)

    Stefan Koprda

    2014-05-01

    Full Text Available The article presents the issues of modelling and simulation in the graphic environment LabVIEW from the firm National Instruments. The ability to simulate real processes offers many advantages to designers from various fields, such as time savings and cost minimization. The paper describes the creation of a block diagram in the LabVIEW environment that displays the temperature of the processor; to make this possible and usable on a wide range of computer equipment, an appropriate readout method had to be found.

  5. A flexible open-source toolbox for robust end-member modelling analysis - The R-package EMMAgeo

    Science.gov (United States)

    Dietze, Michael; Dietze, Elisabeth

    2013-04-01

    Interpreting geomorphological and sedimentological processes from grain-size data in environmental archives typically runs into problems when source- and process-related grain-size distributions become mixed during deposition. A powerful approach to overcome this ambiguity is to statistically "unmix" the samples. Typical algorithms use eigenspace decomposition and techniques of dimension reduction. This contribution presents a package for the free statistical software R. Some of the great advantages of R and R-packages are the open code structure, flexibility and low programming effort. The package contains a series of flexible, ready-to-use functions to perform different tasks of data testing, preparation, modelling and visualisation. The package originated from a recently presented Matlab-based end-member modelling algorithm (Dietze et al., 2012, SedGeol). It supports simple modelling of grain-size end-member loadings and scores (eigenspace extraction, factor rotation, data scaling, non-negative least squares solving) along with several measures of model quality. The package further provides preprocessing tools (e.g. grain-size scale conversions, tests of data structure, weight factor limit inference, determination of the minimum, optimum and maximum number of meaningful end-members) and allows modelling of data sets with artificial or user-defined end-member loadings. EMMAgeo also supports uncertainty estimation from a series of plausible model runs and the determination of robust end-members. The contribution presents important package functions, thereby illustrating how large data sets of artificial and natural grain-size samples from different depositional environments can be analysed to infer quantified process-related proxies.
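
    The unmixing step at the heart of such end-member modelling — estimating non-negative scores given end-member loadings — can be illustrated with SciPy's non-negative least squares. The two grain-size "loadings" and the synthetic mixture below are invented for illustration; they are not EMMAgeo data or code.

```python
import numpy as np
from scipy.optimize import nnls

# hypothetical end-member loadings: two grain-size distributions over 10 size classes
em1 = np.array([0.30, 0.25, 0.20, 0.10, 0.05, 0.04, 0.03, 0.02, 0.005, 0.005])
em2 = np.array([0.005, 0.005, 0.02, 0.03, 0.05, 0.10, 0.20, 0.25, 0.20, 0.14])
loadings = np.column_stack([em1, em2])

def unmix(sample, loadings):
    """Non-negative least squares: estimate end-member scores for one sample,
    normalised to proportions."""
    scores, residual = nnls(loadings, sample)
    return scores / scores.sum(), residual

sample = 0.7 * em1 + 0.3 * em2       # synthetic 70/30 mixture of the two end-members
scores, resid = unmix(sample, loadings)
```

    For this exact synthetic mixture the recovered scores are 0.7 and 0.3 with a vanishing residual; real samples unmix only approximately, which is why EMMAgeo reports model-quality measures alongside the scores.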

  6. A Software Service Framework Model Based on Agent

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents an agent-based software service framework model called ASF, and defines the basic concepts and structure of the ASF model. It also describes the management and process mechanisms in the ASF model.

  7. Modeling Virtual Meetings within Software Engineering Environment

    Directory of Open Access Journals (Sweden)

    Dr. Aiman Turani

    2014-04-01

    Full Text Available It is a common scenario to see a project's stakeholders, such as managers, team leaders, and developers, carrying out their meetings in an online environment without suitable preparation and facilitation. For instance, stakeholders engaging in negotiation sessions to communicate system requirements in a virtual environment might face requirements misunderstandings, which in turn might cause a whole project to fail. Usually a meeting agenda and design exist only implicitly in the facilitator's head. Conducting such meetings without an explicit structure can lead to familiar problems: no one seems to be in charge, there is no clear reason to meet, there is no agenda, and so on. In this paper, we present a general framework to model group-based activities and meetings within the software engineering field in a simplified and formal manner. Traditionally, managers submit their web-based group meeting information in the form of text-based instructions [1]. A group facilitator or chairperson then leads the group through the meeting to achieve the desired objectives. Such meetings are relatively easy to manage in a face-to-face environment; web-based meetings, on the other hand, are more challenging to facilitate and manage. Therefore, more and more specialized tools are emerging to manage and facilitate such meetings. For instance, Adobe Connect [2] is a tool for facilitating web-based meetings. These tools usually allow facilitators to organize and prepare the meeting floor by inserting specific collaboration components such as chat, whiteboard, voting, etc. During the meeting the facilitator then guides participants using the video or text component. This often leads to undesirable outputs due to the lack of a clear structure or agenda, in addition to the virtual distance that weakens communication. In this paper a two-level modelling view is proposed: the static view and the dynamic view. The static view mainly

  8. PtProcess: An R Package for Modelling Marked Point Processes Indexed by Time

    Directory of Open Access Journals (Sweden)

    David Harte

    2010-10-01

    Full Text Available This paper describes the package PtProcess, which uses the R statistical language. The package provides a unified approach to fitting and simulating a wide variety of temporal point process or temporal marked point process models. The models are specified by an intensity function which is conditional on the history of the process. The user needs to provide routines for calculating the conditional intensity function. The package then enables one to carry out maximum likelihood fitting, goodness-of-fit testing, simulation and comparison of models. The package includes the routines for the conditional intensity functions of a variety of standard point process models. The package is intended to simplify the fitting of point process models indexed by time in much the same way as generalized linear model programs have simplified the fitting of various linear models. The primary examples used in this paper are earthquake sequences, but the package is intended to have a much wider applicability.
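
    Simulating a process specified by an intensity function, as PtProcess does, is classically done with Ogata's thinning algorithm. The sketch below handles the simplest case, an inhomogeneous Poisson intensity with a known upper bound; the sinusoidal rate is an illustrative assumption (self-exciting models such as ETAS would additionally condition the intensity on past events).

```python
import numpy as np

def simulate_inhomogeneous_poisson(intensity, t_max, lam_max, rng):
    """Lewis/Ogata thinning: propose candidate events at the constant rate
    lam_max, keep each with probability intensity(t) / lam_max."""
    times = []
    t = 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_max:
            break
        if rng.uniform() < intensity(t) / lam_max:
            times.append(t)
    return np.array(times)

rng = np.random.default_rng(42)
lam = lambda t: 5.0 + 4.0 * np.sin(t)    # bounded above by 9 events per unit time
events = simulate_inhomogeneous_poisson(lam, 100.0, 9.0, rng)
```

    The expected number of events over [0, 100] is the integral of the intensity, about 500 here; thinning reproduces that while keeping event times concentrated where the rate is high.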

  9. Software Architecture for Modeling and Simulation of Underwater Acoustic Information Systems

    Institute of Scientific and Technical Information of China (English)

    WANG Xi-min; CAI Zhi-ming

    2009-01-01

    The simulation of underwater acoustic information flow is an important way to research sonar performance and engagement effectiveness in the ocean environment. This paper analyzes the significance of modeling an open and sophisticated simulation software architecture by the object-oriented method, and introduces the modeling processes and expression method of the simulation architecture. According to the requirements of the simulation system and the underwater acoustic information flow, the logical architecture of the simulation software system is modeled by the object-oriented method. A use-case view captures the system requirements. The logical view shows the logical architecture of the software system. The simulation software is decomposed into loosely coupled constituent parts by layering and partitioning the packages for maintainability. Design patterns give the simulation software good expansibility and reusability. A simulation system involving multiple targets and multiple sonars was developed based on the architecture model. Practice shows that the model meets the needs of simulating an open and sophisticated system.

  10. Fullrmc, a rigid body Reverse Monte Carlo modeling package enabled with machine learning and artificial intelligence.

    Science.gov (United States)

    Aoun, Bachir

    2016-05-01

    A new Reverse Monte Carlo (RMC) package "fullrmc" for atomic or rigid body and molecular, amorphous, or crystalline materials is presented. fullrmc's main purpose is to provide fully modular, fast and flexible software, thoroughly documented, complex-molecule enabled, written in a modern programming language (Python, Cython, C and C++ where performance is needed) and complying with modern programming practices. fullrmc's approach to solving an atomic or molecular structure is different from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves endorsed with reinforcement machine learning to groups of atoms. While fullrmc allows running traditional RMC modeling, the uniqueness of this approach resides in its ability to customize the grouping of atoms in any convenient way with no additional programming effort, and to apply smart and more physically meaningful moves to the defined groups of atoms. In addition, fullrmc provides a unique way, with almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring through and beyond disallowed positions and energy barriers in the unrestricted three-dimensional space around a group.
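
    The traditional RMC loop that fullrmc generalizes — perturb atoms at random, accept moves that improve agreement with the experimental data, occasionally accept worse ones to escape local minima — can be sketched on a 1D toy problem. The box size, tolerance and histogram target below are all illustrative assumptions, not fullrmc's algorithm or API.

```python
import numpy as np

def pair_histogram(positions, bins):
    """Normalised histogram of all pairwise distances (a stand-in for g(r))."""
    d = np.abs(positions[:, None] - positions[None, :])
    d = d[np.triu_indices(len(positions), k=1)]
    hist, _ = np.histogram(d, bins=bins)
    return hist / hist.sum()

def rmc_fit(target, bins, n_atoms, n_steps, rng, box=10.0, sigma=0.1, tol=1e-4):
    """Classic RMC: single-atom random moves, accepted when they reduce the
    chi-square misfit to the target histogram; worse moves pass with a small
    Metropolis-like probability so the system can leave local minima."""
    pos = rng.uniform(0.0, box, n_atoms)
    chi2 = np.sum((pair_histogram(pos, bins) - target) ** 2)
    chi2_init = chi2
    for _ in range(n_steps):
        i = rng.integers(n_atoms)
        trial = pos.copy()
        trial[i] = (trial[i] + rng.normal(scale=sigma)) % box
        trial_chi2 = np.sum((pair_histogram(trial, bins) - target) ** 2)
        if trial_chi2 < chi2 or rng.uniform() < np.exp((chi2 - trial_chi2) / tol):
            pos, chi2 = trial, trial_chi2
    return pos, chi2_init, chi2

rng = np.random.default_rng(1)
bins = np.linspace(0.0, 10.0, 21)
reference = rng.uniform(0.0, 10.0, 30)   # plays the role of the "experimental" structure
target = pair_histogram(reference, bins)
pos, chi2_init, chi2_final = rmc_fit(target, bins, n_atoms=30, n_steps=2000, rng=rng)
```

    fullrmc's contribution is replacing the blind single-atom move above with customizable group moves (translations, rotations of rigid molecules) selected by reinforcement learning.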

  11. The Computer-Aided Analytic Process Model. Operations Handbook for the Analytic Process Model Demonstration Package

    Science.gov (United States)

    1986-01-01

    Research Note 86-06, "The Computer-Aided Analytic Process Model: Operations Handbook for the Analytic Process Model Demonstration Package," Ronald G... Keywords: Analytic Process Model; Operations Handbook; Tutorial; Apple; Systems Taxonomy Model; Training System; Bradley Infantry Fighting Vehicle; BIFV. (Item 20, Abstract, continued in the companion volume "The Analytic Process Model for")

  12. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
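
    As a concrete baseline for NHPP-based SRMs, the classic Goel-Okumoto model has mean value function m(t) = a(1 - e^(-bt)), where a is the total fault content and b the per-fault detection rate. The sketch below simulates fault-count data and recovers the parameters by least squares; the parameter values are assumptions, and this is the textbook model rather than the equilibrium-distribution variant the paper proposes.

```python
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    """Goel-Okumoto NHPP mean value function: expected cumulative faults by time t."""
    return a * (1.0 - np.exp(-b * t))

# simulate weekly cumulative fault counts from assumed parameters
true_a, true_b = 120.0, 0.05            # total fault content and detection rate (assumptions)
t = np.arange(1.0, 41.0)                # 40 observation times
rng = np.random.default_rng(7)
increments = rng.poisson(np.diff(mean_value(np.concatenate(([0.0], t)), true_a, true_b)))
cumulative = np.cumsum(increments).astype(float)

(a_hat, b_hat), _ = curve_fit(mean_value, t, cumulative, p0=(100.0, 0.1))
remaining = a_hat - cumulative[-1]      # estimated faults still latent in the software
```

    The fitted asymptote a_hat directly yields the estimated number of remaining faults, which is what drives release-timing decisions in this family of models.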

  13. Distributed Software Development Modelling and Control Framework

    Directory of Open Access Journals (Sweden)

    Yi Feng

    2012-10-01

    Full Text Available With the rapid progress of internet technology, more and more software projects adopt e-development to facilitate the software development process in a world-wide context. However, distributed software development activity itself is a complex orchestration. It involves many people working together without the barrier of time and space difference. Therefore, how to efficiently monitor and control software e-development in a global perspective becomes an important issue for any internet-based software development project. In this paper, we present a novel approach to tackle this crucial issue by means of controlling the e-development process, collaborative task progress and communication quality. Meanwhile, we also present our e-development supporting environment prototype, Caribou, to demonstrate the viability of our approach.

  14. IMPROVED SOFTWARE QUALITY ASSURANCE TECHNIQUES USING SAFE GROWTH MODEL

    Directory of Open Access Journals (Sweden)

    M.Sangeetha

    2010-09-01

    Full Text Available Our lives are governed by large, complex systems with increasingly complex software, and the safety, security, and reliability of these systems has become a major concern. As the software in today's systems grows larger, it has more defects, and these defects adversely affect the safety, security, and reliability of the systems. Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. Software quality divides into two pieces: internal and external quality characteristics. External quality characteristics are those parts of a product that face its users, whereas internal quality characteristics are those that do not. Quality is conformance to product requirements and should be free. This research concerns the role of software quality. Software reliability is an important facet of software quality: it is the probability of failure-free operation of a computer program in a specified environment for a specified time. In software reliability modeling, the parameters of the model are typically estimated from the test data of the corresponding component. However, the widely used point estimators are subject to random variations in the data, resulting in uncertainties in these estimated parameters. This research describes a new approach to the problem of software testing. The approach is based on Bayesian graphical models and presents formal mechanisms for the logical structuring of the software testing problem, the probabilistic and statistical treatment of the uncertainties to be addressed, the test design and analysis process, and the incorporation and implication of test results. Once constructed, the models produced are dynamic representations of the software testing problem. It explains why the common test-and-fix software quality strategy is no longer adequate, and characterizes the properties of an adequate quality strategy.

  15. Open Source Software Reliability Growth Model by Considering Change- Point

    Directory of Open Access Journals (Sweden)

    Mashaallah Basirzadeh

    2012-01-01

    Full Text Available Software reliability growth modeling has matured considerably. Software reliability growth models have been used extensively for closed source software. The design and development of open source software (OSS) differ from those of closed source software. We observed some basic characteristics of open source software: (i) more instruction executions and code coverage take place over time, (ii) release early, release often, (iii) frequent addition of patches, (iv) heterogeneity in fault density and effort expenditure, (v) frequent release activities seem to have changed the bug dynamics significantly, and (vi) bug reporting on the bug tracking system drastically increases and decreases. For this reason, the bugs reported on the bug tracking system follow an irregular, fluctuating pattern. Therefore, the fault detection/removal process cannot be smooth and may change at some time point, called the change-point. In this paper, an instruction-execution-dependent software reliability growth model is developed that considers a change-point, in order to cater for a diverse and huge user profile, the irregular state of the bug tracking system, and heterogeneity in the fault distribution. We analyze actual software failure count data to show numerical examples of software reliability assessment for OSS. We also compare our model with conventional ones in terms of goodness-of-fit to actual data. We show that the proposed model can assist quality improvement for OSS systems developed under the open source project.

  16. Qualitative vs. quantitative software process simulation modelling: conversion and comparison

    OpenAIRE

    Zhang, He; Kitchenham, Barbara; Jeffery, Ross

    2009-01-01

    Software Process Simulation Modeling (SPSM) research has increased in the past two decades. However, most of these models are quantitative, which requires detailed understanding and accurate measurement. Continuing our previous studies on qualitative modeling of software processes, this paper aims to investigate the structural equivalence and model conversion between quantitative and qualitative process modeling, and to compare the characteristics and performance o...

  17. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0); Part 2

    Energy Technology Data Exchange (ETDEWEB)

    Daveler, S.A.; Wolery, T.J.

    1992-12-17

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25°C only to 0-300°C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fitted to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.
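
    The first transformation described — fitting interpolating polynomials to data tabulated on a temperature grid and storing the coefficients in place of the grid — is easy to illustrate. The grid and log K values below are invented stand-ins (shaped roughly like water-ionization data), and NumPy's least-squares polyfit stands in for EQPT's actual fitting routine.

```python
import numpy as np

# hypothetical log K values for one reaction, tabulated on a 0-300 °C grid
temps = np.array([0.0, 25.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
log_k = np.array([-14.94, -14.00, -13.02, -12.26, -11.64, -11.28, -11.17, -11.30])

coeffs = np.polyfit(temps, log_k, deg=4)   # store these five numbers instead of the grid
log_k_interp = np.polyval(coeffs, 25.0)    # evaluate anywhere inside the range
```

    The modeling codes can then evaluate log K at any temperature in range with a single polynomial evaluation instead of a table lookup and interpolation.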

  18. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    Science.gov (United States)

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics, and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using publicly available benchmark data sets.
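
    The count-data backbone beneath such models is plain Poisson regression, classically fitted by iteratively reweighted least squares (IRLS). The self-contained sketch below recovers known coefficients from simulated "software metric" features; the data and coefficients are assumptions for illustration, not the paper's max-margin Bayesian method.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Maximum-likelihood Poisson regression (log link) via IRLS:
    repeatedly solve a weighted least-squares problem with weights mu."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)              # current mean under the log link
        z = X @ beta + (y - mu) / mu       # working response
        WX = X * mu[:, None]               # weighted design matrix
        beta = np.linalg.solve(X.T @ WX, X.T @ (mu * z))
    return beta

rng = np.random.default_rng(11)
n = 500
metrics = rng.normal(size=(n, 2))          # standardized "software metrics"
X = np.column_stack([np.ones(n), metrics])
true_beta = np.array([0.5, 0.8, -0.4])     # assumed defect-count coefficients
y = rng.poisson(np.exp(X @ true_beta))     # simulated defect counts
beta_hat = poisson_irls(X, y)
```

    The Bayesian and max-margin extensions discussed in the abstract replace this point estimate with full posterior updates, which is what improves behavior when training data are scarce.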

  19. AGING PERFORMANCE OF MODEL 9975 PACKAGE FLUOROELASTOMER O-RINGS

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, E.; Daugherty, W.; Skidmore, E.; Dunn, K.; Fisher, D.

    2011-05-31

    The influence of temperature and radiation on Viton® GLT and GLT-S fluoroelastomer O-rings is an ongoing research focus at the Savannah River National Laboratory. The O-rings are credited for leaktight containment in the Model 9975 shipping package used for transportation of plutonium-bearing materials. At the Savannah River Site, the Model 9975 packages are being used for interim storage. Primary research efforts have focused on surveillance of O-rings from actual packages, leak testing of seals at bounding aging conditions and the effect of aging temperature on compression stress relaxation behavior, with the goal of service life prediction for long-term storage conditions. Recently, an additional effort to evaluate the effect of aging temperature on the oxidation of the materials has begun. Degradation in the mechanical properties of elastomers is directly related to the oxidation of the polymer. Sensitive measurements of the oxidation rate can be performed in a more timely manner than waiting for a measurable change in mechanical properties, especially at service temperatures. Measuring the oxidation rate therefore provides a means to validate the assumption that the degradation mechanism(s) do not change between the elevated temperatures used for accelerated aging and the lower service temperatures. Monitoring the amount of oxygen uptake by the material over time at various temperatures can provide increased confidence in lifetime predictions. Preliminary oxygen consumption analysis of a Viton GLT-based fluoroelastomer compound (Parker V0835-75) using an Oxzilla II differential oxygen analyzer in the temperature range of 40-120 °C was performed. Early data suggest oxygen consumption rates may level off within the first 100,000 hours (10-12 years) at 40 °C and that sharp changes in the degradation mechanism (stress relaxation) are not expected over the temperature range examined. This is consistent with the known long-term heat aging resistance of

  20. The R Package threg to Implement Threshold Regression Models

    Directory of Open Access Journals (Sweden)

    Tao Xiao

    2015-08-01

    This new package includes four functions: threg, and the methods hr, predict and plot for threg objects returned by threg. The threg function is the model-fitting function, used to calculate regression coefficient estimates, asymptotic standard errors and p values. The hr method for threg objects is the hazard-ratio calculation function, which provides estimates of hazard ratios at selected time points for specified scenarios (based on given categories or value settings of covariates). The predict method for threg objects is used for prediction. And the plot method for threg objects provides plots for curves of estimated hazard functions, survival functions and probability density functions of the first hitting time; function curves corresponding to different scenarios can be overlaid in the same plot for comparison to give additional research insights.

  1. Modeling TCP/IP software implementation performance and its application for software routers

    OpenAIRE

    Lepe Aldama, Oscar Iván

    2002-01-01

    Numerous works study or address the software implementation of the communication protocols for Internet access (TCP/IP). However, we are not aware of any that models the execution of this software in a general and precise way. This thesis contributes a detailed characterization of the execution of the software implementation of these protocols on a personal computer running a UNIX operating system. This characterization shows how the performance of the...

  2. A comparison of six software packages for evaluation of solid lung nodules using semi-automated volumetry: What is the minimum increase in size to detect growth in repeated CT examinations

    Energy Technology Data Exchange (ETDEWEB)

    Hoop, Bartjan de [University Medical Center, Department of Radiology, Utrecht (Netherlands); University Medical Center, Heidelberglaan 100, GA, Utrecht (Netherlands); Gietema, Hester; Prokop, Mathias [University Medical Center, Department of Radiology, Utrecht (Netherlands); Ginneken, Bram van [University Medical Center, Image Sciences Institute, Utrecht (Netherlands); Zanen, Pieter [University Medical Center, Department of Pulmonology, Utrecht (Netherlands); Groenewegen, Gerard [University Medical Center, Department of Oncology, Utrecht (Netherlands)

    2009-04-15

    We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In case of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five to six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages. (orig.)
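
    The variability measure used here (the upper limit of the 95% Bland-Altman interval for relative volume differences) can be reproduced on simulated repeat measurements. The nodule volumes and the 5% per-scan measurement noise below are assumptions, not the study's data.

```python
import numpy as np

def bland_altman_limits(v1, v2):
    """Relative-difference Bland-Altman analysis: mean bias and 95% limits of
    agreement, with differences expressed as a percentage of the pairwise mean."""
    mean_vol = (v1 + v2) / 2.0
    rel_diff = 100.0 * (v2 - v1) / mean_vol
    bias = rel_diff.mean()
    sd = rel_diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

rng = np.random.default_rng(3)
true_vol = rng.uniform(50.0, 5000.0, 200)              # mm^3, hypothetical nodules
scan1 = true_vol * rng.normal(1.0, 0.05, 200)          # 5% measurement noise per scan
scan2 = true_vol * rng.normal(1.0, 0.05, 200)          # zero true growth between scans
bias, lower, upper = bland_altman_limits(scan1, scan2)
```

    With 5% noise per scan the upper limit lands near 14%: any observed volume change below that limit is indistinguishable from measurement variability, which is exactly how the study defines the minimum detectable growth.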

  3. FLIMX: A Software Package to Determine and Analyze the Fluorescence Lifetime in Time-Resolved Fluorescence Data from the Human Eye

    Science.gov (United States)

    Klemm, Matthias; Schweitzer, Dietrich; Peters, Sven; Sauer, Lydia; Hammer, Martin; Haueisen, Jens

    2015-01-01

    Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique for measuring the in vivo autofluorescence intensity decays generated by endogenous fluorophores in the ocular fundus. Here, we present a software package called FLIM eXplorer (FLIMX) for analyzing FLIO data. Specifically, we introduce a new adaptive binning approach as an optimal tradeoff between the spatial resolution and the number of photons required per pixel. We also expand existing decay models (multi-exponential, stretched exponential, spectral global analysis, incomplete decay) to account for the layered structure of the eye and present a method to correct for the influence of the crystalline lens fluorescence on the retina fluorescence. Subsequently, the Holm-Bonferroni method is applied to FLIO measurements to allow for group comparisons between patients and controls on the basis of fluorescence lifetime parameters. The performance of the new approaches was evaluated in five experiments. Specifically, we evaluated static and adaptive binning in a diabetes mellitus patient, we compared the different decay models in a healthy volunteer and performed a group comparison between diabetes patients and controls. An overview of the visualization capabilities and a comparison of static and adaptive binning is shown for a patient with macular hole. FLIMX’s applicability to fluorescence lifetime imaging microscopy is shown in the ganglion cell layer of a porcine retina sample, obtained by a laser scanning microscope using two-photon excitation. PMID:26192624
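
    The multi-exponential decay model underlying such analyses can be illustrated with a two-component fit. The lifetimes, amplitudes and 12.5 ns acquisition window below are invented values, and SciPy's curve_fit stands in for FLIMX's fitting machinery (which additionally handles instrument response, layered models and adaptive binning).

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, tau1, a2, tau2):
    """Bi-exponential fluorescence decay model (photon counts vs. time)."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.linspace(0.0, 12.5, 256)                  # ns, a typical TCSPC window
rng = np.random.default_rng(5)
ideal = biexp(t, 800.0, 0.5, 200.0, 2.5)         # assumed amplitudes and lifetimes
photons = rng.poisson(ideal)                     # shot noise on the photon counts

params, _ = curve_fit(biexp, t, photons, p0=(500.0, 0.3, 100.0, 2.0))
a1, tau1, a2, tau2 = params
tau_mean = (a1 * tau1 + a2 * tau2) / (a1 + a2)   # amplitude-weighted mean lifetime
```

    The amplitude-weighted mean lifetime is a common summary parameter in FLIO group comparisons; for the assumed values above it is 0.9 ns.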

  4. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts...... of the system. In model-based software development, where design models are used to develop a software system, outcomes of many design decisions have big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent...

  5. The review of the modeling methods and numerical analysis software for nanotechnology in material science

    Directory of Open Access Journals (Sweden)

    SMIRNOV Vladimir Alexeevich

    2014-10-01

    Full Text Available Due to the high demand for building materials with a universal set of properties which extends their application area, research efforts are focusing on nanotechnology in material science. The rational combination of theoretical studies, mathematical modeling and simulation can favour reduced resource and time consumption when nanomodified materials are being developed. The development of a composite material is based on the principles of system analysis, which requires the determination of criteria and a subsequent classification of modeling methods. In this work the criteria of spatial scale, dominant type of interaction and heterogeneity are used for such classification. The presented classification became a framework for the analysis of methods and software which can be applied to the development of building materials. For each of the selected spatial levels - from the atomistic one to the macrostructural level of a constructional coarse-grained composite - existing theories, modeling algorithms and tools have been considered. At the level of macrostructure, which is formed under the influence of gravity and exterior forces, one can apply probabilistic and geometrical methods to study the obtained structure. The existing models are suitable for packing density analysis and the solution of percolation problems at the macroscopic level, but there are still no software tools which could be applied in nanotechnology to carry out systematic investigations. At the microstructure level it is possible to use the particle method along with probabilistic and statistical methods to explore structure formation, but available software tools are only partially suitable for numerical analysis of microstructure models. Therefore, modeling of the microstructure is rather complicated; the model has to include a potential of pairwise interaction. After the model has been constructed and the parameters of the pairwise potential have been determined, many software packages for solution of ordinary

  6. CT and MR perfusion can discriminate severe cerebral hypoperfusion from perfusion absence: evaluation of different commercial software packages by using digital phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Uwano, Ikuko; Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Advanced Medical Research Center, Morioka (Japan); Christensen, Soren [University of Melbourne, Royal Melbourne Hospital, Departments of Neurology and Radiology, Victoria (Australia); Oestergaard, Leif [Aarhus University Hospital, Department of Neuroradiology, Center for Functionally Integrative Neuroscience, DK, Aarhus C (Denmark); Ogasawara, Kuniaki; Ogawa, Akira [Iwate Medical University, Department of Neurosurgery, Morioka (Japan)

    2012-05-15

    Computed tomography perfusion (CTP) and magnetic resonance perfusion (MRP) are expected to be usable in ancillary tests of brain death through detection of the complete absence of cerebral perfusion; however, the detection limit for hypoperfusion has not been determined. Hence, we examined whether commercial software can visualize very low cerebral blood flow (CBF) and cerebral blood volume (CBV) by creating and using digital phantoms. Digital phantoms simulating 0-4% of normal CBF (60 mL/100 g/min) and CBV (4 mL/100 g) were analyzed using ten software packages from CT and MRI manufacturers. Region-of-interest measurements were performed to determine whether there was a significant difference between areas of 0% and areas of 1-4% of normal flow. The CTP software detected hypoperfusion down to 2-3% of normal in CBF and 2% in CBV, while the MRP software detected hypoperfusion down to 1-3% in CBF and 1-4% in CBV, although the lower limits varied among software packages. CTP and MRP can thus distinguish profound hypoperfusion of <5% of normal flow from perfusion absence (0%) in digital phantoms, suggesting their potential efficacy for assessing brain death. (orig.)
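
    A region-of-interest comparison of the kind described can be sketched as follows. This illustrative Python fragment simulates two ROIs, one at 0% and one at 2% of the normal CBF of 60 mL/100 g/min, and compares them with Welch's t statistic; the noise level and sample size are hypothetical, and the actual study used vendor software rather than this calculation:

```python
import random
import statistics

random.seed(0)
NORMAL_CBF = 60.0  # mL/100 g/min, normal reference value used in the study

def make_roi(percent_of_normal, n=100, noise_sd=0.1):
    """Simulate ROI voxel values at a given percentage of normal CBF."""
    mean = NORMAL_CBF * percent_of_normal / 100.0
    return [random.gauss(mean, noise_sd) for _ in range(n)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

roi_zero = make_roi(0)  # perfusion absence
roi_two = make_roi(2)   # 2% of normal flow
t = welch_t(roi_two, roi_zero)  # large t => the two regions are distinguishable
```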

  7. An evaluation of the psychometric properties of the Purdue Pharmacist Directive Guidance Scale using SPSS and R software packages.

    Science.gov (United States)

    Marr-Lyon, Lisa R; Gupchup, Gireesh V; Anderson, Joe R

    2012-01-01

    The Purdue Pharmacist Directive Guidance (PPDG) Scale was developed to assess patients' perceptions of the level of pharmacist-provided (1) instruction and (2) feedback and goal-setting, two aspects of pharmaceutical care. Calculations of its psychometric properties using SPSS and R were similar, but distinct differences were apparent. Using the SPSS and R software packages, researchers aimed to examine the construct validity of the PPDG using a higher-order factoring procedure; in tandem, McDonald's omega and Cronbach's alpha were calculated as reliability analyses. Ninety-nine patients with either type I or type II diabetes, aged 18 years or older, able to read and write English, and able to provide written informed consent participated in the study. Data were collected in 8 community pharmacies in New Mexico. Using R, (1) a principal axis factor analysis with promax (oblique) rotation was conducted, (2) a Schmid-Leiman transformation was performed, and (3) McDonald's omega and Cronbach's alpha were computed. Using SPSS, the subscale findings were validated by conducting a principal axis factor analysis with promax rotation; strict parallels and Cronbach's alpha reliabilities were calculated. McDonald's omega and Cronbach's alpha were robust, with coefficients greater than 0.90; the principal axis factor analysis with promax rotation revealed construct similarities, with an overall general factor emerging from R. Further subjecting the PPDG to rigorous psychometric testing revealed stronger quantitative support for the overall general factor of directive guidance and for the subscales of instruction and of feedback and goal-setting. Copyright © 2012 Elsevier Inc. All rights reserved.
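
    Cronbach's alpha, one of the reliability coefficients computed in the study, is defined as alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), where k is the number of items. A small Python sketch with hypothetical item scores (not the study's data):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item)."""
    k = len(items)
    item_vars = sum(statistics.variance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Three hypothetical items rated by five respondents (illustrative data only)
items = [
    [4, 5, 3, 5, 4],
    [4, 4, 3, 5, 5],
    [5, 5, 2, 5, 4],
]
alpha = cronbach_alpha(items)
```

    With correlated items like these, alpha comes out high; the study's reported coefficients above 0.90 indicate strong internal consistency of the PPDG subscales.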

  8. Quantifying the uncertainty of eddy covariance fluxes due to the use of different software packages and combinations of processing steps in two contrasting ecosystems

    Science.gov (United States)

    Mammarella, Ivan; Peltola, Olli; Nordbo, Annika; Järvi, Leena; Rannik, Üllar

    2016-10-01

    We have carried out an inter-comparison between EddyUH and EddyPro®, two public software packages for post-field processing of eddy covariance data. Datasets including carbon dioxide, methane and water vapour fluxes measured over 2 months at a wetland in southern Finland, and carbon dioxide and water vapour fluxes measured over 3 months at an urban site in Helsinki, were processed and analysed. The purpose was to estimate the flux uncertainty due to the use of different software packages and to identify the processing steps responsible for the largest deviations in the calculated fluxes. Turbulent fluxes calculated with a reference combination of processing steps were in good agreement, the systematic difference between the two software packages being up to 2.0 and 6.7 % for half-hour and cumulative sum values, respectively. The raw data preparation and processing steps were consistent between the software packages, and most of the deviations in the estimated fluxes were due to the flux corrections. Among the different calculation procedures analysed, the spectral correction had the biggest impact on closed-path latent heat fluxes, reaching a nocturnal median value of 15 % at the wetland site. We found up to a 43 % median deviation (with respect to the run with all corrections included) if the closed-path carbon dioxide flux is calculated without the dilution correction, while the methane fluxes were up to 10 % lower without both the dilution and spectroscopic corrections. The Webb-Pearman-Leuning (WPL) and spectroscopic corrections were the most critical steps for open-path systems. However, we also found large spectral correction factors for the open-path methane fluxes, due to the sensor separation effect.

  9. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry employs a number of models to identify defects in the construction of software projects. In this paper, we review COQUALMO and its limitations, aiming to increase quality without increasing cost and time. The computation time, cost, and effort needed to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted, and its application in software projects is explored. The implementation of the model is validated using statistical inference, which shows a significant improvement in the quality of the software projects.

  10. Determinants of business model performance in software firms

    OpenAIRE

    Rajala, Risto

    2009-01-01

    The antecedents and consequences of business model design have gained increasing interest among information system (IS) scholars and business practitioners alike. Based on an extensive literature review and empirical research, this study investigates the factors that drive business model design and the performance effects generated by the different kinds of business models in software firms. The main research question is: “What are the determinants of business model performance in the softwar...

  11. Software for browsing sectioned images of a dog body and generating a 3D model.

    Science.gov (United States)

    Park, Jin Seo; Jung, Yong Wook

    2016-01-01

    The goals of this study were (1) to provide accessible and instructive browsing software for sectioned images, together with a portable document format (PDF) file that includes three-dimensional (3D) models of an entire dog body, and (2) to develop techniques for segmentation and 3D modeling that enable an investigator to perform these tasks without the aid of a computer engineer. To achieve these goals, relatively important or large structures in the sectioned images were outlined to generate segmented images. The sectioned and segmented images were then packaged into browsing software. In this software, structures in the sectioned images are shown in detail and in real color. After 3D models were made from the segmented images, they were exported into a PDF file, in which the 3D models can be manipulated freely. The browsing software and PDF file are available for study by students, for lectures by teachers, and for the training of clinicians. These files will be helpful for the anatomical study and clinical training of veterinary students and clinicians. Furthermore, these techniques will be useful for researchers who study two-dimensional images and 3D models.

  12. Factors Affecting the Choice of Software Life Cycle Models in the Software Industry-An Empirical Study

    OpenAIRE

    Vandana Bhattacherjee; M. S. Neogi; Rupa Mahanti

    2012-01-01

    Problem statement: The aim of this study was to present the results of a survey conducted with software professionals in a few Indian software companies. Approach: The study initially presents an overview of the common software life cycle models used in software development. Results and Conclusion: The survey results revealed that the level of understanding of the user requirements is the most important factor in the choice of the life cycle model used in a software project. Project Com...

  13. Thermo-mechanical model optimization of HB-LED packaging

    NARCIS (Netherlands)

    Yuan, C.A.; Erinc, M.; Gielen, A.W.J.; Waal, A. van der; Driel, W. van; Zhang, K.

    2011-01-01

    Lighting is an advancing phenomenon both on the technology and on the market level due to the rapid development of the solid state lighting technology. The efforts in improving the efficacy of high brightness LED's (HB-LED) have concentrated on the packaging architecture. Packaging plays a significa

  14. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)

  15. Aluminum Laminates in Beverage Packaging: Models and Experiences

    Directory of Open Access Journals (Sweden)

    Gabriella Bolzon

    2015-08-01

    Full Text Available Aluminum laminates are among the main components of beverage packaging. These layered material systems are coupled to paperboard plies except in the cap opening area, where the human force limit sets a requirement on the material properties to allow openability, and where the mechanical characteristics are therefore of particular interest. Experimental investigations have been carried out on this composite and on its components by both traditional and full-field measurement techniques. The interpretation of the collected data has been supported by simulation of the performed tests, considering either a homogenized material model or the individual laminate layers. However, different results may be recovered from similar samples due to physical factors such as the material processing route and embedded defects. In turn, the conclusions may vary depending on the model assumptions. This contribution focuses on the physical effects and on the modeling of the large localized deformation induced by material singularities. The topic is discussed in the light of some experimental results.

  16. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    Full Text Available The subject of the research is software usability, and the aim is the construction of a mathematical model for estimating and providing a set level of usability. The methodology of structural analysis, methods of multicriterion optimization and decision-making theory, the method of convolution, and scientific methods of analysis and analogies are used in the research. The result of the executed work is a model for automated evaluation and assurance of software usability that allows one not only to estimate the current level of usability during every iteration of agile development but also to manage the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.

  17. Software Defect Prediction Models for Quality Improvement: A Literature Study

    Directory of Open Access Journals (Sweden)

    Mrinal Singh Rawat

    2012-09-01

    Full Text Available In spite of meticulous planning, thorough documentation and proper process control during software development, occurrences of certain defects are inevitable. These software defects may lead to a degradation of quality, which might be the underlying cause of failure. In today's cutting-edge competition it is necessary to make conscious efforts to control and minimize defects in software engineering. However, these efforts cost money, time and resources. This paper identifies causative factors, which in turn suggest remedies to improve software quality and productivity. The paper also shows how various defect prediction models are implemented, resulting in a reduced magnitude of defects.

  18. A bridge role metric model for nodes in software networks.

    Directory of Open Access Journals (Sweden)

    Bo Li

    Full Text Available A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, instead of only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the Bre results and node degrees is well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper contributes to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices.
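
    A power-law relationship of the kind reported above is conventionally fitted by linear least squares in log-log space, since y = c * x**a implies log y = log c + a * log x. A Python sketch with synthetic degree/metric data (the actual Bre values are not given in the abstract):

```python
import math

def fit_power_law(x, y):
    """Fit y = c * x**a by least squares in log-log space; returns (a, c)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    a = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    c = math.exp(my - a * mx)
    return a, c

# Synthetic data following Bre ~ 0.5 * degree**1.5 (illustrative only)
degrees = [1, 2, 4, 8, 16]
bre = [0.5 * d ** 1.5 for d in degrees]
exponent, prefactor = fit_power_law(degrees, bre)
```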

  19. An Overview of Mesoscale Modeling Software for Energetic Materials Research

    Science.gov (United States)

    2010-03-01

    Mesoscale modeling software summary (recovered from the report's tables of contents and summary tables): MesoDyn, offered by Accelrys and based on dynamic density field algorithms, is applied to soft matter, complex fluids, polymer melts and blends, and surfactants.

  20. Aligning the economic modeling of software reuse with reuse practices

    NARCIS (Netherlands)

    Postmus, D.; Meijler, 27696

    2008-01-01

    In contrast to current practices, where software reuse is applied recursively and reusable assets are tailored through parameterization or specialization, existing reuse economic models assume that (i) the cost of reusing a software asset depends on its size and (ii) reusable assets are developed from

  1. A suite of R packages for web-enabled modeling and analysis of surface waters

    Science.gov (United States)

    Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.

    2014-12-01

    Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.
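
    The retrieval, quality-control, and derivative steps described above can be sketched in a few lines. This Python fragment is illustrative only; the helper names and the simple jump-based spike filter are stand-ins for the robust statistical methods in packages such as sensorQC, not their actual API:

```python
def qc_spike_filter(series, max_jump):
    """Drop values whose jump from the previous accepted value exceeds max_jump
    (a simplified stand-in for robust sensor QC checks)."""
    clean, last = [], None
    for v in series:
        if last is None or abs(v - last) <= max_jump:
            clean.append(v)
            last = v
    return clean

def daily_mean(series, per_day):
    """Derive a daily-mean data product from evenly spaced observations."""
    out = []
    for i in range(0, len(series), per_day):
        chunk = series[i:i + per_day]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical water-temperature record (°C) containing one sensor spike
raw = [10.1, 10.2, 25.0, 10.3, 10.4, 10.2, 10.3, 10.1]
clean = qc_spike_filter(raw, max_jump=2.0)   # spike at 25.0 removed
daily = daily_mean(clean, per_day=4)
```

    The point of the modular design advocated in the abstract is that each step (retrieval, QC, derivation, modeling) lives in its own documented, reusable package rather than in one-off scripts like this.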

  2. lumpR 2.0.0: an R package facilitating landscape discretisation for hillslope-based hydrological models

    Science.gov (United States)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2017-08-01

    The characteristics of a landscape are essential factors for hydrological processes. An adequate representation of a catchment's landscape in hydrological models is therefore vital. However, many such models exist, differing, among other things, in spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model specific, commercial, or dependent on commercial back-end software, and allow only limited workflow automation or none at all. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation directed at large-scale application via a hierarchical multi-scale approach. The package addresses the existing limitations, as it is free and open source, easily extendible to other hydrological models, and its workflow can be fully automated. Moreover, it is user-friendly, as the direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is furthermore retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, the parameters determining the sizes of subbasins and hillslopes proved to be more important than the others, which include the number of representative hillslopes, the number of attributes employed for the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.

  3. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    Science.gov (United States)

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. © Society

  4. Singularity of Software Reliability Models LVLM and LVQM

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    According to the principle, “The failure data is the basis of software reliability analysis”, we built a software reliability expert system (SRES) by adopting the artificial technology. By reasoning out a conclusion from the fitting results for the failure data of a software project, the SRES can recommend to users “the most suitable model” as a software reliability measurement model. We believe that the SRES can overcome the inconsistency in applications of software reliability models well. We report investigation results on the singularity and parameter estimation methods of the models LVLM and LVQM.

  5. Logistic random effects regression models: a comparison of statistical packages for binary and ordinal outcomes

    Directory of Open Access Journals (Sweden)

    Steyerberg Ewout W

    2011-05-01

    Full Text Available Abstract Background Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. Methods We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized as well as ordinal, with center and/or trial as random effects, and as covariates age, motor score, pupil reactivity or trial. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), and SAS (GLIMMIX and NLMIXED), and MLwiN ([R]IGLS and MIXOR); Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using basically two logistic random effects models, with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. Results The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study, and when based on a relatively large number of level-1 (patient level) data units compared to the number of level-2 (hospital level) data units. However, when based on a relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability.
    There are also differences in

  6. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  7. A Reference Model for Mobile Social Software for Learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2007-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model for mobile social software for learning. International Journal of Continuing Engineering Education and Life-Long Learning, 18(1), 118-138.

  8. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  9. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  10. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr

  11. The International Atomic Energy Agency software package for the analysis of scintigraphic renal dynamic studies: a tool for the clinician, teacher, and researcher.

    Science.gov (United States)

    Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio

    2011-01-01

    Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals with cost-free access to the most recent developments in the field. The software package is a step towards harmonization and standardization, and its embedded functionalities render it a suitable tool for education, research, and obtaining distant experts' opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, the package provides an effective teaching tool, with selected teaching case studies, for young professionals who are being introduced to dynamic kidney studies. The software facilitates a better understanding through practical exploration of different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection with automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the postmicturition studies. The user can calculate the differential renal function through 2 independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and numerical outputs. The software package is undergoing quality assurance procedures to verify its accuracy and interuser reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide.
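
    The Rutland-Patlak approach mentioned above estimates each kidney's uptake rate as the slope of R(t)/B(t) regressed against the cumulative integral of the blood curve divided by B(t); the differential renal function is then the ratio of the two slopes. A Python sketch with hypothetical, noise-free curves (this illustrates the method only, not the package's implementation):

```python
def patlak_slope(renogram, blood, dt=1.0):
    """Rutland-Patlak analysis: regress R(t)/B(t) against cumint(B)/B(t);
    the slope estimates the kidney's uptake rate constant."""
    xs, ys, cum = [], [], 0.0
    for r, b in zip(renogram, blood):
        cum += b * dt
        xs.append(cum / b)
        ys.append(r / b)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Illustrative curves: constant blood input, kidneys accumulating at different rates
blood = [100.0] * 10
left = [5.0 * t for t in range(1, 11)]   # hypothetical left-kidney counts
right = [3.0 * t for t in range(1, 11)]  # hypothetical right-kidney counts
kl = patlak_slope(left, blood)
kr = patlak_slope(right, blood)
split_left = kl / (kl + kr)  # differential renal function of the left kidney
```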

  12. The SAM software system for modeling severe accidents at nuclear power plants equipped with VVER reactors on full-scale and analytic training simulators

    Science.gov (United States)

    Osadchaya, D. Yu.; Fuks, R. L.

    2014-04-01

    The architecture of the SAM software package, intended for modeling beyond-design-basis accidents at nuclear power plants equipped with VVER reactors that evolve into a severe stage with core melting and failure of the reactor pressure vessel, is presented. The SAM package makes it possible to model the entire emergency process comprehensively, from the initiating failure event to the severe-accident stage involving meltdown of nuclear fuel, failure of the reactor pressure vessel, and escape of corium onto the concrete basement or into the corium catcher, with retention of the molten products in it.

  13. Model-Driven Software Evolution: A Research Agenda

    NARCIS (Netherlands)

    Van Deursen, A.; Visser, E.; Warmer, J.

    2007-01-01

    Software systems need to evolve, and systems built using model-driven approaches are no exception. What complicates model-driven engineering is that it requires multiple dimensions of evolution. In regular evolution, the modeling language is used to make the changes. In meta-model evolution, changes

  14. A Model for Crises Management in Software Projects

    Directory of Open Access Journals (Sweden)

    Mohammad Tarawneh

    2011-11-01

    Full Text Available Today, software projects are an important part of almost every business application; the quality, efficiency, and effectiveness of these applications determine the failure or success of many business solutions. Consequently, businesses often find that they gain a competitive advantage by developing and improving the software projects that support critical business activities. The quality of a software project is determined by the quality of the software development process, and improvements in the development process can lead to significant improvements in software quality. Given the risks and problems that a software engineering project may face, this research sheds light on the mechanism of dealing with crises in software engineering projects. It suggests a set of rules and guidelines that help software project managers prevent and handle project crises, and it proposes a model describing the steps that must be carried out when a crisis emerges or before it happens. Crisis management starts with understanding the crisis and then reviewing it carefully in search of the areas or aspects of turmoil and failure. The next step is classifying the crisis, followed by preparing an attitudinal or contingency plan, which must be implemented immediately upon the occurrence of the crisis. Finally, the established plan is executed as soon as the crisis occurs. It should be noted that a software engineering project team that has been trained on simulated models of various crises develops the relevant management skills; teams should also avoid failing to acknowledge a problem when it starts, and avoid underestimating it or taking it lightly.

  15. Model-based Methods of Classification: Using the mclust Software in Chemometrics

    Directory of Open Access Journals (Sweden)

    Chris Fraley

    2007-01-01

    Full Text Available Due to recent advances in methods and software for model-based clustering, and to the interpretability of the results, clustering procedures based on probability models are increasingly preferred over heuristic methods. The clustering process estimates a model for the data that allows for overlapping clusters, producing a probabilistic clustering that quantifies the uncertainty of observations belonging to components of the mixture. The resulting clustering model can also be used for some other important problems in multivariate analysis, including density estimation and discriminant analysis. Examples of the use of model-based clustering and classification techniques in chemometric studies include multivariate image analysis, magnetic resonance imaging, microarray image segmentation, statistical process control, and food authenticity. We review model-based clustering and related methods for density estimation and discriminant analysis, and show how the R package mclust can be applied in each instance.
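mclust itself is an R package; as a rough, language-neutral illustration of the model-based clustering idea it implements (not mclust's own code), a minimal EM fit of a one-dimensional Gaussian mixture shows how soft responsibilities quantify the membership uncertainty the abstract describes. The data, component count, and initialization below are assumptions for the example.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Minimal EM for a one-dimensional Gaussian mixture. Returns mixture
    weights, means, variances, and the responsibility matrix whose rows
    give each observation's soft cluster-membership probabilities."""
    w = np.full(k, 1.0 / k)
    mu = np.linspace(x.min(), x.max(), k)   # deterministic spread-out init
    var = np.full(k, x.var())
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data
        n = r.sum(axis=0)
        w, mu = n / len(x), (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return w, mu, var, r

# two well-separated synthetic components
x = np.concatenate([np.random.default_rng(1).normal(0, 1, 200),
                    np.random.default_rng(2).normal(6, 1, 200)])
w, mu, var, resp = em_gmm_1d(x)
```

mclust goes further by selecting the number of components and the covariance parameterization via BIC; this sketch fixes both for brevity.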

  16. Development of an Open Source Image-Based Flow Modeling Software - SimVascular

    Science.gov (United States)

    Updegrove, Adam; Merkow, Jameson; Schiavazzi, Daniele; Wilson, Nathan; Marsden, Alison; Shadden, Shawn

    2014-11-01

    SimVascular (www.simvascular.org) is currently the only comprehensive software package that provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation. This software and its derivatives have been used in hundreds of conference abstracts and peer-reviewed journal articles, and have served as the foundation of medical startups. SimVascular was initially released in August 2007, yet major challenges and deterrents for new adopters were the requirement of licensing three expensive commercial libraries utilized by the software, a complicated build process, and a lack of documentation, support, and organized maintenance. In the past year, the SimVascular team has made significant progress in integrating open source alternatives for the linear solver, solid modeling, and mesh generation commercial libraries required by the original public release. In addition, the build system, available distributions, and graphical user interface have been significantly enhanced. Finally, the software has been updated to enable users to directly run simulations using the models and boundary condition values included in the Vascular Model Repository (vascularmodel.org). In this presentation we will briefly review the capabilities of the new SimVascular 2.0 release. National Science Foundation.

  17. The Adaptive Buffered Force QM/MM method in the CP2K and AMBER software packages

    CERN Document Server

    Mones, Letif; Götz, Andreas W; Laino, Teodoro; Walker, Ross C; Leimkuhler, Ben; Csányi, Gábor; Bernstein, Noam

    2014-01-01

    The implementation and validation of the adaptive buffered force QM/MM method in two popular packages, CP2K and AMBER, are presented. The implementations build on the existing QM/MM functionality in each code, extending it to allow for redefinition of the QM and MM regions during the simulation and reducing QM-MM interface errors by discarding forces near the boundary according to the buffered force-mixing approach. New adaptive thermostats, needed by force-mixing methods, are also implemented. Different variants of the method are benchmarked by simulating the structure of bulk water, water autoprotolysis in the presence of zinc, and dimethyl phosphate hydrolysis, using various semiempirical Hamiltonians and density functional theory as the QM model. It is shown that, with suitable parameters based on force convergence tests, the adaptive buffered-force QM/MM scheme can provide an accurate approximation of the structure in the dynamical QM region, matching the corresponding fully QM simulations, as well as reprod...
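As a purely conceptual sketch, region-dependent treatment of QM versus MM forces can be illustrated as below. This is not code from CP2K or AMBER, and it substitutes a smooth interpolation for the actual buffered scheme, in which boundary forces are discarded rather than blended; every name and parameter here is an assumption for illustration only.

```python
import numpy as np

def blended_forces(f_qm, f_mm, dist, r_core, r_buffer):
    """Toy region-dependent force selection: atoms within r_core of the QM
    center take pure QM forces, atoms beyond r_core + r_buffer take pure
    MM forces, and atoms in the buffer are interpolated with a smoothstep
    weight. (The actual abf-QM/MM method instead discards buffer-region
    forces near the boundary.)"""
    s = np.clip((dist - r_core) / r_buffer, 0.0, 1.0)
    w = 1.0 - (3.0 * s**2 - 2.0 * s**3)   # weight: 1 -> pure QM, 0 -> pure MM
    return w[:, None] * f_qm + (1.0 - w)[:, None] * f_mm
```

The point of either scheme is the same: forces acting on atoms deep inside the QM region come from the QM model alone, so interface artifacts are confined to a buffer whose width is set by force convergence tests.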

  18. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

    Full Text Available In answer to today's growing challenges in the software industry, a wide spectrum of new software development approaches has emerged. One prominent direction is the currently most promising software development paradigm, Model Driven Development (MDD). Despite considerable skepticism and many problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process, and modern methodologies can be classified into two main categories: formal (heavyweight) and agile (lightweight). Yet when it comes to MDD and a development process for MDD, currently known methodologies are very poor or, better said, offer no description of an MDD process at all. As a result of this research, this paper examines the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  19. Analysis of Ecodesign Implementation and Solutions for Packaging Waste System by Using System Dynamics Modeling

    Science.gov (United States)

    Berzina, Alise; Dace, Elina; Bazbauers, Gatis

    2010-01-01

    This paper discusses the findings of a research project which explored the packaging waste management system in Latvia. The paper focuses on identifying how policy mechanisms can promote ecodesign implementation and material efficiency improvement and thereby reduce the rate of packaging waste accumulation in landfill. The method used for analyzing the packaging waste management policies is system dynamics modeling. The main conclusion is that the existing legislative instruments can be used to create an effective policy for ecodesign implementation, but substantially higher tax rates on packaging materials and waste disposal than the existing ones have to be applied.
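The stock-and-flow logic underlying such a system dynamics model can be caricatured in a few lines. Every parameter below is hypothetical, chosen only to show the qualitative effect of a packaging tax on landfill accumulation, and does not attempt to reproduce the Latvian model.

```python
# Toy stock-and-flow sketch (hypothetical parameters, not the paper's model):
# a tax on packaging materials raises the recycled fraction of the yearly
# waste flow, which slows accumulation in the landfill stock.
def simulate_landfill(years=20, waste_per_year=100.0, base_recycle=0.3,
                      tax_rate=0.0, tax_effect=0.5):
    landfill = 0.0                                   # the stock
    for _ in range(years):
        recycled_share = min(0.95, base_recycle + tax_effect * tax_rate)
        landfill += waste_per_year * (1.0 - recycled_share)  # yearly inflow
    return landfill

low_tax = simulate_landfill(tax_rate=0.1)
high_tax = simulate_landfill(tax_rate=0.8)
```

Real system dynamics models add feedback loops (e.g., tax revenue funding collection infrastructure, which further raises the recycled share), which is what makes the policy comparison in the paper non-trivial.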

  20. Performance Evaluation of 3D Modeling Software for UAV Photogrammetry

    Science.gov (United States)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs with freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry changes; consequently, 3D modeling software contributes significantly to its expansion. However, these 3D modeling packages are black boxes, so only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet, the commercial software PhotoScan was also employed, and investigations were performed in this paper using check points and images obtained from a UAV.
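The check-point evaluation described above amounts to comparing coordinates reconstructed by the modeling software against surveyed ground truth. A minimal sketch (the function name and array shapes are assumptions, not taken from the paper):

```python
import numpy as np

def check_point_rmse(estimated, reference):
    """Per-axis RMSE between 3D coordinates reconstructed by the modeling
    software and surveyed check-point coordinates; both inputs are arrays
    of shape (n_points, 3) holding X, Y, Z."""
    d = np.asarray(estimated) - np.asarray(reference)
    return np.sqrt((d ** 2).mean(axis=0))   # one RMSE value each for X, Y, Z
```

Reporting the horizontal and vertical components separately is the usual convention in photogrammetric accuracy studies, since elevation errors typically dominate.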