WorldWideScience

Sample records for high-performance software package

  1. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions using iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
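
    For orientation, the block iterative scheme referenced above builds on the row-action Kaczmarz update for a linear system Ax = b. The following is a minimal sketch of that update for a generic dense system; the relaxation factor lam and the toy matrix are illustrative assumptions, not Ettention's actual building-block API.

        import numpy as np

        def kaczmarz(A, b, n_sweeps=10, lam=1.0):
            """Plain Kaczmarz iteration for Ax = b (row-action method).

            A : (m, n) system matrix (one row per projection ray, illustrative)
            b : (m,) measured projection values
            lam : relaxation factor in (0, 2]
            """
            x = np.zeros(A.shape[1])
            for _ in range(n_sweeps):
                for i in range(A.shape[0]):
                    a_i = A[i]
                    denom = a_i @ a_i
                    if denom > 0.0:
                        # project the current estimate onto the hyperplane a_i . x = b_i
                        x += lam * (b[i] - a_i @ x) / denom * a_i
            return x

        # tiny self-check on a consistent 3x3 system
        A = np.array([[2.0, 0.0, 1.0], [0.0, 1.0, 1.0], [1.0, 1.0, 0.0]])
        x_true = np.array([1.0, 2.0, 3.0])
        print(kaczmarz(A, A @ x_true, n_sweeps=200))  # approaches x_true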

  2. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions using iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  3. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  4. The Ettention software package.

    Science.gov (United States)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.

    2011-01-01

    Enhancing, modifying or adapting the software to new requirements increases the internal software complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort. However, software refactoring becomes a challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach provides computer-aided support for identifying ill-structured packages and offers suggestions to help the software designer balance intra-package cohesion against inter-package coupling. A comparative study is conducted applying three different clustering techniques on different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared the A-KNN technique with the other clustering techniques (viz. single linkage algorithm, complete linkage algorithm and weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.
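
    As a rough illustration of the cohesion/coupling trade-off discussed above (not the authors' actual metrics), one can count class-level dependencies that stay inside a package versus those that cross package boundaries; the dependency dictionary below is a hypothetical example.

        # Hypothetical class-dependency graph: class -> set of classes it uses.
        deps = {
            "ui.Window": {"ui.Button", "core.Model"},
            "ui.Button": {"core.Model"},
            "core.Model": {"core.Store"},
            "core.Store": set(),
        }

        def package_of(cls):
            return cls.rsplit(".", 1)[0]

        def cohesion_and_coupling(deps):
            """Return the shares of intra-package and inter-package dependencies."""
            intra = inter = 0
            for src, targets in deps.items():
                for dst in targets:
                    if package_of(src) == package_of(dst):
                        intra += 1
                    else:
                        inter += 1
            total = intra + inter
            return (intra / total if total else 0.0,   # intra-package cohesion share
                    inter / total if total else 0.0)   # inter-package coupling share

        print(cohesion_and_coupling(deps))  # (0.5, 0.5) for this toy graph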

  6. Packaging of control system software

    International Nuclear Information System (INIS)

    Zagar, K.; Kobal, M.; Saje, N.; Zagar, A.; Sabjan, R.; Di Maio, F.; Stepanov, D.

    2012-01-01

    Control system software consists of several parts - the core of the control system, drivers for integration of devices, configuration for user interfaces, alarm system, etc. Once the software is developed and configured, it must be installed on the computers where it runs. Usually, it is installed on an operating system whose services it needs and, in some cases, whose libraries it dynamically links against. The operating system can be quite complex itself - for example, a typical Linux distribution consists of several thousand packages. To manage this complexity, we have decided to rely on the Red Hat Package Management system (RPM) to package control system software, and also to ensure it is properly installed (i.e., that dependencies are also installed, and that scripts are run after installation if any additional actions need to be performed). As dozens of RPM packages need to be prepared, we are reducing the amount of effort and improving consistency between packages through a Maven-based infrastructure that assists in packaging (e.g., automated generation of RPM SPEC files, including automated identification of dependencies). So far, we have used it to package EPICS, Control System Studio (CSS) and several device drivers. We perform extensive testing on Red Hat Enterprise Linux 5.5, but we have also verified that packaging works on CentOS and Scientific Linux. In this article, we describe in greater detail the systematic packaging approach we are using, and its particular application for the ITER CODAC Core System. (authors)
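
    A minimal sketch of the kind of automation described above: generating a skeleton RPM SPEC file from a small package description. The template fields follow generic RPM conventions, and the package names are hypothetical; this is not the actual Maven-based ITER CODAC tooling.

        SPEC_TEMPLATE = """\
        Name:           {name}
        Version:        {version}
        Release:        1%{{?dist}}
        Summary:        {summary}
        License:        {license_name}
        Requires:       {requires}

        %description
        {summary}

        %files
        {files}
        """

        def make_spec(name, version, summary, license_name, requires, files):
            # 'requires' stands in for the automated dependency-identification step
            return SPEC_TEMPLATE.format(
                name=name, version=version, summary=summary, license_name=license_name,
                requires=", ".join(requires), files="\n".join(files))

        print(make_spec("codac-example-ioc", "1.0.0", "Example EPICS IOC packaging",
                        "GPLv2", ["epics-base"], ["/opt/codac/ioc/example"]))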

  7. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Developing a chemistry algorithm is hard and time consuming; integration of large quantum chemistry packages allows resource sharing and thus avoids reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  8. MARS software package status

    International Nuclear Information System (INIS)

    Azhgirej, I.L.; Talanov, V.V.

    2000-01-01

    The MARS software package is intended for simulating nuclear-electromagnetic cascades and the transport of secondary neutrons and muons in heterogeneous media of arbitrary complexity in the presence of magnetic fields. An inclusive approach to describing particle production in nuclear and electromagnetic interactions and in the decay of unstable particles is realized in the package. The MARS software package has been actively applied to solving various radiation physics problems.

  9. Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software

    Directory of Open Access Journals (Sweden)

    Abel Soares Siqueira

    2016-04-01

    A very important area of research in the field of Mathematical Optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information like CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore, tools to better process and understand optimization benchmark data have been developed. One of the most widespread tools is the Performance Profile graphics proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open source software that creates 'Performance Profile' graphics. This software produces graphics in PDF using LaTeX with PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plot libraries. It is implemented in Python 3 with support for internationalization, and is under the General Public License Version 3 (GPLv3).
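
    For reference, the Dolan and Moré performance profile plotted by tools like perprof-py is defined from the ratios r(p,s) = t(p,s) / min over solvers of t(p,s), with the profile rho_s(tau) being the fraction of problems on which solver s achieves r(p,s) <= tau. A minimal sketch of that computation follows; the cost matrix is made up and this is independent of perprof-py's actual input format.

        import numpy as np

        def performance_profile(costs, taus):
            """costs: (n_problems, n_solvers) array of solve times (np.inf = failure).
            Returns rho with shape (len(taus), n_solvers)."""
            best = costs.min(axis=1, keepdims=True)           # best cost per problem
            ratios = costs / best                             # r(p, s)
            return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

        costs = np.array([[1.0, 2.0],
                          [3.0, 1.5],
                          [np.inf, 4.0]])   # solver 0 fails on the third problem
        print(performance_profile(costs, taus=[1.0, 2.0, 4.0]))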

  10. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experiences gained from process control are planned to be investigated for discrete parts manufacturing.

  11. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    Directory of Open Access Journals (Sweden)

    Sang-Kyu Jung

    Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared its results with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
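
    A toy sketch of one way to detect movement between two time-lapse frames, in the spirit of the WormLifespan step described above: simple frame differencing followed by connected-component counting. The thresholds and the use of scipy.ndimage are illustrative assumptions, not QuantWorm's Java implementation.

        import numpy as np
        from scipy import ndimage

        def count_changed_regions(frame1, frame2, diff_threshold=20, min_pixels=5):
            """Count connected regions that changed between two grayscale frames."""
            moved = np.abs(frame1.astype(int) - frame2.astype(int)) > diff_threshold
            labels, n = ndimage.label(moved)
            sizes = ndimage.sum(moved, labels, range(1, n + 1))
            return int(np.sum(sizes >= min_pixels))   # ignore tiny noise blobs

        # synthetic example: one 3x3 "worm" moves between frames
        f1 = np.zeros((32, 32), dtype=np.uint8); f1[5:8, 5:8] = 200
        f2 = np.zeros((32, 32), dtype=np.uint8); f2[10:13, 10:13] = 200
        print(count_changed_regions(f1, f2))  # -> 2 regions (old and new location)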

  12. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect the software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software. They were developed mainly based on Bayesian or frequentist theory. Most types of software have the characteristics of easy operation, easy mastery, exact calculation, or excellent graphing. However, there was no single software that performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that the user should choose the appropriate software according to their programming background, operational habits, and financial resources. A combination of BUGS and R (or Stata) can then be chosen to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  13. Intercomparison of PIXE spectrometry software packages

    International Nuclear Information System (INIS)

    2003-02-01

    During the year 2000, an exercise was organized to make an intercomparison of widely available software packages for analysis of particle induced X-ray emission (PIXE) spectra. This TECDOC describes the method used in this intercomparison exercise and presents the results obtained. It also gives a general overview of the participating software packages. This includes basic information on their user interface, graphical presentation capabilities, physical phenomena taken into account, ways of presenting results, etc. No recommendation for a particular software package or method for spectrum analysis is given. It is intended that the readers reach their own conclusions and make their own choices, according to their specific needs. This TECDOC will be useful to anyone involved in PIXE spectrum analysis. This TECDOC includes a companion CD with the complete set of test spectra used for intercomparison. The test spectra on this CD can be used to test any PIXE spectral analysis software package.

  14. A software package for evaluating the performance of a star sensor operation

    Science.gov (United States)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-02-01

    We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package used to evaluate the performance of these algorithms operating together as a single star sensor system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and therefore, can be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the code in the form of functions written in C. This is done with a view to easy implementation on any star sensor electronics hardware.
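
    The centroiding step mentioned above amounts to an intensity-weighted mean of pixel coordinates inside a star's image window. A minimal sketch follows; the background handling and window extraction are illustrative assumptions, not the StarSense MATLAB code.

        import numpy as np

        def star_centroid(window, background=0.0):
            """Intensity-weighted centroid (row, col) of a small image window."""
            w = np.clip(window.astype(float) - background, 0.0, None)
            total = w.sum()
            if total == 0.0:
                return None
            rows, cols = np.indices(w.shape)
            return (rows * w).sum() / total, (cols * w).sum() / total

        # synthetic star: a bright 2x2 blob centered between pixels (4.5, 6.5)
        img = np.zeros((11, 11))
        img[4:6, 6:8] = 100.0
        print(star_centroid(img))  # -> (4.5, 6.5)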

  15. Intercomparison of alpha particle spectrometry software packages

    International Nuclear Information System (INIS)

    1999-08-01

    Software has become increasingly important as the 'logical controller' at different levels, from a single instrument to an entire computer-controlled experiment. This is also the case for software packages in nuclear instruments and experiments. In particular, because of the range of applications of alpha-particle spectrometry, software packages in this field are often used. It is the aim of this intercomparison to test and describe the abilities of four such software packages. The main objectives of the intercomparison were the ability of the programs to determine the peak areas and the peak area uncertainties, and the statistical control and stability of reported results. In this report, the task, methods and results of the intercomparison are presented in order to assist the potential users of such software and to stimulate the development of even better alpha-particle spectrum analysis software.

  16. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    Science.gov (United States)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
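
    A minimal numpy sketch of generating the kind of linear frequency-swept (chirp) baseband waveform whose fidelity is discussed above. The sample rate, sweep range, and duration are arbitrary illustrative values; this is not gr-MRI's GNU Radio flowgraph code.

        import numpy as np

        def linear_chirp(f_start, f_stop, duration, sample_rate):
            """Complex baseband chirp sweeping linearly from f_start to f_stop (Hz)."""
            t = np.arange(int(duration * sample_rate)) / sample_rate
            k = (f_stop - f_start) / duration              # sweep rate, Hz per second
            phase = 2 * np.pi * (f_start * t + 0.5 * k * t**2)
            return np.exp(1j * phase)

        # 500 kHz-wide sweep, 2 ms long, sampled at 2 MS/s
        iq = linear_chirp(-250e3, 250e3, 2e-3, 2e6)
        print(iq.shape, np.abs(iq[0]))   # 4000 samples of unit-magnitude IQ data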

  17. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are a part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested on a given configuration of the aircraft and airbrake, and the results were compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  18. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Science.gov (United States)

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked based on a set of metric outcomes using an integrated Analytic Hierarchy Process (AHP) and TOPSIS approach. The experimental results showed that the GNUmed and OpenEMR software can provide a better basis, in terms of ranking scores, than other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
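
    A compact sketch of the TOPSIS ranking step referenced above: vector-normalize the decision matrix, weight it, and score each alternative by its closeness to the ideal versus the anti-ideal solution. The criteria, weights, and scores below are made-up numbers, and this does not reproduce the paper's AHP weighting stage.

        import numpy as np

        def topsis(scores, weights, benefit):
            """scores: (alternatives, criteria); benefit[j] = True if higher is better."""
            norm = scores / np.linalg.norm(scores, axis=0)        # vector normalization
            v = norm * weights
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)                        # closeness: higher is better

        scores = np.array([[8.0, 3.0, 7.0],    # e.g. usability, cost, interoperability
                           [6.0, 2.0, 9.0],
                           [7.0, 5.0, 6.0]])
        weights = np.array([0.5, 0.2, 0.3])
        print(topsis(scores, weights, benefit=np.array([True, False, True])))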

  19. Software package as an information center product

    International Nuclear Information System (INIS)

    Butler, M.K.

    1977-01-01

    The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, to document the Center's efforts to meet these needs and the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems are reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables

  20. Parallelization of an existing high energy physics event reconstruction software package

    International Nuclear Information System (INIS)

    Schiefer, R.; Francis, D.

    1996-01-01

    Software parallelization allows an efficient use of available computing power to increase the performance of applications. In a case study the authors have investigated the parallelization of high energy physics event reconstruction software in terms of costs (effort, computing resource requirements), benefits (performance increase) and the feasibility of a systematic parallelization approach. Guidelines facilitating a parallel implementation are proposed for future software development

  1. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the

  2. Vertical bone measurements from cone beam computed tomography images using different software packages

    International Nuclear Information System (INIS)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz

    2015-01-01

    This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  3. Vertical bone measurements from cone beam computed tomography images using different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

    This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  4. Improving package structure of object-oriented software using multi-objective optimization and weighted class connections

    Directory of Open Access Journals (Sweden)

    Amarjeet

    2017-07-01

    The software maintenance activities performed without following the original design decisions about the package structure usually deteriorate the quality of software modularization, leading to decay of the quality of the system. One of the main reasons for such structural deterioration is inappropriate grouping of source code classes in software packages. To improve such grouping/modular structure, previous researchers formulated the software remodularization problem as an optimization problem and solved it using search-based meta-heuristic techniques. These optimization approaches aimed at improving the quality metrics values of the structure without considering the original package design decisions, often resulting in a totally new software modularization. The entirely changed software modularization becomes costly to realize as well as difficult to understand for the developers/maintainers. To alleviate this issue, we propose a multi-objective optimization approach to improve the modularization quality of an object-oriented system with minimum possible movement of classes between existing packages of the original software modularization. The optimization is performed using NSGA-II, a widely-accepted multi-objective evolutionary algorithm. In order to ensure minimum modification of the original package structure, a new approach of computing class relations using weighted strengths has been proposed here. The weights of relations among different classes are computed on the basis of the original package structure. A new objective function has been formulated using these weighted class relations. This objective function drives the optimization process toward better modularization quality while simultaneously ensuring preservation of the original structure. To evaluate the results of the proposed approach, a series of experiments are conducted over four real-world and two random software applications. The experimental results clearly indicate the effectiveness
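
    As a rough, illustrative reading of the weighted-relation idea above (not the authors' exact objective function): class-to-class relations can be given a higher weight when the two classes share a package in the original design, so that a candidate re-modularization is rewarded both for cohesion and for staying close to the original structure. The class names, packages, and weights below are hypothetical.

        def weighted_relations(edges, original_pkg, same_pkg_weight=2.0, cross_weight=1.0):
            """Assign a weight to each class relation based on the ORIGINAL packages."""
            return {e: (same_pkg_weight if original_pkg[e[0]] == original_pkg[e[1]]
                        else cross_weight) for e in edges}

        def modularization_score(edges, weights, candidate_pkg):
            """Sum of weights of relations kept inside one package in the CANDIDATE design."""
            return sum(weights[e] for e in edges
                       if candidate_pkg[e[0]] == candidate_pkg[e[1]])

        edges = [("A", "B"), ("B", "C"), ("C", "D")]
        original_pkg = {"A": "p1", "B": "p1", "C": "p2", "D": "p2"}
        w = weighted_relations(edges, original_pkg)
        print(modularization_score(edges, w, {"A": "p1", "B": "p1", "C": "p1", "D": "p2"}))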

  5. PCG: A software package for the iterative solution of linear systems on scalar, vector and parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, W. [Los Alamos National Lab., NM (United States); Carey, G.F. [Univ. of Texas, Austin, TX (United States)

    1994-12-31

    A great need exists for high performance numerical software libraries transportable across parallel machines. This talk concerns the PCG package, which solves systems of linear equations by iterative methods on parallel computers. The features of the package are discussed, as well as techniques used to obtain high performance and transportability across architectures. Representative numerical results are presented for several machines including the Connection Machine CM-5, Intel Paragon and Cray T3D parallel computers.
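
    A minimal dense-matrix sketch of the conjugate gradient iteration that is at the core of iterative solver packages like PCG; the real library works matrix-free, with preconditioning, and in parallel, so this is only the serial textbook algorithm for a symmetric positive-definite system.

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
            """Solve Ax = b for symmetric positive-definite A (textbook CG)."""
            x = np.zeros_like(b)
            r = b - A @ x
            p = r.copy()
            rs_old = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs_old / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs_old) * p
                rs_old = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))  # -> approx [0.0909, 0.6364]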

  6. PINT, A Modern Software Package for Pulsar Timing

    Science.gov (United States)

    Luo, Jing; Ransom, Scott M.; Demorest, Paul; Ray, Paul S.; Stovall, Kevin; Jenet, Fredrick; Ellis, Justin; van Haasteren, Rutger; Bachetti, Matteo; NANOGrav PINT developer team

    2018-01-01

    Pulsar timing, first developed decades ago, has provided an extremely wide range of knowledge about our universe. It has been responsible for many important discoveries, such as the discovery of the first exoplanet and the orbital period decay of double neutron star systems. Currently, pulsar timing is the leading technique for detecting low-frequency (about 10^-9 Hz) gravitational waves (GW) using an array of pulsars as the detectors. To achieve this goal, high-precision pulsar timing data, at the nanosecond level, are required. Most high-precision pulsar timing data are analyzed using the widely adopted software TEMPO/TEMPO2. But for a robust and believable GW detection, it is important to have independent software that can cross-check the result. In this poster we present the new generation pulsar timing software PINT. This package will provide a robust system to cross-check high-precision timing results, completely independent of TEMPO and TEMPO2. In addition, PINT is designed to be a package that is easy to extend and modify, through use of a flexible code architecture and a modern programming language, Python, together with modern libraries.

  7. Introduction to Software Packages. [Final Report].

    Science.gov (United States)

    Frankel, Sheila, Ed.; And Others

    This document provides an introduction to application software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…

  8. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    This NREL page lists software tools for development on the Peregrine high-performance computing system, used to build and manage software at the source code level. These include the "Cross-Platform Make" (CMake) package from Kitware and SCons, a modern software build tool based on Python.

  9. A User-Friendly Software Package for HIFU Simulation

    Science.gov (United States)

    Soneson, Joshua E.

    2009-04-01

    A freely-distributed, MATLAB (The Mathworks, Inc., Natick, MA)-based software package for simulating axisymmetric high-intensity focused ultrasound (HIFU) beams and their heating effects is discussed. The package (HIFU_Simulator) consists of a propagation module which solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and a heating module which solves Pennes' bioheat transfer (BHT) equation. The pressure, intensity, heating rate, temperature, and thermal dose fields are computed and plotted, and the output is released to the MATLAB workspace for further user analysis or postprocessing.
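
    For reference, the Pennes bioheat transfer (BHT) equation solved by the heating module has the form rho*c*dT/dt = k*laplacian(T) - w_b*c_b*(T - T_a) + Q. The following is a 1-D explicit finite-difference sketch of that balance; the tissue constants, grid, and heat source are illustrative placeholders, not HIFU_Simulator's axisymmetric MATLAB solver.

        import numpy as np

        def pennes_step(T, Q, dt, dx, k=0.5, rho=1050.0, c=3600.0,
                        wb=0.5, cb=3600.0, Ta=37.0):
            """One explicit finite-difference step of the 1-D Pennes bioheat equation."""
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2     # interior Laplacian
            dTdt = (k * lap - wb * cb * (T - Ta) + Q) / (rho * c)
            T_new = T + dt * dTdt
            T_new[0], T_new[-1] = Ta, Ta                           # fixed boundary temperature
            return T_new

        T = np.full(101, 37.0)              # 37 C baseline over a 1 cm line (dx = 0.1 mm)
        Q = np.zeros(101); Q[45:55] = 5e6   # focal heating rate, W/m^3 (illustrative)
        for _ in range(1000):
            T = pennes_step(T, Q, dt=1e-3, dx=1e-4)
        print(T.max())                      # peak focal temperature after 1 s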

  10. PIV Data Validation Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer running under either the Microsoft Windows 3.1 or Windows 95 operating system.
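
    The out-of-plane vorticity mentioned in capability (3) is omega_z = dv/dx - du/dy. A small numpy sketch on a regular PIV grid using central differences follows; it is illustrative only and unrelated to the original PC software.

        import numpy as np

        def out_of_plane_vorticity(u, v, dx, dy):
            """omega_z = dv/dx - du/dy for velocity fields on a regular grid.

            u, v : 2-D arrays indexed as [row (y), col (x)]
            """
            dv_dx = np.gradient(v, dx, axis=1)
            du_dy = np.gradient(u, dy, axis=0)
            return dv_dx - du_dy

        # solid-body rotation u = -omega*y, v = omega*x has vorticity 2*omega everywhere
        y, x = np.mgrid[0:16, 0:16] * 0.01
        omega = 3.0
        print(out_of_plane_vorticity(-omega * y, omega * x, dx=0.01, dy=0.01).mean())  # ~6.0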

  11. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data onto the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  12. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  13. Nested Cohort - R software package

    Science.gov (United States)

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.

  14. Hazardous materials package performance regulations

    International Nuclear Information System (INIS)

    Russell, N.A.; Glass, R.E.; McClure, J.D.; Finley, N.C.

    1993-01-01

    Two regulatory philosophies, one based on 'specification' packaging standards and the other based on 'performance' packaging standards, currently define the hazmat packaging certification process. A main concern when setting performance standards is determining the appropriate standards necessary to assure adequate public protection. This paper discusses a Hazmat Packaging Performance Evaluation (HPPE) project being conducted at Sandia National Laboratories for the U.S. Department of Transportation Research and Special Programs Administration. In this project, the current bulk packagings (larger than 2000 gallons) for transporting Materials Extremely Toxic By Inhalation (METBI) are being evaluated and performance standards will be recommended. A computer software system, HazCon, has been developed which can calculate the dispersion of dense, neutral, and buoyant gases. HazCon also has a database of thermodynamic and toxicity data for the METBI materials, a user-friendly menu-driven format for creating input data sets for calculating dispersion of the METBI in the event of an accidental release, and a link between the METBI database and the dense gas dispersion code (which requires thermodynamic properties). The primary output of HazCon is a listing of mass concentrations of the released material at distances downwind from the release point. (J.P.N.)
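
    For orientation only, the classic Gaussian plume model gives the downwind concentration of a continuous, neutrally buoyant point release; HazCon's dense-gas dispersion model is more involved, and the power-law dispersion coefficients below are crude placeholders rather than any value used by that code.

        import numpy as np

        def gaussian_plume(Q, u, x, y, z, H, a=0.08, b=0.06):
            """Concentration (kg/m^3) of a continuous point release at (x, y, z).

            Q : release rate (kg/s), u : wind speed (m/s), H : release height (m)
            a, b : crude sigma_y = a*x, sigma_z = b*x placeholders for a stability class
            """
            sy, sz = a * x, b * x
            return (Q / (2 * np.pi * u * sy * sz)
                    * np.exp(-y**2 / (2 * sy**2))
                    * (np.exp(-(z - H)**2 / (2 * sz**2))
                       + np.exp(-(z + H)**2 / (2 * sz**2))))

        # concentration 500 m directly downwind at ground level from a 1 kg/s release
        print(gaussian_plume(Q=1.0, u=3.0, x=500.0, y=0.0, z=0.0, H=2.0))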

  15. Software packages for food engineering needs

    OpenAIRE

    Abakarov, Alik

    2011-01-01

    The graphic user interface (GUI) software packages “ANNEKs” and “OPT-PROx” are developed to meet food engineering needs. “OPT-PROx” (OPTimal PROfile) is software developed to carry out thermal food processing optimization based on the variable retort temperature processing and global optimization technique. “ANNEKs” (Artificial Neural Network Enzyme Kinetics) is software designed for determining the kinetics of enzyme hydrolysis of protein at different initial reaction parameters based on the...

  16. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given without any recommendation for a particular software or method for gamma ray spectra analysis

  17. Automated packaging platform for low-cost high-performance optical components manufacturing

    Science.gov (United States)

    Ku, Robert T.

    2004-05-01

    Delivering high performance integrated optical components at low cost is critical to the continuing recovery and growth of the optical communications industry. In today's market, network equipment vendors need to provide their customers with new solutions that reduce operating expenses and enable new revenue generating IP services. They must depend on the availability of highly integrated optical modules exhibiting high performance, small package size, low power consumption, and most importantly, low cost. The cost of typical optical system hardware is dominated by linecards that are in turn cost-dominated by transmitters and receivers or transceivers and transponders. Cost effective packaging of optical components in these small size modules is becoming the biggest challenge to be addressed. For many traditional component suppliers in our industry, the combination of small size, high performance, and low cost appears to be in conflict and not feasible with conventional product design concepts and labor intensive manual assembly and test. With the advent of photonic integration, there are a variety of materials, optics, substrates, active/passive devices, and mechanical/RF piece parts to manage in manufacturing to achieve high performance at low cost. The use of automation has been demonstrated to surpass manual operation in cost (even with very low labor cost) as well as product uniformity and quality. In this paper, we will discuss the value of using an automated packaging platform for the assembly and test of high performance active components, such as 2.5 Gb/s and 10 Gb/s sources and receivers. Low cost, high performance manufacturing can best be achieved by leveraging a flexible packaging platform to address a multitude of laser and detector devices, integrate electronics, and handle various package bodies and fiber configurations. This paper describes the operation and results of working robotic assemblers in the manufacture of a Laser Optical Subassembly

  18. The GeoSteiner software package for computing Steiner trees in the plane

    DEFF Research Database (Denmark)

    Juhl, Daniel; Warme, David M.; Winter, Pawel

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner...... approach --- allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances...... from the 2000-study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base, and the commercial GeoSteiner 4.0 code base....

  19. Strategic Business-IT alignment of application software packages: Bridging the Information Technology gap

    Directory of Open Access Journals (Sweden)

    Wandi Kruger

    2012-09-01

    An application software package implementation is a complex endeavour, and as such it requires the proper understanding, evaluation and redefining of the current business processes to ensure that the implementation delivers on the objectives set at the start of the project. Numerous factors exist that may contribute to the unsuccessful implementation of application software packages. However, the most significant contributor to the failure of an application software package implementation lies in the misalignment of the organisation’s business processes with the functionality of the application software package. Misalignment is attributed to a gap between the business processes of an organisation and the functionality the application software package has to offer to translate those business processes into digital form when the package is implemented and configured. This gap is commonly referred to as the information technology (IT) gap. This study proposes to define and discuss the IT gap. Furthermore, this study will make recommendations for aligning the business processes with the functionality of the application software package (addressing the IT gap). The end result of adopting these recommendations will be more successful application software package implementations.

  20. ATK-ForceField: a new generation molecular dynamics software package

    Science.gov (United States)

    Schneider, Julian; Hamaekers, Jan; Chill, Samuel T.; Smidstrup, Søren; Bulin, Johannes; Thesen, Ralph; Blom, Anders; Stokbro, Kurt

    2017-12-01

    ATK-ForceField is a software package for atomistic simulations using classical interatomic potentials. It is implemented as a part of the Atomistix ToolKit (ATK), which is a Python programming environment that makes it easy to create and analyze both standard and highly customized simulations. This paper will focus on the atomic interaction potentials, molecular dynamics, and geometry optimization features of the software; however, many more advanced modeling features are available. The implementation details of these algorithms and their computational performance will be shown. We present three illustrative examples of the types of calculations that are possible with ATK-ForceField: modeling thermal transport properties in a silicon germanium crystal, vapor deposition of selenium molecules on a selenium surface, and a simulation of creep in a copper polycrystal.
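
    A minimal velocity-Verlet time step, the standard integrator at the heart of classical MD engines such as the one described above. This generic sketch with a toy harmonic force is an assumption for illustration and is not ATK's Python API.

        import numpy as np

        def velocity_verlet_step(pos, vel, force_fn, mass, dt):
            """One velocity-Verlet step for positions/velocities of shape (N, 3)."""
            f_old = force_fn(pos)
            pos_new = pos + vel * dt + 0.5 * (f_old / mass) * dt**2
            f_new = force_fn(pos_new)
            vel_new = vel + 0.5 * (f_old + f_new) / mass * dt
            return pos_new, vel_new

        # toy system: one particle in a harmonic well, k = 1, m = 1
        force = lambda r: -1.0 * r
        pos, vel, m = np.array([[1.0, 0.0, 0.0]]), np.zeros((1, 3)), 1.0
        for _ in range(1000):
            pos, vel = velocity_verlet_step(pos, vel, force, m, dt=0.01)
        print(pos[0, 0])   # oscillates; approx cos(10.0) ~ -0.839 after t = 10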

  1. Comparison of PV system design software packages for urban applications

    Energy Technology Data Exchange (ETDEWEB)

    Gharakhani Siraki, Arbi; Pillay, Pragasen

    2010-09-15

    A large number of software packages are available for solar resource evaluation and PV system design. However, few of them are suitable for urban applications. In this paper a comparison has been made between two specifically designed solar tools, Ecotect 2010 and PVsyst 5.05. Conclusions are drawn about the proper use of these packages based on their specifications and capabilities. Moreover, the calculations have been repeated with the HOMER software package (which is a generic tool) for the same location. The results suggest that a generic solar software tool should not be used for an urban application.

  2. Accuracy of Giovanni and Marksim Software Packages for ...

    African Journals Online (AJOL)

    Accuracy of Giovanni and Marksim Software Packages for Generating Daily Rainfall Data in ... using Giovanni software over Marksim, for areas receiving bimodal rainfall regimes similar to ...

  3. Multi-platform Automated Software Building and Packaging

    International Nuclear Information System (INIS)

    Rodriguez, A Abad; Gomes Gouveia, V E; Meneses, D; Capannini, F; Aimar, A; Di Meglio, A

    2012-01-01

    One of the major goals of the EMI (European Middleware Initiative) project is the integration of several components of the pre-existing middleware (ARC, gLite, UNICORE and dCache) into a single consistent set of packages with uniform distributions and repositories. Those individual middleware projects have been developed in the last decade by tens of development teams and before EMI were all built and tested using different tools and dedicated services. The software, millions of lines of code, is written in several programming languages and supports multiple platforms. Therefore a viable solution ought to be able to build and test applications in multiple programming languages using common dependencies on all selected platforms. It should, in addition, package the resulting software in formats compatible with the popular Linux distributions, such as Fedora and Debian, and store them in repositories from which all EMI software can be accessed and installed in a uniform way. Despite this highly heterogeneous initial situation, a single common solution, with the aim of quickly automating the integration of the middleware products, had to be selected and implemented in a few months after the beginning of the EMI project. Because of the previous knowledge and the short time available in which to provide this common solution, the ETICS service, where the gLite middleware was already built for years, was selected. This contribution describes how the team in charge of providing a common EMI build and packaging infrastructure to the whole project has developed a homogeneous solution for releasing and packaging the EMI components from the initial set of tools used by the earlier middleware projects. An important element of the presentation is the developers' experience and feedback on converging on ETICS, and the ongoing work to eventually add more widely used and supported build and packaging solutions for the Linux platforms.

  4. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode, for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has a unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare the U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)
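    For a single intake pathway, the dose-coefficient approach underlying tools like DCAL reduces to multiplying the intake by a tabulated committed-dose coefficient. The numbers below are hypothetical stand-ins for values that would be taken from the DCAL libraries or ICRP/EPA tabulations.

        # Committed effective dose E = intake (Bq) x dose coefficient e (Sv/Bq)
        intake_bq = 1.0e4                  # hypothetical ingested activity
        dose_coeff_sv_per_bq = 1.3e-8      # hypothetical adult ingestion dose coefficient
        committed_dose_sv = intake_bq * dose_coeff_sv_per_bq
        print(f"Committed effective dose: {committed_dose_sv:.2e} Sv")   # 1.30e-04 Sv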

  5. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  6. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: hierarchical, open, object oriented, and object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been in use for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people get adapted to the system, and for standardizing and exchanging software, yet preserving flexibility allowing for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  7. Adoption of open source digital library software packages: a survey

    OpenAIRE

    Jose, Sanjo

    2007-01-01

    Open source digital library packages are gaining popularity nowadays. To build a digital library under economical conditions, open source software is preferable. This paper tries to identify the extent of adoption of open source digital library software packages in various organizations through an online survey. It lays down the findings from the survey.

  8. Evaluation of three state-of-the-art metabolite prediction software packages (Meteor, MetaSite, and StarDrop) through independent and synergistic use.

    Science.gov (United States)

    T'jollyn, H; Boussery, K; Mortishire-Smith, R J; Coe, K; De Boeck, B; Van Bocxlaer, J F; Mannens, G

    2011-11-01

    The aim of this study was to evaluate three different metabolite prediction software packages (Meteor, MetaSite, and StarDrop) with respect to their ability to predict loci of metabolism and suggest relative proportions of metabolites. A chemically diverse test set of 22 compounds, for which in vivo human mass balance studies and metabolic schemes were available, was used as basis for the evaluation. Each software package was provided with structures of the parent compounds, and predicted metabolites were compared with experimentally determined human metabolites. The evaluation consisted of two parts. First, different settings within each software package were investigated and the software was evaluated using those settings determined to give the best prediction. Second, the three different packages were combined using the optimized settings to see whether a synergistic effect concerning the overall metabolism prediction could be established. The performance of the software was scored for both sensitivity and precision, taking into account the capabilities/limitations of the particular software. Varying results were obtained for the individual packages. Meteor showed a general tendency toward overprediction, and this led to a relatively low precision (∼35%) but high sensitivity (∼70%). MetaSite and StarDrop both exhibited a sensitivity and precision of ∼50%. By combining predictions obtained with the different packages, we found that increased precision can be obtained. We conclude that the state-of-the-art individual metabolite prediction software has many advantageous features but needs refinement to obtain acceptable prediction profiles. Synergistic use of different software packages could prove useful.
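    The sensitivity and precision scoring described here can be made concrete with a small sketch. The metabolite sets below are hypothetical stand-ins for each package's predictions for one parent compound, and the "consensus" rule (keep metabolites predicted by at least two packages) is only one plausible way the synergistic combination could be realized, not the authors' exact procedure.

        # Hypothetical metabolite identifier sets; "observed" is the experimentally confirmed set.
        def sensitivity_precision(predicted, observed):
            true_pos = len(predicted & observed)
            sensitivity = true_pos / len(observed) if observed else 0.0
            precision = true_pos / len(predicted) if predicted else 0.0
            return sensitivity, precision

        observed = {"M1", "M2", "M3", "M4"}
        meteor   = {"M1", "M2", "M3", "M5", "M6", "M7"}   # over-predicting package
        metasite = {"M1", "M2", "M8"}
        stardrop = {"M2", "M3", "M9"}

        for name, pred in [("Meteor", meteor), ("MetaSite", metasite), ("StarDrop", stardrop)]:
            print(name, sensitivity_precision(pred, observed))

        # One plausible "synergistic" combination: keep metabolites predicted by at least two packages.
        consensus = {m for m in meteor | metasite | stardrop
                     if sum(m in s for s in (meteor, metasite, stardrop)) >= 2}
        print("Consensus", sensitivity_precision(consensus, observed))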

  9. Human-machine interface software package

    International Nuclear Information System (INIS)

    Liu, D.K.; Zhang, C.Z.

    1992-01-01

    The Man-Machine Interface Software Package (MMISP) is designed to configure the console software of the PLS 60 MeV LINAC control system. The control system of the PLS 60 MeV LINAC is a distributed control system which includes the main computer (Intel 310), four local stations, and two sets of industrial-level console computers. The MMISP provides the operator with a display page editor, various I/O configurations such as digital signal in/out, analog signal in/out, and waveform TV graphic display, and interacts with the operator through graphic picture displays, voice explanation, and a touch panel. This paper describes its function and application. (author)

  10. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91OO8. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
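    The investment-risk (gambler's ruin) analysis mentioned in the tool list can be illustrated with a simple Monte Carlo sketch. The drilling-program numbers below are hypothetical, and the model is far simpler than the one in the DOE software.

        import random

        def ruin_probability(capital, n_wells, p_success, well_cost, success_value, trials=100_000):
            """Monte Carlo estimate of the probability of exhausting the exploration budget
            before completing a fixed drilling program (a gambler's-ruin style analysis)."""
            ruined = 0
            for _ in range(trials):
                money = capital
                for _ in range(n_wells):
                    money -= well_cost
                    if random.random() < p_success:
                        money += success_value
                    if money < 0:
                        ruined += 1
                        break
            return ruined / trials

        # Hypothetical numbers: $10M budget, 20 wells at $1M each,
        # 25% chance each well returns $5M.
        print(ruin_probability(10.0, 20, 0.25, 1.0, 5.0))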

  11. Radioactive material packaging performance testing

    International Nuclear Information System (INIS)

    Romano, T.; Cruse, J.M.

    1991-02-01

    To provide uniform packaging of hazardous materials on an international level, the United Nations has developed packaging recommendations that have been implemented worldwide. The United Nations packaging recommendations are performance oriented, allowing for a wide variety of package materials and systems. As a result of this international standard, efforts in the United States are being directed toward use of performance-oriented packaging and elimination of specification (designed) packaging. This presentation will focus on trends, design evaluation, and performance testing of radioactive material packaging. The impacts of US Department of Transportation Dockets HM-181 and HM-169A on specification and low-specific activity radioactive material packaging requirements are briefly discussed. The US Department of Energy's program for evaluating radioactive material packagings per US Department of Transportation Specification 7A Type A requirements is used as the basis for discussing low-activity packaging performance test requirements. High-activity package testing requirements are presented with examples of testing performed at the Hanford Site, which is operated by Westinghouse Hanford Company for the US Department of Energy. 5 refs., 2 tabs

  12. Software Package STATISTICA and Educational Process

    Directory of Open Access Journals (Sweden)

    Demidova Liliya

    2016-01-01

    The paper describes the main aspects of the application of the software package STATISTICA in the educational process. Data mining technologies that can be useful for student research have been considered. The main tools of these technologies have been discussed.

  13. ACEMAN (II): a PDP-11 software package for acoustic emission analysis

    International Nuclear Information System (INIS)

    Tobias, A.

    1976-01-01

    A powerful, but easy-to-use, software package (ACEMAN) for acoustic emission analysis has been developed at Berkeley Nuclear Laboratories. The system is based on a PDP-11 minicomputer with 24 K of memory, an RK05 DISK Drive and a Tektronix 4010 Graphics terminal. The operation of the system is described in detail in terms of the functions performed in response to the various command mnemonics. The ACEMAN software package offers many useful facilities not found on other acoustic emission monitoring systems. Its main features, many of which are unique, are summarised. The ACEMAN system automatically handles arrays of up to 12 sensors in real-time operation during which data are acquired, analysed, stored on the computer disk for future analysis and displayed on the terminal if required. (author)

  14. International Inventory of Software Packages in the Information Field.

    Science.gov (United States)

    Keren, Carl, Ed.; Sered, Irina, Ed.

    Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…

  15. An Ada Linear-Algebra Software Package Modeled After HAL/S

    Science.gov (United States)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.

  16. Western aeronautical test range real-time graphics software package MAGIC

    Science.gov (United States)

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers-scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  17. Implementation of a high performance parallel finite element micromagnetics package

    International Nuclear Information System (INIS)

    Scholz, W.; Suess, D.; Dittrich, R.; Schrefl, T.; Tsiantos, V.; Forster, H.; Fidler, J.

    2004-01-01

    A new high performance scalable parallel finite element micromagnetics package has been implemented. It includes solvers for static energy minimization, time integration of the Landau-Lifshitz-Gilbert equation, and the nudged elastic band method.
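    A minimal sketch of the time integration mentioned above, assuming the explicit (Landau-Lifshitz) form of the Landau-Lifshitz-Gilbert equation and a fixed effective field. A real micromagnetics package would recompute the effective field from exchange, anisotropy and magnetostatic contributions at every step, and the finite element discretization is omitted entirely here.

        import numpy as np

        def llg_step(m, h_eff, dt, gamma=2.21e5, alpha=0.1):
            """One explicit Euler step of the Landau-Lifshitz-Gilbert equation
            in its explicit (Landau-Lifshitz) form; m is the unit magnetization."""
            prefactor = -gamma / (1.0 + alpha**2)
            mxh = np.cross(m, h_eff)
            dmdt = prefactor * (mxh + alpha * np.cross(m, mxh))
            m_new = m + dt * dmdt
            return m_new / np.linalg.norm(m_new)   # renormalize so that |m| = 1

        # Example: magnetization precessing and relaxing toward a field along z
        m = np.array([1.0, 0.0, 0.0])
        h = np.array([0.0, 0.0, 8.0e5])            # A/m, hypothetical effective field
        for _ in range(1000):
            m = llg_step(m, h, dt=1e-13)
        print(m)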

  18. Software package for analysis of completely randomized block design

    African Journals Online (AJOL)

    This study designs and develops a statistical software package, OYSP 1.0, which conveniently accommodates and analyzes the large mass of data emanating from experimental designs, in particular the completely randomized block design. Visual Basic programming is used in the design. The statistical package OYSP 1.0 ...

  19. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high resolution AFM imaging. A high yield of DNA dolphins was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license. The open architecture of the editor makes it easy for the scientific community to contribute new tools and functionalities. Documentation, tutorials and software will be made available online.

  20. LipiDex: An Integrated Software Package for High-Confidence Lipid Identification.

    Science.gov (United States)

    Hutchins, Paul D; Russell, Jason D; Coon, Joshua J

    2018-04-17

    State-of-the-art proteomics software routinely quantifies thousands of peptides per experiment with minimal need for manual validation or processing of data. For the emerging field of discovery lipidomics via liquid chromatography-tandem mass spectrometry (LC-MS/MS), comparably mature informatics tools do not exist. Here, we introduce LipiDex, a freely available software suite that unifies and automates all stages of lipid identification, reducing hands-on processing time from hours to minutes for even the most expansive datasets. LipiDex utilizes flexible in silico fragmentation templates and lipid-optimized MS/MS spectral matching routines to confidently identify and track hundreds of lipid species and unknown compounds from diverse sample matrices. Unique spectral and chromatographic peak purity algorithms accurately quantify co-isolation and co-elution of isobaric lipids, generating identifications that match the structural resolution afforded by the LC-MS/MS experiment. During final data filtering, ionization artifacts are removed to significantly reduce dataset redundancy. LipiDex interfaces with several LC-MS/MS software packages, enabling robust lipid identification to be readily incorporated into pre-existing data workflows. Copyright © 2018 Elsevier Inc. All rights reserved.
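    As a hedged illustration of what MS/MS spectral matching against an in-silico fragment template involves (a generic sketch, not LipiDex's actual scoring routine), the snippet below computes a simple dot-product similarity with an m/z tolerance; the template and spectrum values are made up.

        import numpy as np

        def dot_product_score(spectrum, template, tol=0.01):
            """Cosine (dot-product) similarity between a measured MS/MS spectrum and an
            in-silico fragment template; peaks are matched within an m/z tolerance."""
            meas, theo = [], []
            for mz_t, int_t in template:
                hits = [inten for mz, inten in spectrum if abs(mz - mz_t) <= tol]
                theo.append(int_t)
                meas.append(max(hits) if hits else 0.0)
            a, b = np.array(meas), np.array(theo)
            if not a.any():
                return 0.0
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        # Hypothetical phosphatidylcholine fragment template vs. a measured spectrum (m/z, intensity)
        template = [(184.073, 100.0), (478.329, 20.0), (496.340, 15.0)]
        measured = [(184.074, 95.0), (478.330, 18.0), (520.500, 40.0)]
        print(dot_product_score(measured, template))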

  1. Waste package performance assessment

    International Nuclear Information System (INIS)

    Lester, D.H.

    1981-01-01

    This paper describes work undertaken to assess the life-expectancy and post-failure nuclide release behavior of high-level waste packages in a geologic repository. The work involved integrating models of individual phenomena (such as heat transfer, corrosion, package deformation, and nuclide transport) and using existing data to make estimates of post-emplacement behavior of waste packages. A package performance assessment code was developed to predict time to package failure in a flooded repository and subsequent transport of nuclides out of the leaking package. The model has been used to evaluate preliminary package designs. The results indicate that, within the limitations of the model assumptions and database, packages lasting a few hundred years could be developed. Very long-lived packages may be possible, but more comprehensive data are needed to confirm this.

  2. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    Science.gov (United States)

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016-The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
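    The agreement statistics reported in this validation study (Pearson correlation plus Bland-Altman bias and limits of agreement) can be computed in a few lines of Python; the velocity values below are hypothetical, not the study's data.

        import numpy as np

        def agreement_stats(video_mpv, transducer_mpv):
            """Pearson correlation plus Bland-Altman bias and 95% limits of agreement
            between two velocity measurement methods."""
            x, y = np.asarray(video_mpv), np.asarray(transducer_mpv)
            r = np.corrcoef(x, y)[0, 1]
            diff = x - y
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return r, bias, (bias - half_width, bias + half_width)

        # Hypothetical MPV values (m/s) for the same repetitions measured by both systems
        video      = [0.55, 0.62, 0.48, 0.71, 0.66, 0.59]
        transducer = [0.53, 0.60, 0.47, 0.69, 0.63, 0.58]
        r, bias, limits = agreement_stats(video, transducer)
        print(f"r = {r:.3f}, bias = {bias:.3f} m/s, 95% LoA = {limits}")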

  3. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  4. VIPEX (Vital-area Identification Package EXpert) Software Verification and Validation

    International Nuclear Information System (INIS)

    Jung, Woo Sik; Suh, Jae Seung

    2010-06-01

    The purposes of this report are (1) to perform a Verification and Validation (V and V) test for the VIPEX (Vital-area Identification Package EXpert) software and (2) to improve software quality through the V and V test. The VIPEX was developed at the Korea Atomic Energy Research Institute (KAERI) for the Vital Area Identification (VAI) of nuclear power plants. The version of the VIPEX which was distributed is 3.2.0.0. The VIPEX was revised based on the first V and V test, and the second V and V test was performed. We have performed the following tasks for the V and V test on the Windows XP and VISTA operating systems: (1) testing basic functions, including fault tree editing; (2) testing all kinds of functions; and (3) research on updating from Visual BASIC 6.0 to Visual BASIC 2008.

  5. SPADE - software package to aid diffraction experiments

    International Nuclear Information System (INIS)

    Farren, J.; Giltrap, J.W.

    1978-10-01

    A software package is described which enables the DEC PDP-11/03 microcomputer to execute several different X-ray diffraction experiments and other similar experiments where stepper motors are driven and data is gathered and processed in real time. (author)

  6. Software package evaluation for the TJ-II Data Acquisition System

    International Nuclear Information System (INIS)

    Cremy, C.; Sanchez, E.; Portas, A.; Vega, J.

    1996-01-01

    The TJ-II Data Acquisition System (DAS) has to provide a user interface which will allow setup of sampling channels, discharge signal visualization and reduced data processing, all in run time. On the other hand, the DAS will provide a high level software capability for signal analysis, processing and data visualization either in run time or off line. A set of software packages including Builder Xcessory, X-designer, Ilog Builder, Toolmaster, AVS 5, AVS/Express, PV-WAVE and Iris Explorer have been evaluated by the Data Acquisition Group of the Fusion Division. The software evaluation, summarized in this paper, has resulted in a global solution being found which meets all of the DAS requirements. (Author)

  7. IDES: Interactive Data Entry System: a generalized data acquisition software package

    International Nuclear Information System (INIS)

    Gasser, S.B.

    1980-04-01

    The Interactive Data Entry System (IDES) is a software package which greatly assists in designing and storing forms to be used for the directed acquisition of data. The objective of this package is to provide a viable man/machine interface to any comprehensive data base. This report provides a technical description of the software and can be used as a user's manual.

  8. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    Science.gov (United States)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  9. Evaluation of Different Software Packages in Flow Modeling under Bridge Structures

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Dastorani

    2007-01-01

    This study is an independent, comparative investigation of the accuracy, capability and suitability of three well-known packages, ISIS, MIKE11 and HEC-RAS, as hydraulic river modeling software for modeling the flow through bridges. The research project was designed to assess the ability of each software package to model the flow through bridge structures. It was carried out using data taken from experiments completed with a 22-meter laboratory flume at the University of Birmingham. The flume has a compound cross section containing a main channel and two flood plains on either side. For this study a smooth main channel and a smooth floodplain have been assumed. Two types of bridges are modeled in this research: a multiple-opening semi-circular arch bridge and a single-opening straight deck bridge. For each bridge, two different simulations were carried out using two different upstream boundaries, as low flow and high flow simulations. According to the results, all three packages were able to model arch and US BPR bridges, but in some cases they presented different results. The highest water elevation upstream of the bridge (maximum afflux) was the main parameter compared to the measured values. ISIS and HEC-RAS (especially HEC-RAS) seem to be more efficient for modeling the arch bridge. However, in some cases, MIKE 11 produced considerably higher results than those of the other two packages. To model the US BPR bridge, all three packages produced reasonable results. However, the results from HEC-RAS are the best when the outputs are compared to the experimental data.

  10. ANALYSIS OF CELLULAR REACTION TO IFN-γ STIMULATION BY A SOFTWARE PACKAGE GeneExpressionAnalyser

    Directory of Open Access Journals (Sweden)

    A. V. Saetchnikov

    2014-01-01

    The software package GeneExpressionAnalyser for the analysis of DNA microarray experimental data has been developed. The algorithms for data analysis, detection of differentially expressed genes and annotation of the biological functions of the cell are described. The efficiency of the developed package is tested on published experimental data devoted to time-course research of the changes in human melanoma cells under the influence of IFN-γ. The developed software has a number of advantages over existing software: it is free, has a simple and intuitive graphical interface, allows the analysis of different types of DNA microarrays, contains a set of methods for complete data analysis and performs effective gene annotation for a selected list of genes.
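    A minimal sketch of one common way differentially expressed genes are flagged from microarray data (a per-gene t-test combined with a fold-change threshold). This is a generic illustration under assumed thresholds, not necessarily the criterion implemented in GeneExpressionAnalyser.

        import numpy as np
        from scipy import stats

        def differentially_expressed(expr_treated, expr_control, p_cut=0.05, fc_cut=2.0):
            """Flag genes as differentially expressed when they pass both a t-test
            p-value threshold and an absolute fold-change threshold (log2 intensities)."""
            t, p = stats.ttest_ind(expr_treated, expr_control, axis=1)
            log2_fc = expr_treated.mean(axis=1) - expr_control.mean(axis=1)
            return (p < p_cut) & (np.abs(log2_fc) >= np.log2(fc_cut))

        # Hypothetical log2 expression matrix: 5 genes x 3 replicates per condition
        rng = np.random.default_rng(0)
        control = rng.normal(8.0, 0.2, size=(5, 3))
        treated = control.copy()
        treated[0] += 2.0          # gene 0 strongly induced in this toy example
        print(differentially_expressed(treated, control))   # only gene 0 is flagged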

  11. INSPECT: A graphical user interface software package for IDARC-2D

    Science.gov (United States)

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern day Performance-Based Earthquake Engineering (PBEE) pivots about nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in the DOS/Unix systems and requires elaborate text-based input files creation by the user. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  12. Conference on High Performance Software for Nonlinear Optimization

    CERN Document Server

    Murli, Almerico; Pardalos, Panos; Toraldo, Gerardo

    1998-01-01

    This book contains a selection of papers presented at the conference on High Performance Software for Nonlinear Optimization (HPSN097) which was held in Ischia, Italy, in June 1997. The rapid progress of computer technologies, including new parallel architectures, has stimulated a large amount of research devoted to building software environments and defining algorithms able to fully exploit this new computational power. In some sense, numerical analysis has to conform itself to the new tools. The impact of parallel computing in nonlinear optimization, which had a slow start at the beginning, seems now to increase at a fast rate, and it is reasonable to expect an even greater acceleration in the future. As with the first HPSNO conference, the goal of the HPSN097 conference was to supply a broad overview of the more recent developments and trends in nonlinear optimization, emphasizing the algorithmic and high performance software aspects. Bringing together new computational methodologies with theoretical...

  13. Comparison of four software packages applied to a scattering problem

    DEFF Research Database (Denmark)

    Albertsen, Niels Christian; Chesneaux, Jean-Marie; Christiansen, Søren

    1999-01-01

    We investigate characteristic features of four different software packages by applying them to the numerical solution of a non-trivial physical problem in computer simulation, viz., scattering of waves from a sinusoidal boundary. The numerical method used is based on boundary collocation. ...

  14. Radioactive material packaging performance testing

    International Nuclear Information System (INIS)

    Romano, T.

    1992-06-01

    In an effort to provide uniform packaging of hazardous material on an international level, recommendations for the transport of dangerous goods have been developed by the United Nations. These recommendations are performance oriented and contrast with a large number of packaging specifications in the US Department of Transportation's hazardous materials regulations. This dual system presents problems when international shipments enter the US Department of Transportation's system. Faced with the question of continuing a dual system or aligning with the international system, the Research and Special Programs Administration of the US Department of Transportation responded with Docket HM-181. This began the transition toward the international transportation system. Following close behind is Docket HM-169A, which addressed low specific activity radioactive material packaging. This paper will discuss the differences between performance-oriented and specification packaging, the transition toward performance-oriented packaging by the US Department of Transportation, and performance-oriented testing of radioactive material packaging by Westinghouse Hanford Company. Dockets HM-181 and HM-169A will be discussed along with Type A (low activity) and Type B (high activity) radioactive material packaging evaluations.

  15. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    Science.gov (United States)

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, amongst other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.

  16. High-Level Synthesis: Productivity, Performance, and Software Constraints

    Directory of Open Access Journals (Sweden)

    Yun Liang

    2012-01-01

    FPGAs are an attractive platform for applications with high computation demand and low energy consumption requirements. However, design effort for FPGA implementations remains high—often an order of magnitude larger than design effort using high-level languages. Instead of this time-consuming process, high-level synthesis (HLS) tools generate hardware implementations from algorithm descriptions in languages such as C/C++ and SystemC. Such tools reduce design effort: high-level descriptions are more compact and less error prone. HLS tools promise hardware development abstracted from software designer knowledge of the implementation platform. In this paper, we present an unbiased study of the performance, usability and productivity of HLS using AutoPilot (a state-of-the-art HLS tool). In particular, we first evaluate AutoPilot using the popular embedded benchmark kernels. Then, to evaluate the suitability of HLS on real-world applications, we perform a case study of stereo matching, an active area of computer vision research that uses techniques also common for image denoising, image retrieval, feature matching, and face recognition. Based on our study, we provide insights on current limitations of mapping general-purpose software to hardware using HLS and some future directions for HLS tool development. We also offer several guidelines for hardware-friendly software design. For popular embedded benchmark kernels, the designs produced by HLS achieve 4X to 126X speedup over the software version. The stereo matching algorithms achieve between 3.5X and 67.9X speedup over software (but still less than manual RTL design) with a fivefold reduction in design effort versus manual RTL design.

  17. InterFace: A software package for face image warping, averaging, and principal components analysis.

    Science.gov (United States)

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
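    The PCA "face space" described here can be sketched in a few lines: flattened, aligned face images are centred on the mean face and decomposed with an SVD, giving eigenfaces and per-face coordinates. This is a generic illustration with random stand-in data, not InterFace's implementation.

        import numpy as np

        def face_space(images, n_components=10):
            """Build a PCA face space from an (n_faces, n_pixels) array whose rows are
            flattened, aligned face images."""
            mean_face = images.mean(axis=0)
            centered = images - mean_face
            # SVD of the centered data gives the principal components directly
            u, s, vt = np.linalg.svd(centered, full_matrices=False)
            components = vt[:n_components]            # eigenfaces
            coords = centered @ components.T           # each face's position in face space
            return mean_face, components, coords

        # Hypothetical data: 50 faces of 100x100 pixels, flattened
        rng = np.random.default_rng(1)
        faces = rng.random((50, 100 * 100))
        mean_face, eigenfaces, coords = face_space(faces)
        print(eigenfaces.shape, coords.shape)   # (10, 10000) (50, 10)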

  18. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  19. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    Science.gov (United States)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil producing companies are concerned with increasing the resolution of seismic data for complex oil-and-gas bearing deposits connected with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature these can be salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large-size matrices (hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of the TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, allowing us to significantly improve the performance of the TWSM software package, which is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks, such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, and in some inverse tasks, such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.

  20. Consys Linear Control System Design Software Package

    International Nuclear Information System (INIS)

    Diamantidis, Z.

    1987-01-01

    This package is created in order to help engineers, researchers, students and all who work on linear control systems. The software includes all time and frequency domain analyses, spectral analyses, and design aids for networks, active filters and regulators. The programmes are written on a Hewlett-Packard computer in BASIC 4.0.

  1. Decal Electronics: Printable Packaged with 3D Printing High-Performance Flexible CMOS Electronic Systems

    KAUST Repository

    Sevilla, Galo T.; Cordero, Marlon D.; Nassar, Joanna M.; Hanna, Amir; Kutbee, Arwa T.; Carreno, Armando Arpys Arevalo; Hussain, Muhammad Mustafa

    2016-01-01

    High-performance complementary metal oxide semiconductor electronics are flexed, packaged using 3D printing as decal electronics, and then printed in roll-to-roll fashion for highly manufacturable printed flexible high-performance electronic systems.

  2. Decal Electronics: Printable Packaged with 3D Printing High-Performance Flexible CMOS Electronic Systems

    KAUST Repository

    Sevilla, Galo T.

    2016-10-14

    High-performance complementary metal oxide semiconductor electronics are flexed, packaged using 3D printing as decal electronics, and then printed in roll-to-roll fashion for highly manufacturable printed flexible high-performance electronic systems.

  3. In-field inspection support software: A status report on the Common Inspection On-site Software Package (CIOSP) project

    International Nuclear Information System (INIS)

    Novatchev, Dimitre; Titov, Pavel; Siradjov, Bakhtiiar; Vlad, Ioan; Xiao Jing

    2001-01-01

    IAEA has invested much thought and effort into developing software that can assist inspectors during their inspection work. Experience with such applications has been steadily growing and IAEA has recently commissioned a next-generation software package. Such software must accommodate inspection tasks that can vary substantially in function depending on the type of installation being inspected, while ensuring that the resulting software package has a wide range of usability and precludes excessive development of plant-specific applications. The Common Inspection On-site Software Package is being developed in the Department of Safeguards to address the limitations of the existing software and to expand its coverage of the inspection process. CIOSP is 'common' in that it is aimed at providing support for as many facilities as possible with the minimum re-configuration. At the same time it has to cater to the varying needs of individual facilities and the different instrumentation and verification methods used. A component-based approach was taken to successfully tackle the challenges that the development of this software presented. CIOSP consists of the following major components: a framework into which individual plug-ins supporting various inspection activities can integrate at run-time; a central data store containing all facility configuration data and all data collected during inspections; a local data store, which resides on the inspector's computer, where the current inspection's data is stored; and a set of services used by all plug-ins (i.e. data transformation, authentication, replication services etc.). This architecture allows for incremental development and extension of the software with plug-ins that support individual inspection activities. The core set of components, along with the framework and the Inventory Verification, Book Examination and Records and Reports Comparison plug-ins, has been developed. The development of the Short Notice Random

  4. Distributed control software of high-performance control-loop algorithm

    CERN Document Server

    Blanc, D

    1999-01-01

    The majority of industrial cooling and ventilation plants require the control of complex processes. All these processes are highly important for the operation of the machines. The stability and reliability of these processes are leading factors identifying the quality of the service provided. The control system architecture and software structure, as well, are required to have high dynamical performance and robust behaviour. The intelligent systems based on PID or RST controllers are used for their high level of stability and accuracy. The design and tuning of these complex controllers require the dynamic model of the plant to be known (generally obtained by identification) and the desired performance of the various control loops to be specified for achieving good performances. The concept of having a distributed control algorithm software provides full automation facilities with well-adapted functionality and good performances, giving methodology, means and tools to master the dynamic process optimization an...
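    As a rough sketch of the PID control loops this record refers to (a generic illustration, not the plant's actual implementation, with made-up gains and toy plant dynamics), a discrete positional-form PID controller can be written as follows.

        class PID:
            """Minimal discrete PID controller in positional form."""
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, setpoint, measurement):
                error = setpoint - measurement
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        # Hypothetical cooling-loop simulation: drive a toy first-order plant toward 18 degrees C.
        pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
        temp = 25.0
        for _ in range(60):
            u = pid.update(18.0, temp)                   # negative u means cooling in this toy model
            temp += -0.1 * (temp - 25.0) + 0.05 * u      # ambient pull toward 25 C plus actuator effect
        print(round(temp, 1))                            # approaches the 18 C setpoint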

  5. Development of 'Enhance reconstruction package' software for whole-body PET

    International Nuclear Information System (INIS)

    Mizuta, Tetsuro; Imanishi, Tatsuru; Ishikawa, Akihiro

    2011-01-01

    We have developed 'Enhance Reconstruction Package' software for our whole-body positron emission tomography (PET) Eminence series. This package improves image quality and streamlines the workflow in clinical PET and PET/CT studies. The present paper outlines the applications for data collection, normalization, etc., and also reports some PET images obtained with the software. The signal-to-noise ratio was optimized in the phantom study, leading to improved image quality. The real-time display tool and the remote control tool contribute to enhanced operability in the routine workflow. (author)

  6. Three dimensional field computation software package DE3D and its applications

    International Nuclear Information System (INIS)

    Fan Mingwu; Zhang Tianjue; Yan Weili

    1992-07-01

    A software package, DE3D, that can be run on a PC for three-dimensional electrostatic and magnetostatic field analysis has been developed at CIAE (China Institute of Atomic Energy). A two-scalar-potential method and special numerical techniques give the code high precision. It can be used for electrostatic and magnetostatic field computations with complex boundary conditions. In most cases, the accuracy of the results is better than 1% compared with measured values. In some situations, the results are more acceptable than those of other codes because special techniques are used for the current integral. Typical examples, the design of a cyclotron magnet and of magnetic elements on its beam transport line, given in the paper show how the program helps the designer to improve the design of the product. The software package could bring advantages to producers and designers.

  7. Data Packages for the Hanford Immobilized Low-Activity Tank Waste Performance Assessment: 2001 Version

    International Nuclear Information System (INIS)

    MANN, F.M.

    2000-01-01

    Data packages supporting the 2001 Immobilized Low-Activity Waste Performance Assessment. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigations are provided. Verification and benchmarking packages for selected software codes are also provided.

  8. A Characteristics Approach to the Evaluation of Economics Software Packages.

    Science.gov (United States)

    Lumsden, Keith; Scott, Alex

    1988-01-01

    Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluating technique by appraising the much used software package "Running the British Economy." (KO)

  9. Maximize Your Investment 10 Key Strategies for Effective Packaged Software Implementations

    CERN Document Server

    Beaubouef, Grady Brett

    2009-01-01

    This is a handbook covering ten principles for packaged software implementations that project managers, business owners, and IT developers should pay attention to. The book also has practical real-world coverage including a sample agenda for conducting business solution modeling, customer case studies, and a road map to implement guiding principles. This book is aimed at enterprise architects, development leads, project managers, business systems analysts, business systems owners, and anyone who wants to implement packaged software effectively. If you are a customer looking to implement COTS s

  10. Development of a software package for solid-angle calculations using the Monte Carlo method

    International Nuclear Information System (INIS)

    Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai

    2014-01-01

    Solid-angle calculations, which are often complicated, play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, in which a new type of variance reduction technique was integrated. The package, developed under the environment of Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface in which the visualization function is integrated in conjunction with OpenGL. One advantage of the proposed software package is that it can calculate the solid angle subtended by a detector with different geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) to a point, circular or cylindrical source without any difficulty. The results obtained from the proposed software package were compared with those obtained from previous studies and calculated using Geant4. It shows that the proposed software package can produce accurate solid-angle values with a greater computation speed than Geant4. -- Highlights: • This software package (SAC) can give accurate solid-angle values. • SAC calculates solid angles using the Monte Carlo method and has a higher computation speed than Geant4. • A simple but effective variance reduction technique which was put forward by the authors has been applied in SAC. • A visualization function and a graphical user interface are also integrated in SAC.
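    A plain Monte Carlo estimate of a solid angle, of the kind the package automates, can be sketched as follows for a point source and a circular detector face. The proposed software additionally uses a variance reduction technique, which is omitted here, and the geometry below is a hypothetical example.

        import numpy as np

        def solid_angle_disk(source, radius, z_plane, n=1_000_000, seed=0):
            """Monte Carlo estimate of the solid angle subtended by a circular detector face
            (radius `radius`, lying in the plane z = z_plane, centred on the z axis)
            at a point source located at `source`."""
            rng = np.random.default_rng(seed)
            u = rng.normal(size=(n, 3))
            u /= np.linalg.norm(u, axis=1, keepdims=True)     # isotropic unit directions
            dz = z_plane - source[2]
            going = u[:, 2] * np.sign(dz) > 0                 # rays heading toward the detector plane
            t = dz / u[going, 2]                              # distance along each ray to the plane
            hit_x = source[0] + t * u[going, 0]
            hit_y = source[1] + t * u[going, 1]
            hits = (hit_x**2 + hit_y**2) <= radius**2
            return 4.0 * np.pi * hits.sum() / n

        # Point source on the axis, 10 cm below a 5 cm radius detector face:
        # analytic value 2*pi*(1 - cos(atan(0.5))) is approximately 0.663 sr
        print(solid_angle_disk(np.array([0.0, 0.0, 0.0]), radius=5.0, z_plane=10.0))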

  11. Global review of open access risk assessment software packages valid for global or continental scale analysis

    Science.gov (United States)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools number in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user

  12. Comparison of four software packages for CT lung volumetry in healthy individuals

    Energy Technology Data Exchange (ETDEWEB)

    Nemec, Stefan F. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Molinari, Francesco [Centre Hospitalier Regional Universitaire de Lille, Department of Radiology, Lille (France); Dufresne, Valerie [CHU de Charleroi - Hopital Vesale, Pneumologie, Montigny-le-Tilleul (Belgium); Gosset, Natacha [CHU Tivoli, Service d' Imagerie Medicale, La Louviere (Belgium); Silva, Mario; Bankier, Alexander A. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States)

    2015-06-01

    To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. (orig.)

  13. Software package for modeling spin-orbit motion in storage rings

    Science.gov (United States)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  14. Hazardous materials package performance regulations

    International Nuclear Information System (INIS)

    Russell, N.A.; Glass, R.E.; McClure, J.D.; Finley, N.C.

    1992-01-01

    The hazardous materials (hazmat) packaging development and certification process is currently defined by two different regulatory philosophies, one based on specification packagings and the other based on performance standards. With specification packagings, a packaging is constructed according to an agreed set of design specifications. In contrast, performance standards do not specify the packaging design; they specify standards that a packaging design must be able to pass before it can be certified for transport. The packaging can be designed according to individual needs as long as it meets these performance standards. Performance standards have been used nationally and internationally for about 40 years to certify radioactive materials (RAM) packagings. It is reasonable to state that for RAM transport, performance standards have maintained transport safety. A committee of United Nations experts recommended the performance standard philosophy as the preferred regulation method for hazmat packaging. Performance standards for hazmat packagings smaller than 118 gallons have been adopted in 49CFR178. Packagings for materials that are classified as toxic-by-inhalation must comply with the performance standards by October 1, 1993, and packagings for all other classes of hazardous materials covered must comply by October 1, 1996. For packages containing bulk (in excess of 188 gallons) quantities of materials that are extremely toxic by inhalation, there currently are no performance requirements. This paper discusses a Hazmat Packaging Performance Evaluation (HPPE) project to look at the subset of bulk packagings that are larger than 2000 gallons. The objectives of this project are to evaluate current hazmat specification packagings and to develop supporting documentation for determining performance requirements for packagings in excess of 2000 gallons that transport hazardous materials classified as extremely toxic by inhalation (METBI).

  15. Description of the IV + V System Software Package.

    Science.gov (United States)

    Microcomputers for Information Management: An International Journal for Library and Information Services, 1984

    1984-01-01

    Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principal program features and functions outlined include input/output, databank, text image, output, and…

  16. SEDA: A software package for the Statistical Earthquake Data Analysis

    Science.gov (United States)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to interact easily with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, a growing movement among scientists that is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
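
    For readers unfamiliar with ETAS, the following minimal Python sketch evaluates the standard temporal ETAS conditional intensity; it is not SEDA code, and the parameter values and toy catalogue are purely illustrative.

```python
import numpy as np

def etas_intensity(t, event_times, event_mags,
                   mu=0.1, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Temporal ETAS conditional intensity lambda(t) given past events (times in days)."""
    past = event_times < t
    ti, mi = event_times[past], event_mags[past]
    aftershock = K * np.exp(alpha * (mi - m0)) * (t - ti + c) ** (-p)
    return mu + aftershock.sum()

# Toy catalogue: a magnitude-6 mainshock at t = 0 days and one aftershock.
times = np.array([0.0, 0.5])
mags = np.array([6.0, 4.2])
for t in (0.1, 1.0, 10.0):
    print(f"lambda({t}) = {etas_intensity(t, times, mags):.3f} events/day")
```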

  17. Software Package for Optics Measurement and Correction in the LHC

    CERN Document Server

    Aiba, M; Tomas, R; Vanbavinckhove, G

    2010-01-01

    A software package has been developed for the LHC on-line optics measurement and correction. This package includes several different algorithms to measure phase advance, beta functions, dispersion, coupling parameters and even some non-linear terms. A Graphical User Interface provides visualization tools to compare measurements to model predictions, fit analytical formula, localize error sources and compute and send corrections to the hardware.
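
    As a hedged illustration of the kind of optics quantities mentioned above, the following Python sketch estimates the fractional betatron tune and the phase advance between two beam position monitors from synthetic turn-by-turn data via an FFT; it is not the LHC package itself, and all numbers are invented.

```python
import numpy as np

# Synthetic turn-by-turn positions at two BPMs for a fractional tune of 0.31
# and a 60-degree phase advance between the BPMs (illustrative numbers only).
n_turns, tune, dphi = 1024, 0.31, np.deg2rad(60.0)
turns = np.arange(n_turns)
bpm1 = np.cos(2 * np.pi * tune * turns)
bpm2 = np.cos(2 * np.pi * tune * turns + dphi)

def tune_and_phase(x):
    """Return (fractional tune, phase at the tune line) from an FFT of turn-by-turn data."""
    spec = np.fft.rfft(x * np.hanning(len(x)))
    k = np.argmax(np.abs(spec[1:])) + 1          # skip the DC bin
    return k / len(x), np.angle(spec[k])

q1, p1 = tune_and_phase(bpm1)
q2, p2 = tune_and_phase(bpm2)
print("measured tune ~", round(q1, 4))
print("phase advance BPM1 -> BPM2 ~", round(np.rad2deg((p2 - p1) % (2 * np.pi)), 1), "deg")
```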

  18. An Assessment of the Library Application Software Packages in ...

    African Journals Online (AJOL)

    ... the study examined the adopted softwares' security, compatibility/capabilities, ... The study found that most application packages available in the Nigerian automation market place are effective since they ...

  19. The experimental modification of a computer software package for ...

    African Journals Online (AJOL)

    The experimental modification of a computer software package for graphing algebraic functions. ... No Abstract Available. South African Journal of Education Vol. 25(2) 2005: 61-68.

  20. A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.

    Science.gov (United States)

    Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.

    The SCIATRAN 3.0 package is a result of the further development of the SCIATRAN 2.x software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. A major improvement over previous software versions was achieved by adding a vector mode to the radiative transfer model. Thus, the well-established discrete-ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. Similar to the scalar version, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, and especially of the new vector radiative transfer model, will be given, including remarks on its availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high-quality results.

  1. Software package for modeling spin–orbit motion in storage rings

    Energy Technology Data Exchange (ETDEWEB)

    Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de [St. Petersburg State University (Russian Federation)

    2015-12-15

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10⁶–10⁹ particles in a beam during 10⁹ turns in an accelerator (about 10¹²–10¹⁵ integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.
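
    A minimal sketch of why so many particles and turns must be tracked is the gradual decoherence of in-plane spin phases. The Python toy model below assigns each particle a slightly different spin tune and follows the horizontal polarization over turns; the numbers are illustrative and unrelated to the actual JEDI or COSY Infinity parameters.

```python
import numpy as np

# Toy in-plane spin-phase decoherence: each particle's spin precesses by
# 2*pi*nu_s per turn, with a tiny particle-to-particle spread in spin tune.
rng = np.random.default_rng(1)
n_particles, n_turns = 10_000, 100_000
nu_s = 0.16 + rng.normal(0.0, 1e-6, n_particles)   # illustrative spin tunes

phases = np.zeros(n_particles)
for turn in range(1, n_turns + 1):
    phases += 2 * np.pi * nu_s
    if turn % 20_000 == 0:
        # Horizontal polarization = magnitude of the average in-plane spin vector.
        pol = np.hypot(np.cos(phases).mean(), np.sin(phases).mean())
        print(f"turn {turn:>7d}: polarization = {pol:.4f}")
```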

  2. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  3. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  4. High-performance packaging for monolithic microwave and millimeter-wave integrated circuits

    Science.gov (United States)

    Shalkhauser, K. A.; Li, K.; Shih, Y. C.

    1992-01-01

    Packaging schemes are developed that provide low-loss, hermetic enclosure for enhanced monolithic microwave and millimeter-wave integrated circuits. These package schemes are based on a fused quartz substrate material offering improved RF performance through 44 GHz. The small size and weight of the packages make them useful for a number of applications, including phased array antenna systems. As part of the packaging effort, a test fixture was developed to interface the single chip packages to conventional laboratory instrumentation for characterization of the packaged devices.

  5. SeDA: A software package for the statistical analysis of the instrument drift

    International Nuclear Information System (INIS)

    Lee, H. J.; Jang, S. C.; Lim, T. J.

    2006-01-01

    The setpoints for safety-related equipment are affected by many sources of uncertainty. ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2] suggested statistical approaches for ensuring that safety-related instrument setpoints are established and maintained within the technical specification limits [3]. However, Jang et al. [4] indicated that the preceding methodologies for setpoint drift analysis might be insufficient to manage setpoint drift on an instrumentation device and proposed new statistical analysis procedures for the management of setpoint drift, based on plant-specific as-found/as-left data. Although IHPA (Instrument History Performance Analysis) is a widely known commercial software package for analyzing instrument setpoint drift, several steps in the new procedure cannot be performed with it because it is based on the statistical approaches suggested in ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2]. In this paper we present a software package (SeDA: Setpoint Drift Analysis) that implements the new methodologies and is easy to use, as it is accompanied by powerful graphical tools. (authors)

  6. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  7. Scilab software package for the study of dynamical systems

    Science.gov (United States)

    Bordeianu, C. C.; Beşliu, C.; Jipa, Al.; Felea, D.; Grossu, I. V.

    2008-05-01

    This work presents a new software package for the study of chaotic flows and maps. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behavior of the nonlinear dynamical systems was analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability for users to insert their own ODEs. Program summary - Program title: Chaos. Catalogue identifier: AEAP_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 885. No. of bytes in distributed program, including test data, etc.: 5925. Distribution format: tar.gz. Programming language: Scilab 3.1.1. Computer: PC-compatible running Scilab on MS Windows or Linux. Operating system: Windows XP, Linux. RAM: below 100 Megabytes. Classification: 6.2. Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE). Solution method: Numerical solving of ordinary differential equations. The chaotic behavior of the nonlinear dynamical system is analyzed using Poincaré sections, phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropies. Restrictions: The package routines are normally able to handle ODE systems of high orders (up to order twelve and possibly higher), depending on the nature of the problem. Running time: 10 to 20 seconds for problems that do not
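
    As a minimal example of one of the diagnostics listed above, the following Python sketch (not part of the Scilab package) estimates the largest Lyapunov exponent of the logistic map from the average of log|f'(x)| along an orbit.

```python
import math

def lyapunov_logistic(r, x0=0.2, n_transient=1000, n_iter=100_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(n_transient):                    # discard the transient
        x = r * x * (1.0 - x)
    s = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        s += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)   # log |f'(x)|, guarded against log(0)
    return s / n_iter

print(lyapunov_logistic(3.5))   # periodic regime: negative exponent
print(lyapunov_logistic(4.0))   # chaotic regime: approximately ln(2) ~ 0.693
```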

  8. GMATA: An Integrated Software Package for Genome-Scale SSR Mining, Marker Development and Viewing.

    Science.gov (United States)

    Wang, Xuewen; Wang, Le

    2016-01-01

    Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNAs that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed a novel software package, the Genome-wide Microsatellite Analyzing Tool Package (GMATA), which integrates SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability, and enables SSR markers to be displayed simultaneously with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allows it to perform calculations faster and provide more accurate results than existing tools. Our package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requires only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrate the application of GMATA to plant genomes and reveal a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the dimer GA/TC, the A/T monomer and the GCG/CGC trimer, rather than motifs reflecting the rich G/C content of the DNA sequence. We also show that SSR count is linear with respect to chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar.
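
    The core SSR-mining step can be illustrated with a minimal regex-based sketch in Python; this is not GMATA's algorithm, and the motif-length and repeat-count thresholds below are arbitrary choices for the example.

```python
import re

def find_ssrs(seq, min_motif=1, max_motif=3, min_repeats=5):
    """Locate simple sequence repeats (motif length 1-3 nt, >= min_repeats copies)."""
    seq = seq.upper()
    hits = []
    for k in range(min_motif, max_motif + 1):
        # ([ACGT]{k}) captures a motif; \1{n,} requires at least min_repeats-1 further copies.
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (k, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(1), len(m.group(0)) // k))
    return hits

demo = "TTTTTGAGAGAGAGAGACCGCGCGCGCGCGATATATATAT"
for start, motif, copies in find_ssrs(demo):
    print(f"pos {start}: ({motif})x{copies}")
```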

  9. Quantitative comparison and evaluation of two commercially available, two-dimensional electrophoresis image analysis software packages, Z3 and Melanie.

    Science.gov (United States)

    Raman, Babu; Cheung, Agnes; Marten, Mark R

    2002-07-01

    While a variety of software packages are available for analyzing two-dimensional electrophoresis (2-DE) gel images, no comparisons between these packages have been published, making it difficult for end users to determine which package would best meet their needs. The goal here was to develop a set of tests to quantitatively evaluate and then compare two software packages, Melanie 3.0 and Z3, in three of the fundamental steps involved in 2-DE image analysis: (i) spot detection, (ii) gel matching, and (iii) spot quantitation. To test spot detection capability, automatically detected protein spots were compared to manually counted, "real" protein spots. Spot matching efficiency was determined by comparing distorted (both geometrically and nongeometrically) gel images with undistorted original images, and quantitation tests were performed on artificial gels with spots of varying Gaussian volumes. In spot detection tests, Z3 performed better than Melanie 3.0 and required minimal user intervention to detect approximately 89% of the actual protein spots and relatively few extraneous spots. Results from gel matching tests depended on the type of image distortion used. For geometric distortions, Z3 performed better than Melanie 3.0, matching 99% of the spots, even for extreme distortions. For nongeometrical distortions, both Z3 and Melanie 3.0 required user intervention and performed comparably, matching 95% of the spots. In spot quantitation tests, both Z3 and Melanie 3.0 predicted spot volumes relatively well for spot ratios less than 1:6. For higher ratios, Melanie 3.0 did much better. In summary, results suggest Z3 requires less user intervention than Melanie 3.0, thus simplifying differential comparison of 2-DE gel images. Melanie 3.0, however, offers many more optional tools for image editing, spot detection, data reporting and statistical analysis than Z3. All image files used for these tests and updated information on the software are available on the internet

  10. Development of an engine system simulation software package - ESIM

    Energy Technology Data Exchange (ETDEWEB)

    Erlandsson, Olof

    2000-10-01

    A software package, ESIM, is developed for simulating internal combustion engine systems, including models for the engine, manifolds, turbocharger, charge-air cooler (intercooler) and inlet air heater. This study focuses on the thermodynamic treatment and methods used in the models. It also includes some examples of system simulations made with these models for validation purposes. The engine model can be classified as a zero-dimensional, single-zone model. It includes calculation of the valve flow process, models for heat release and models for in-cylinder, exhaust port and manifold heat transfer. Models are developed for handling turbocharger performance and charge-air cooler characteristics. The main purpose of the project related to this work is to use the ESIM software to study the heat balance and performance of homogeneous charge compression ignition (HCCI) engine systems. A short description of the HCCI engine is therefore included, pointing out the difficulties, or challenges, regarding the HCCI engine from a system perspective. However, the relations given here, and the code itself, are quite general, making it possible to use these models to simulate spark-ignited as well as direct-injected engines.
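
    A common building block of zero-dimensional, single-zone engine models is a prescribed burn profile such as the Wiebe function. The Python sketch below computes a Wiebe mass-fraction-burned curve and the corresponding heat-release rate; it is not ESIM code, and all parameter values are illustrative.

```python
import numpy as np

def wiebe_burn_fraction(theta, theta0=-10.0, dtheta=40.0, a=5.0, m=2.0):
    """Wiebe mass-fraction-burned profile versus crank angle (degrees)."""
    x = np.clip((theta - theta0) / dtheta, 0.0, None)
    return 1.0 - np.exp(-a * x ** (m + 1))

theta = np.linspace(-30.0, 60.0, 181)           # crank angle, deg
xb = wiebe_burn_fraction(theta)
q_total = 1200.0                                # fuel energy per cycle in J (illustrative)
dq = q_total * np.gradient(xb, theta)           # heat-release rate, J/deg

print("50%% burned at ~%.1f deg" % theta[np.searchsorted(xb, 0.5)])
print("peak heat-release rate ~%.1f J/deg" % dq.max())
```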

  11. Diffusion tensor imaging of the median nerve: intra-, inter-reader agreement, and agreement between two software packages

    International Nuclear Information System (INIS)

    Guggenberger, Roman; Nanz, Daniel; Puippe, Gilbert; Andreisek, Gustav; Rufibach, Kaspar; White, Lawrence M.; Sussman, Marshall S.

    2012-01-01

    To assess intra-, inter-reader agreement, and the agreement between two software packages for magnetic resonance diffusion tensor imaging (DTI) measurements of the median nerve. Fifteen healthy volunteers (seven men, eight women; mean age, 31.2 years) underwent DTI of both wrists at 1.5 T. Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) of the median nerve were measured by three readers using two commonly used software packages. Measurements were repeated by two readers after 6 weeks. Intraclass correlation coefficients (ICC) and Bland-Altman analysis were used for statistical analysis. ICCs for intra-reader agreement ranged from 0.87 to 0.99, for inter-reader agreement from 0.62 to 0.83, and between the two software packages from 0.63 to 0.82. Bland-Altman analysis showed no differences for intra- and inter-reader agreement and agreement between software packages. The intra-, inter-reader, and agreement between software packages for DTI measurements of the median nerve were moderate to substantial suggesting that user- and software-dependent factors contribute little to variance in DTI measurements. (orig.)

  12. STAR-GENERIS - a software package for information processing

    International Nuclear Information System (INIS)

    Felkel, L.

    1985-01-01

    Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants are, e.g., alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates the specification and implementation of the engineering know-how necessary for sophisticated operator aids. (orig./HP) [de

  13. The SAVI Vulnerability Analysis Software Package

    International Nuclear Information System (INIS)

    Mc Aniff, R.J.; Paulus, W.K.; Key, B.; Simpkins, B.

    1987-01-01

    SAVI (Systematic Analysis of Vulnerability to Intrusion) is a new PC-based software package for modeling Physical Protection Systems (PPS). SAVI utilizes a path analysis approach based on the Adversary Sequence Diagram (ASD) methodology. A highly interactive interface allows the user to accurately model complex facilities, maintain a library of these models on disk, and calculate the most vulnerable paths through any facility. Recommendations are provided to help the user choose facility upgrades which should reduce identified path vulnerabilities. Pop-up windows throughout SAVI are used for the input and display of information. A menu at the top of the screen presents all options to the user. These options are further explained on a message line directly below the menu. A diagram on the screen graphically represents the current protection system model. All input is checked for errors, and data are presented in a logical and clear manner. Print utilities provide the user with hard copies of all information and calculated results
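
    The most-vulnerable-path idea can be sketched with a shortest-path search over a small facility graph. The Python example below uses Dijkstra's algorithm on invented delay times; it is only a toy illustration and not the ASD/SAVI timeline calculation.

```python
import heapq

# Toy adversary-path graph: nodes are protection layers, edge weights are
# notional delay times in seconds. Names and numbers are invented for illustration.
graph = {
    "offsite":       {"fence": 10, "gate": 30},
    "fence":         {"building_wall": 90, "gate": 20},
    "gate":          {"building_door": 60},
    "building_wall": {"vault": 300},
    "building_door": {"vault": 240},
    "vault":         {},
}

def min_delay_path(graph, start, target):
    """Dijkstra's algorithm: the path with the smallest total delay (most vulnerable)."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        delay, node, path = heapq.heappop(queue)
        if node == target:
            return delay, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node].items():
            if nxt not in seen:
                heapq.heappush(queue, (delay + w, nxt, path + [nxt]))
    return None

print(min_delay_path(graph, "offsite", "vault"))
```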

  14. Novel applications of the x-ray tracing software package McXtrace

    DEFF Research Database (Denmark)

    Bergbäck Knudsen, Erik; Nielsen, Martin Meedom; Haldrup, Kristoffer

    2014-01-01

    We will present examples of applying the X-ray tracing software package McXtrace to different kinds of X-ray scattering experiments. In particular we will be focusing on time-resolved type experiments. Simulations of full scale experiments are particularly useful for this kind, especially when...... some of the issues encountered. Generally more than one or all of these effects are present at once. Simulations can in these cases be used to identify distinct footprints of such distortions and thus give the experimenter a means of deconvoluting them from the signal. We will present a study...... of this kind along with the newest developments of the McXtrace software package....

  15. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present the open-source software package WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework, with tight-binding models that can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can compute the surface state spectrum, which is probed by angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
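
    The Wilson-loop/Berry-phase calculation can be illustrated on the simplest possible tight-binding model. The Python sketch below (independent of WannierTools and Wannier90) computes the Zak phase of the lower band of an SSH chain from a discrete Wilson loop, giving approximately 0 in the trivial phase and pi in the topological phase.

```python
import numpy as np

def zak_phase_ssh(v, w, nk=400):
    """Berry (Zak) phase of the lower band of the SSH chain via a discrete Wilson loop."""
    ks = np.linspace(0.0, 2 * np.pi, nk, endpoint=False)
    states = []
    for k in ks:
        h = np.array([[0.0, v + w * np.exp(-1j * k)],
                      [v + w * np.exp(1j * k), 0.0]])
        _, vecs = np.linalg.eigh(h)
        states.append(vecs[:, 0])                 # lower-band eigenvector
    # Wilson loop: product of overlaps <u_k|u_{k+1}> around the Brillouin zone.
    wilson = 1.0 + 0.0j
    for i in range(nk):
        wilson *= np.vdot(states[i], states[(i + 1) % nk])
    return -np.angle(wilson)

print("trivial     (v=1.5, w=1.0):", abs(zak_phase_ssh(1.5, 1.0)))   # ~0
print("topological (v=1.0, w=1.5):", abs(zak_phase_ssh(1.0, 1.5)))   # ~pi
```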

  16. Integrated performance assessment model for waste package behavior and radionuclide release

    International Nuclear Information System (INIS)

    Kossik, R.; Miller, I.; Cunnane, M.

    1992-01-01

    Golder Associates Inc. (GAI) has developed a probabilistic total system performance assessment and strategy evaluation model (RIP) which can be applied in an iterative manner to evaluate repository site suitability and guide site characterization. This paper describes one component of the RIP software, the waste package behavior and radionuclide release model. The waste package component model considers waste package failure by various modes, matrix alteration/dissolution, and radionuclide mass transfer. Model parameters can be described as functions of local environmental conditions. The waste package component model is coupled to component models for far-field radionuclide transport and disruptive events. The model has recently been applied to the proposed repository at Yucca Mountain

  17. Methodologies for assessing long-term performance of high-level radioactive waste packages

    International Nuclear Information System (INIS)

    Stephens, K.; Boesch, L.; Crane, B.; Johnson, R.; Moler, R.; Smith, S.; Zaremba, L.

    1986-01-01

    Several methods the Nuclear Regulatory Commission (NRC) can use to independently assess Department of Energy (DOE) waste package performance were identified by The Aerospace Corporation. The report includes an overview of the necessary attributes of performance assessment, followed by discussions of DOE methods, probabilistic methods capable of predicting waste package lifetime and radionuclide releases, process modeling of waste package barriers, sufficiency of the necessary input data, and the applicability of probability density functions. It is recommended that the initial NRC performance assessment (for the basalt conceptual waste package design) should apply modular simulation, using available process models and data, to demonstrate this assessment method

  18. Graphical representation of ribosomal RNA probe accessibility data using ARB software package

    Directory of Open Access Journals (Sweden)

    Amann Rudolf

    2005-03-01

    Background: Taxon-specific hybridization probes in combination with a variety of commonly used hybridization formats are nowadays standard tools in microbial identification. A frequently applied technology, fluorescence in situ hybridization (FISH), allows, besides single-cell identification, the localization and functional study of the microbial community composition. Careful in silico design and evaluation of potential oligonucleotide probe targets is therefore crucial for performing successful hybridization experiments. Results: The PROBE Design tools of the ARB software package take into consideration several criteria, such as the number, position and quality of diagnostic sequence differences, while designing oligonucleotide probes. Additionally, new visualization tools were developed to enable the user to easily examine further sequence-associated criteria such as higher-order structure, conservation, G+C content, transition-transversion profiles and in situ target accessibility patterns. The different types of sequence-associated information (SAI) can be visualized by user-defined background colors within the ARB primary and secondary structure editors as well as in the PROBE Match tool. Conclusion: Using this tool, in silico probe design and evaluation can be performed with respect to in situ probe accessibility data. The evaluation of proposed probe targets with respect to higher-order rRNA structure is of importance for the successful design and performance of in situ hybridization experiments. The entire ARB software package along with the probe accessibility data is available from the ARB home page http://www.arb-home.de.

  19. THE SOFTWARE PACKAGE FOR DATA STREAM SCRAMBLING

    Directory of Open Access Journals (Sweden)

    P. A. Kadiev

    2016-01-01

    A software package is proposed for multivariate, stepwise transformation of a text stream in order to increase its resistance to unauthorized access, together with a package to restore the converted text. The basis of the proposal is the formation of an n×n array from the elements of the data stream, a preliminary transposition of the array elements into an array in which each row and each column contains exactly one element from each row and each column of the source array, and a subsequent read-out according to options selected by the user. The package for the direct conversion includes: a module for forming an array from the input stream; a module for transposing the array elements according to a Latin-square scheme; and a module for reading out rows or columns of the array by one of the following algorithms: sequential reading; reading rows or columns with even indices and then odd ones; reading rows or columns with odd indices and then even ones; reading along a random route generated by the program; or reading along a route defined by the user. The package for restoring the original message by the inverse transform comprises: a module for forming the channel array from the data stream; a module for recovering the Latin-square-type array from the channel array; a module for the original array; and a module for restoring the original message.
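
    A much-simplified variant of the block-transposition idea can be sketched in a few lines of Python: the text is packed into an n x n array, each row is cyclically shifted so that every column of the shifted array holds exactly one element from each row and each column of the source block, and the result is read out column by column. This is only an illustration of the principle, not the authors' package.

```python
import numpy as np

def scramble(text, n):
    """Toy block scrambler: fill an n x n array, cyclically shift each row, read by columns."""
    padded = text.ljust(n * n)[: n * n]
    block = np.array(list(padded)).reshape(n, n)
    out = np.empty_like(block)
    for i in range(n):
        out[i] = np.roll(block[i], i)       # cyclic shift -> Latin-square-style placement
    return "".join(out.T.flatten())         # read out column by column

def unscramble(cipher, n):
    block = np.array(list(cipher)).reshape(n, n).T   # undo the column-wise readout
    out = np.empty_like(block)
    for i in range(n):
        out[i] = np.roll(block[i], -i)      # undo the cyclic shifts
    return "".join(out.flatten())

msg = "ATTACK AT DAWN!"
c = scramble(msg, 4)
print(repr(c), repr(unscramble(c, 4)))
```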

  20. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  1. REIDAC. A software package for retrospective dose assessment in internal contamination with radionuclides

    International Nuclear Information System (INIS)

    Kurihara, Osamu; Kanai, Katsuta; Takada, Chie; Takasaki, Koji; Ito, Kimio; Momose, Takumaro; Hato, Shinji; Ikeda, Hiroshi; Oeda, Mikihiro; Kurosawa, Naohiro; Fukutsu, Kumiko; Yamada, Yuji; Akashi, Makoto

    2007-01-01

    For cases of internal contamination with radionuclides, it is necessary to perform an internal dose assessment to facilitate radiation protection. For this purpose, the ICRP has supplied the dose coefficients and the retention and excretion rates for various radionuclides. However, these dosimetric quantities are calculated under typical conditions and are not necessarily detailed enough for dose assessment situations in which specific information on the incident and/or individual biokinetic characteristics could or should be taken into account retrospectively. This paper describes a newly developed PC-based software package called the Retrospective Internal Dose Assessment Code (REIDAC) that meets the needs of retrospective dose assessment. REIDAC is made up of a series of calculation programs and a software package. The former calculates the dosimetric quantities for any radionuclide being assessed and the latter provides the user with a graphical user interface (GUI) for executing the programs, editing parameter values and displaying results. The accuracy of REIDAC was verified by comparisons with the dosimetric quantities given in the ICRP publications. This paper presents the basic structure of REIDAC and its calculation methods. Sensitivity analysis of the aerosol size for ²³⁹Pu compounds and provisional calculations for wound contamination with ²⁴¹Am were performed as examples of the practical application of REIDAC. (author)
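
    The underlying retrospective calculation follows the usual bioassay logic: the intake is back-calculated from a measurement divided by a retention or excretion fraction, and the committed dose is the intake times a dose coefficient. The Python sketch below illustrates this with placeholder values; it is not REIDAC code and the numbers are not ICRP data.

```python
# Minimal retrospective intake and committed-dose estimate from a single bioassay
# measurement. The retention fraction and dose coefficient below are placeholders,
# not values taken from the ICRP publications or from REIDAC.
def committed_dose(measured_bq, retention_fraction, dose_coeff_sv_per_bq):
    """Intake (Bq) and committed effective dose (Sv) from one bioassay result."""
    intake = measured_bq / retention_fraction           # back-calculate the intake
    return intake, intake * dose_coeff_sv_per_bq

measured = 5.0        # Bq measured in a 24-h urine sample (illustrative)
m_t = 1.0e-3          # placeholder excretion/retention fraction at the measurement day
e_coeff = 1.0e-7      # placeholder dose coefficient, Sv per Bq of intake

intake, dose = committed_dose(measured, m_t, e_coeff)
print(f"intake ~ {intake:.1f} Bq, committed dose ~ {dose * 1e3:.2f} mSv")
```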

  2. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al., 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a number of reference results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity carried out on the package. To keep track of every single change, the package is published in its own GitHub repository, geotopmodel.github.io/geotop/, under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the GitHub repository for the master and main development branches. The usage of CMake configuration tool

  3. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: II. Algorithms.

    Science.gov (United States)

    Appel, R D; Vargas, J R; Palagi, P M; Walther, D; Hochstrasser, D F

    1997-12-01

    After two generations of software systems for the analysis of two-dimensional electrophoresis (2-DE) images, a third generation of such software packages has recently emerged that combines state-of-the-art graphical user interfaces with comprehensive spot data analysis capabilities. A key characteristic common to most of these software packages is that many of their tools are implementations of algorithms that resulted from research areas such as image processing, vision, artificial intelligence or machine learning. This article presents the main algorithms implemented in the Melanie II 2-D PAGE software package. The applications of these algorithms, embodied as the feature of the program, are explained in an accompanying article (R. D. Appel et al.; Electrophoresis 1997, 18, 2724-2734).

  4. Development of a new control software package for Pakistan Research Reactor-2

    International Nuclear Information System (INIS)

    Qazi, M.K.

    1993-05-01

    The development of a new control software package for Pakistan Research Reactor-2 is presented. The software operates in different modes, comprising surveillance, pre-operational self-tests, operator, supervisor and robotic control. The control logic critically damps the system, minimizing power overshoots. The software handles multiple abnormal conditions, provides elaborate access control and maintains a startup/shutdown record. The report describes the functional details and covers the operational aspects of the new control software. (author)

  5. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    Science.gov (United States)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be

  6. Integrated performance assessment model for waste package behavior and radionuclide release

    International Nuclear Information System (INIS)

    Kossik, R.; Miller, I.; Cunnane, M.

    1992-01-01

    Golder Associates Inc. (GAI) has developed a probabilistic total system performance assessment and strategy evaluation model (RIP) which can be applied in an iterative manner to evaluate repository site suitability and guide site characterization. This paper describes one component of the RIP software, the waste package behavior and radionuclide release model. The waste package component model considers waste package failure by various modes, matrix alteration/dissolution, and radionuclide mass transfer. Model parameters can be described as functions of local environmental conditions. The waste package component model is coupled to component models for far-field radionuclide transport and disruptive events. The model has recently been applied to the proposed repository at Yucca Mountain

  7. Computer aided piping layout design in radiochemical plants- an improved software package

    International Nuclear Information System (INIS)

    Raju, R.P.; Siddiqui, H.R.

    1995-01-01

    A software package was developed and it was successfully implemented for the piping layout design of the four process cells of the Kalpakkam Reprocessing Project. This paper discusses in detail all the improvements and modifications that are being carried out in the package so that it becomes more meaningful and useful for implementation for the forthcoming radiochemical plants

  8. The high performance cluster computing system for BES offline data analysis

    International Nuclear Information System (INIS)

    Sun Yongzhao; Xu Dong; Zhang Shaoqiang; Yang Ting

    2004-01-01

    A high-performance cluster computing system (EPCfarm) is introduced, which is used for BES offline data analysis. The setup and the characteristics of the hardware and software of EPCfarm are described. PBS, the queue management package used, and the performance of EPCfarm are also presented. (authors)

  9. ATLAS Offline Software Performance Monitoring and Optimization

    CERN Document Server

    Chauhan, N; Kittelmann, T; Langenberg, R; Mandrysch , R; Salzburger, A; Seuster, R; Ritsch, E; Stewart, G; van Eldik, N; Vitillo, R

    2014-01-01

    In a complex multi-developer, multi-package software environment, such as the ATLAS offline Athena framework, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide optimisation. Code can be instrumented firstly using the PAPI tool, which is a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles and instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event gives a good understanding of the whole algorithm level performance of ATLAS code. Further data can be obtained using pin, a dynamic binary instrumentation tool. Pintools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine grained routine and instruction level instrumentation is...

  10. ATLAS Offline Software Performance Monitoring and Optimization

    CERN Document Server

    Chauhan, N; The ATLAS collaboration; Kittelmann, T; Langenberg, R; Mandrysch , R; Salzburger, A; Seuster, R; Ritsch, E; Stewart, G; van Eldik, N; Vitillo, R

    2013-01-01

    In a complex multi-developer, multi-package software environment, such as the ATLAS offline Athena framework, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide optimisation. Code can be instrumented firstly using the PAPI tool, which is a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles and instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event gives a good understanding of the whole algorithm level performance of ATLAS code. Further data can be obtained using pin, a dynamic binary instrumentation tool. Pintools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine grained routine and instruction level instrumentation is...

  11. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  12. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Science.gov (United States)

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.

  13. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Directory of Open Access Journals (Sweden)

    Jeffrey K Noel

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.

  14. Data Packages for the Hanford Immobilized Low Activity Tank Waste Performance Assessment 2001 Version [SEC 1 THRU 5

    Energy Technology Data Exchange (ETDEWEB)

    MANN, F.M.

    2000-03-02

    Data package supporting the 2001 Immobilized Low-Activity Waste Performance Analysis. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigation are provided. Verification and benchmarking packages for selected software codes are provided.

  15. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Directory of Open Access Journals (Sweden)

    Marković Nemanja

    2014-01-01

    Reinforced concrete (AB) is characterized by strong inhomogeneity resulting from the material characteristics of the concrete and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is presented of methods such as Concrete Damage Plasticity (CDP), Smeared Concrete Cracking (CSC), Cap Plasticity (CP) and the Drucker-Prager model (DPM). We performed a nonlinear analysis of a two-storey reinforced concrete frame by applying the CDP method for modeling the material nonlinearity of concrete. We analyzed damage zones, crack propagation and the loading-deflection ratio.

  16. Development of the processing software package for RPV neutron fluence determination methodology

    International Nuclear Information System (INIS)

    Belousov, S.; Kirilova, K.; Ilieva, K.

    2001-01-01

    According to the INRNE methodology, the neutron transport calculation is carried out in two steps. In the first step a reactor core eigenvalue calculation is performed. This calculation is used to determine the fixed source for the next step, the calculation of neutron transport from the reactor core to the RPV. Both calculation steps are performed by state-of-the-art, tested codes. The interface software package DOSRC developed at INRNE is used as a link between these two calculations. The package transforms reactor core calculation results into neutron source input data in a format appropriate for the neutron transport codes (DORT, TORT and ASYNT) based on the discrete ordinates method. These codes are applied for calculation of the RPV neutron flux and its responses - induced activity, radiation damage, neutron fluence, etc. For a more precise estimation of the neutron fluence, the INRNE methodology has been supplemented by the following improvements: - implementation of more advanced codes (PYTHIA/DERAB) for neutron-physics parameter calculations; - more detailed neutron source representation; - verification of the neutron fluence by statistically treated experimental data. (author)

  17. Radiative transfer through terrestrial atmosphere and ocean: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Rozanov, A.V.; Kokhanovsky, A.A.; Burrows, J.P.

    2014-01-01

    SCIATRAN is a comprehensive software package for the modeling of radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40μm) including multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The software is capable of modeling spectral and angular distributions of the intensity or the Stokes vector of the transmitted, scattered, reflected, and emitted radiation assuming either a plane-parallel or a spherical atmosphere. Simulations are done either in the scalar or in the vector mode (i.e. accounting for the polarization) for observations by space-, air-, ship- and balloon-borne, ground-based, and underwater instruments in various viewing geometries (nadir, off-nadir, limb, occultation, zenith-sky, off-axis). All significant radiative transfer processes are accounted for. These are, e.g. the Rayleigh scattering, scattering by aerosol and cloud particles, absorption by gaseous components, and bidirectional reflection by an underlying surface including Fresnel reflection from a flat or roughened ocean surface. The software package contains several radiative transfer solvers including finite difference and discrete-ordinate techniques, an extensive database, and a specific module for solving inverse problems. In contrast to many other radiative transfer codes, SCIATRAN incorporates an efficient approach to calculate the so-called Jacobians, i.e. derivatives of the intensity with respect to various atmospheric and surface parameters. In this paper we discuss numerical methods used in SCIATRAN to solve the scalar and vector radiative transfer equation, describe databases of atmospheric, oceanic, and surface parameters incorporated in SCIATRAN, and demonstrate how to solve some selected radiative transfer problems using the SCIATRAN package. During the last decades, a lot of studies have been published demonstrating that SCIATRAN is a valuable

  18. MendelianRandomization: an R package for performing Mendelian randomization analyses using summarized data.

    Science.gov (United States)

    Yavorska, Olena O; Burgess, Stephen

    2017-12-01

    MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3). © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
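
    The package's core fixed-effect inverse-variance weighted estimator can be written down in a few lines; the following standalone Python/NumPy sketch uses synthetic summary statistics and is not the R package's own code.

```python
import numpy as np

# Synthetic summary statistics for 5 genetic variants (illustrative numbers only).
bx = np.array([0.10, 0.08, 0.12, 0.09, 0.11])        # variant-exposure associations
by = np.array([0.050, 0.045, 0.065, 0.040, 0.060])   # variant-outcome associations
se_by = np.array([0.010, 0.012, 0.011, 0.009, 0.010])  # standard errors of by

# Fixed-effect inverse-variance weighted (IVW) causal estimate from ratio estimates by/bx.
w = bx**2 / se_by**2
theta = np.sum(w * (by / bx)) / np.sum(w)
se_theta = 1.0 / np.sqrt(np.sum(w))
print(f"IVW estimate = {theta:.3f} (SE {se_theta:.3f})")
```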

  19. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    Science.gov (United States)

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.

  20. PyPedal, an open source software package for pedigree analysis

    Science.gov (United States)

    The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...

  1. MEASURE/ANOMTEST. Anomaly detection software package for the Dodewaard power plant facility. Supplement 1. Extension of measurement analysis part, addition of plot package

    International Nuclear Information System (INIS)

    Schoonewelle, H.

    1995-01-01

    The anomaly detection software package installed at the Dodewaard nuclear power plant has been revised with respect to the measurement analysis part. A plot package has been added to the package. Signals in which an anomaly has been detected are automatically plotted, including the uncertainty margins of the signals. This report gives a description of the revised measurement analysis part and the plot package. Each new routine of the plot package is described briefly and the new input and output files are given. (orig.)

  2. JMorph: Software for performing rapid morphometric measurements on digital images of fossil assemblages

    Science.gov (United States)

    Lelièvre, Peter G.; Grey, Melissa

    2017-08-01

    Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
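
    JMorph itself is a stand-alone graphical tool; as a hedged illustration of the kind of size measurements that follow once an outline has been digitized, the Python sketch below computes the area, perimeter and centroid of a closed outline polygon with the shoelace formula. It is not JMorph code, and the coordinate arrays are assumed inputs.

        import numpy as np

        def outline_measurements(x, y):
            """Area, perimeter and centroid of a closed, non-self-intersecting outline.

            x, y are the digitized vertex coordinates in order; the outline is
            closed implicitly (the last vertex connects back to the first).
            """
            x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
            xn, yn = np.roll(x, -1), np.roll(y, -1)          # next vertex
            cross = x * yn - xn * y                          # shoelace terms
            a_signed = 0.5 * np.sum(cross)
            area = abs(a_signed)
            perimeter = np.sum(np.hypot(xn - x, yn - y))
            cx = np.sum((x + xn) * cross) / (6.0 * a_signed)
            cy = np.sum((y + yn) * cross) / (6.0 * a_signed)
            return area, perimeter, (cx, cy)

        # unit square as a sanity check: area 1.0, perimeter 4.0, centroid (0.5, 0.5)
        print(outline_measurements([0, 1, 1, 0], [0, 0, 1, 1]))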

  3. Information technologies and software packages for education of specialists in materials science [In Russian

    NARCIS (Netherlands)

    Krzhizhanovskaya, V.; Ryaboshuk, S.

    2009-01-01

    This paper presents methodological materials, interactive text-books and software packages developed and extensively used for education of specialists in materials science. These virtual laboratories for education and research are equipped with tutorials and software environment for modeling complex

  4. Package performance evaluation: our latest 30-year experience

    International Nuclear Information System (INIS)

    Malesys, Pierre; Gagner, Laurent

    2006-01-01

    Packages for the transport of radioactive material have to comply with national and / or international regulations. These regulations are widely based on the requirements set forth by the International Atomic Energy Agency (IAEA) in the 'Regulations for the Safe Transport of Radioactive Material'. The packages designed to transport the most demanding contents are submitted to tests to demonstrate their ability to withstand accident conditions of transport. These tests are typically: - a nine-meter drop onto a flat and unyielding surface, - a one-meter drop onto a punch, - an 800 deg. C / 30 minute fire, and an immersion under a head of water of either 0.9 m, 15 m or 200 m (depending on the criteria to be considered). During the last 20 years, on several of its package designs, COGEMA LOGISTICS has performed tests and analyses to simulate extremely severe accidents. These tests and analyses include: 1. a long-duration fire test and a deep immersion test on a package designed to transport plutonium oxide powder; - 2. deep immersion tests on scale models of packages designed to transport spent fuel, high-level vitrified waste and fresh MOX (uranium and plutonium mixed oxide) fuel; - 3. burial in soft ground of packages designed to transport spent fuel; - 4. a numerical study of the thermal behaviour of packages designed to transport spent fuel and high-level vitrified waste; - 5. aircraft crash tests on scale models of dual-purpose packages for the transport and storage of spent fuel. The paper will: - review the tests and analyses that were performed; - show that our designs are able to withstand extremely severe conditions; - demonstrate that there is no cliff effect: should a failure occur, it appears gradually and there is no sudden collapse of the package; - explain how compliance with all the regulatory requirements leads to high performance regarding each of them (for instance, in many cases, the need to meet radiation exposure criteria induces a mechanical

  5. Chinshan living PRA model using NUPRA software package

    International Nuclear Information System (INIS)

    Cheng, S.-K.; Lin, T.-J.

    2004-01-01

    A living probabilistic risk assessment (PRA) model has been established for Chinshan Nuclear Power Station (BWR-4, MARK-I) using the NUPRA software package. The core damage frequency due to internal events, seismic events and typhoons is evaluated in this model. The methodology and results, considering the recent implementation of the 5th emergency diesel generator and the automatic boron injection function, are presented. The dominant sequences of this PRA model are discussed, and some possible applications of this living model are proposed. (author)

  6. Asset management: integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-06-01

    Two new multi-dimensional databases, which expand the 'row and column' concept of spreadsheets into multiple categories of data called dimensions, are described. These integrated software packages provide the foundation for industry players such as Poco Petroleum Ltd and Numac Energy Inc to gain a competitive advantage, by overhauling their respective data collection and retrieval systems to allow for timely cost analysis and financial reporting. Energy Warehouse, an on-line analytical processing product marketed by SysGold Ltd, is one of the software products described. It gathers various sources of information, allows advanced searches and generates reports previously unavailable in other conventional financial accounting systems. The second product discussed - the Canadian Upstream Energy System (CUES) - is an on-line analytical processing system developed by Oracle Corporation and Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server and software development tools with ATS's upstream financial, land, geotechnical and production applications. The software also allows for optimization of facilities, analysis of production efficiencies and comparison of performance against industry standards.

  7. A comparison of six software packages for evaluation of solid lung nodules using semi-automated volumetry: What is the minimum increase in size to detect growth in repeated CT examinations

    International Nuclear Information System (INIS)

    Hoop, Bartjan de; Gietema, Hester; Prokop, Mathias; Ginneken, Bram van; Zanen, Pieter; Groenewegen, Gerard

    2009-01-01

    We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In case of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five to six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages. (orig.)
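
    The variability figure quoted above (the upper limit of the 95% confidence interval of the Bland-Altman plot) can be reproduced in outline with a few lines of Python. The sketch below uses the common convention of expressing the difference between the two zero-growth measurements as a percentage of their mean and taking mean + 1.96 SD as the upper limit; it is offered only as an illustration of that construction, not as the authors' exact computation.

        import numpy as np

        def volumetry_variability(v1, v2):
            """Upper limit of agreement for repeated volume measurements.

            v1, v2 : volumes of the same nodules from two scans with zero growth.
            Differences are expressed as a percentage of the mean volume and the
            upper limit is mean + 1.96 * SD of those relative differences.
            """
            v1, v2 = np.asarray(v1, dtype=float), np.asarray(v2, dtype=float)
            rel_diff = 100.0 * (v2 - v1) / ((v1 + v2) / 2.0)
            return rel_diff.mean() + 1.96 * rel_diff.std(ddof=1)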

  8. Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V ampersand V guideline packages and procedures. Volume 5

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating Guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity to form three Classes). A V&V Guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the Guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The Guidelines can apply to conventional procedural software systems as well as all kinds of AI systems

  9. Analysing the Zenith Tropospheric Delay Estimates in On-line Precise Point Positioning (PPP) Services and PPP Software Packages.

    Science.gov (United States)

    Mendez Astudillo, Jorge; Lau, Lawrence; Tang, Yu-Ting; Moore, Terry

    2018-02-14

    As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. The estimated ZTD can be very useful for meteorological applications; an example is the estimation of the water vapor content of the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also gives GNSS users and researchers insight into how PPP ZTD estimation depends on the processing algorithm. Observation data from eight whole days at a total of nine International GNSS Service (IGS) tracking stations spread over the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product for the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.

  10. Analysing the Zenith Tropospheric Delay Estimates in On-line Precise Point Positioning (PPP Services and PPP Software Packages

    Directory of Open Access Journals (Sweden)

    Jorge Mendez Astudillo

    2018-02-01

    Full Text Available As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. The estimated ZTD can be very useful for meteorological applications; an example is the estimation of the water vapor content of the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also gives GNSS users and researchers insight into how PPP ZTD estimation depends on the processing algorithm. Observation data from eight whole days at a total of nine International GNSS Service (IGS) tracking stations spread over the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product for the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.

  11. Browndye: A software package for Brownian dynamics

    Science.gov (United States)

    Huber, Gary A.; McCammon, J. Andrew

    2010-11-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems.
    Program summary:
    Program title: Browndye
    Catalogue identifier: AEGT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: MIT license, included in distribution
    No. of lines in distributed program, including test data, etc.: 143 618
    No. of bytes in distributed program, including test data, etc.: 1 067 861
    Distribution format: tar.gz
    Programming language: C++, OCaml (http://caml.inria.fr/)
    Computer: PC, Workstation, Cluster
    Operating system: Linux
    Has the code been vectorised or parallelized?: Yes. Runs on multiple processors with shared memory using pthreads
    RAM: Depends linearly on size of physical system
    Classification: 3
    External routines: uses the output of APBS [1] (http://www.poissonboltzmann.org/apbs/) as input. APBS must be obtained and installed separately. Expat 2.0.1, CLAPACK, ocaml-expat, Mersenne Twister. These are included in the Browndye distribution.
    Nature of problem: Exploration and determination of rate constants of bimolecular interactions involving large biological molecules.
    Solution method: Brownian dynamics with electrostatic, excluded volume, van der Waals, and desolvation forces.
    Running time: Depends linearly on size of physical system and quadratically on precision of results. The included example executes in a few minutes.
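
    Browndye itself is written in C++ and OCaml; as a hedged illustration of the underlying propagation scheme, the Python sketch below advances a single particle with the standard Ermak-McCammon Brownian dynamics step (drift from the systematic force plus a Gaussian random displacement). It is not Browndye's implementation, and the parameter values are arbitrary.

        import numpy as np

        def brownian_step(pos, force, diff_coef, dt, kT, rng):
            """One Brownian dynamics step for a single particle.

            pos       : current position (length-3 array)
            force     : systematic force at pos (electrostatic, excluded volume, ...)
            diff_coef : translational diffusion coefficient
            The drift is D * F / kT * dt and the random displacement has
            variance 2 * D * dt per coordinate.
            """
            drift = diff_coef * force / kT * dt
            noise = rng.normal(scale=np.sqrt(2.0 * diff_coef * dt), size=3)
            return pos + drift + noise

        rng = np.random.default_rng(0)
        p = np.zeros(3)
        for _ in range(1000):                  # free diffusion when the force is zero
            p = brownian_step(p, np.zeros(3), diff_coef=1.0, dt=1e-3, kT=1.0, rng=rng)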

  12. Multi-Language Programming Environments for High Performance Java Computing

    OpenAIRE

    Vladimir Getov; Paul Gray; Sava Mintchev; Vaidy Sunderam

    1999-01-01

    Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community’s reliance on established scientific packages. As a consequence, programmers of high‐performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java‐to‐C Interface (JCI) tool which provides ...

  13. TensorPack: a Maple-based software package for the manipulation of algebraic expressions of tensors in general relativity

    International Nuclear Information System (INIS)

    Huf, P A; Carminati, J

    2015-01-01

    In this paper we: (1) introduce TensorPack, a software package for the algebraic manipulation of tensors in covariant index format in Maple; (2) briefly demonstrate the use of the package with an orthonormal tensor proof of the shearfree conjecture for dust. TensorPack is based on the Riemann and Canon tensor software packages and uses their functions to express tensors in an indexed covariant format. TensorPack uses a string representation as input and provides functions for output in index form. It extends the functionality to basic algebra of tensors, substitution, covariant differentiation, contraction, raising/lowering indices, symmetry functions and other accessory functions. The output can be merged with text in the Maple environment to create a full working document with embedded dynamic functionality. The package offers potential for manipulation of indexed algebraic tensor expressions in a flexible software environment. (paper)
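
    TensorPack operates symbolically inside Maple; the small numerical numpy sketch below is only meant to illustrate the index operations listed above (raising an index with the inverse metric and contracting), not the package's syntax.

        import numpy as np

        # Minkowski metric with signature (-,+,+,+) and a sample covariant vector v_a
        g = np.diag([-1.0, 1.0, 1.0, 1.0])
        g_inv = np.linalg.inv(g)
        v_cov = np.array([2.0, 1.0, 0.0, 0.0])

        # raising an index: v^a = g^{ab} v_b
        v_con = g_inv @ v_cov

        # contraction of a rank-2 tensor T^a_b over its two indices: T^a_a
        T = np.outer(v_con, v_cov)
        trace = np.einsum('aa->', T)

        # full contraction v^a v_a; equals -3.0 for this sample vector
        norm = np.einsum('a,a->', v_con, v_cov)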

  14. SWISTRACK - AN OPEN SOURCE, SOFTWARE PACKAGE APPLICABLE TO TRACKING OF FISH LOCOMOTION AND BEHAVIOUR

    DEFF Research Database (Denmark)

    Steffensen, John Fleng

    2010-01-01

    including swimming speed, acceleration and directionality of movements as well as the examination of locomotory patterns during swimming. Swistrack, a free and downloadable software package (available from www.sourceforge.com), is widely used for tracking robots, humans and other animals. Accordingly..., Swistrack can be easily adopted for the tracking of fish. Benefits associated with the free software include: • Contrast or marker based tracking enabling tracking of either the whole animal, or tagged marks placed upon the animal • The ability to track multiple tags placed upon an individual animal • Highly... effective background subtraction algorithms and filters ensuring smooth tracking of fish • Application of tags of different colour enables the software to track multiple fish without the problem of track exchange between individuals • Low processing requirements enable tracking in real-time • Further...

  15. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Michael T. [Illinois Rocstar LLC, Champaign, IL (United States); Safdari, Masoud [Illinois Rocstar LLC, Champaign, IL (United States); Kress, Jessica E. [Illinois Rocstar LLC, Champaign, IL (United States); Anderson, Michael J. [Illinois Rocstar LLC, Champaign, IL (United States); Horvath, Samantha [Illinois Rocstar LLC, Champaign, IL (United States); Brandyberry, Mark D. [Illinois Rocstar LLC, Champaign, IL (United States); Kim, Woohyun [Illinois Rocstar LLC, Champaign, IL (United States); Sarwal, Neil [Illinois Rocstar LLC, Champaign, IL (United States); Weisberg, Brian [Illinois Rocstar LLC, Champaign, IL (United States)

    2016-10-15

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and has few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub.org system to anyone interested in multiphysics code coupling. Many of the basic documents explaining use and architecture of IMPACT are also attached as appendices to this document. Online HTML documentation is available through the GitHub site

  16. Software Package for the Technical Support Centre

    International Nuclear Information System (INIS)

    Tomisa, T.; Skanata, D.; Sucic, B.

    2002-01-01

    The continuous radiological surveillance system has been technically improved during the last two years by establishing 11 new automatic stations, so that there are currently 14 locations with installed gamma-monitors for air radiation monitoring on the Croatian national territory. Given that the original system had been designed primarily for gathering data for off-line treatment with the purpose of statistical analyses, the contemporary Radiological Early Warning System (SPRU) approach has required the development of new software by the Technical Support Centre (TPC) in order to allow operators to work interactively in emergency situations. The outcome of this development is a software package called DORAP (Automatic Radiological Station Remote Reading), which brings together automatic functions of continual data gathering, daily production of the standard report, distribution of the report by fax, SMS (Short Message Service), SMT (Simple Mail Transfer) and FTP (File Transfer Protocol), as well as generation and distribution of alarms in the case of a system failure or when the set radiation intensity values are exceeded. (author)

  17. Recent developments on PLASMAKIN - a software package to model the kinetics in gas discharges

    International Nuclear Information System (INIS)

    Pinhao, N R

    2009-01-01

    PLASMAKIN is a user-friendly software package to handle physical and chemical data used in plasma physics modeling and to compute the production and destruction terms in fluid-model equations. These terms account for the particle or energy production and loss rates due to gas-phase and gas-surface reactions. The package has been restructured and expanded to (a) allow the simulation of atomic emission spectra taking into account line broadening processes and radiation trapping; (b) include a library to compute the electron kinetics; (c) include a database of species properties and reactions; and (d) include a Python interface to allow access from scripts and integration with other scientific software tools.
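
    The production and destruction terms mentioned above are, for each species, sums of reaction rates weighted by the net stoichiometry of that species. The hedged Python sketch below shows that bookkeeping for a toy reaction set; it is not PLASMAKIN's interface, and the rate coefficients are invented for the example.

        import numpy as np

        def source_terms(densities, reactions):
            """Net production/destruction rate for each species.

            densities : dict mapping species name -> number density
            reactions : list of (reactants, products, rate_coefficient) tuples.
            Each reaction proceeds at k times the product of its reactant
            densities; every reactant loses that rate, every product gains it.
            """
            s = {sp: 0.0 for sp in densities}
            for reactants, products, k in reactions:
                rate = k * np.prod([densities[sp] for sp in reactants])
                for sp in reactants:
                    s[sp] -= rate
                for sp in products:
                    s[sp] += rate
            return s

        # toy argon chemistry: electron-impact ionisation and recombination
        n = {'e': 1e16, 'Ar': 1e22, 'Ar+': 1e16}
        rxn = [(['e', 'Ar'], ['e', 'e', 'Ar+'], 1e-24),
               (['e', 'Ar+'], ['Ar'], 1e-13)]
        print(source_terms(n, rxn))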

  18. The UK core performance code package

    International Nuclear Information System (INIS)

    Hutt, P.K.; Gaines, N.; McEllin, M.; White, R.J.; Halsall, M.J.

    1991-01-01

    Over the last few years work has been co-ordinated by Nuclear Electric, originally part of the Central Electricity Generating Board, with contributions from the United Kingdom Atomic Energy Authority and British Nuclear Fuels Limited, to produce a generic, easy-to-use and integrated package of core performance codes able to perform a comprehensive range of calculations for fuel cycle design, safety analysis and on-line operational support for Light Water Reactor and Advanced Gas Cooled Reactor plants. The package consists of modern rationalized generic codes for lattice physics (WIMS), whole reactor calculations (PANTHER), thermal hydraulics (VIPRE) and fuel performance (ENIGMA). These codes, written in FORTRAN77, are highly portable and new developments have followed modern quality assurance standards. These codes can all be run 'stand-alone' but they are also being integrated within a new UNIX-based interactive system called the Reactor Physics Workbench (RPW). The RPW provides an interactive user interface and a sophisticated data management system. It offers quality assurance features to the user and has facilities for defining complex calculational sequences. The paper reviews the current capabilities of these components, their integration within the package and outlines future developments underway. Finally, the paper describes the development of an on-line version of this package which is now being commissioned on UK AGR stations. (author)

  19. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images.

    Science.gov (United States)

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane

    2017-08-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and the length of the upper airway using the Amira® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys® (3diemme, Cantu, Italy) and OnDemand3D® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements was determined using intraclass correlation coefficients and Bland & Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to be used as the "gold standard". This phantom was subsequently scanned using a NewTom 5G® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira®, 3Diagnosys®, and OnDemand3D®, and compared these measurements with the gold standard. The intra- and inter-observer reliability of the measurements of the upper airway using the different software packages was excellent (intraclass correlation coefficient ≥0.75). There was excellent agreement between all three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway. The length measurements of the upper airway were the most accurate results in all software packages. All

  20. Cross-Platform Learning Media Development of Software Installation on Computer Engineering and Networking Expertise Package

    Directory of Open Access Journals (Sweden)

    Afis Pratama

    2018-03-01

    Full Text Available Software installation is one of the important lessons that must be mastered by students of the computer and network engineering expertise package. However, students show a lack of attention and concentration when following the teaching and learning process in the software installation subject, a problem that requires a prompt solution. This research draws on the continual advance of technology, which can be used as a tool to support learning activities. Currently, all grade 10 students in the public vocational high school (SMK 8 Semarang, Indonesia) already have a gadget, either a smartphone or a laptop, and use it intensively. Based on this situation, this research aims to create cross-platform learning media for software installation that are practical and can easily be carried on a smartphone or a laptop running different operating systems. This media is therefore expected to improve the learning outcomes, understanding and enthusiasm of the students in the software installation lesson.

  1. Investigating the effects of different factors on development of open source enterprise resources planning software packages

    Directory of Open Access Journals (Sweden)

    Mehdi Ghorbaninia

    2014-08-01

    Full Text Available This paper investigates the effects of different factors on the development of open source enterprise resources planning software packages. The study designs a questionnaire on a Likert scale and distributes it among 210 experts in the field of open source software package development. Cronbach alpha has been calculated as 0.93, which is well above the minimum acceptable level. Using Pearson correlation as well as stepwise regression analysis, the study identifies the three most important groups of factors: fundamental issues, and issues during and after the implementation of open source software development. The study also determines a positive and strong relationship between fundamental factors and after-implementation factors (r=0.9006, Sig. = 0.000).
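
    The quoted reliability figure (Cronbach's alpha of 0.93) follows from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The short Python sketch below evaluates that formula for an arbitrary respondents-by-items score matrix; it is a generic illustration, not the study's data or software.

        import numpy as np

        def cronbach_alpha(scores):
            """Cronbach's alpha for a (respondents x items) Likert score matrix."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        # toy data: four respondents answering three items
        print(cronbach_alpha([[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]]))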

  2. Waste package performance analysis

    International Nuclear Information System (INIS)

    Lester, D.H.; Stula, R.T.; Kirstein, B.E.

    1982-01-01

    A performance assessment model for multiple barrier packages containing unreprocessed spent fuel has been applied to several package designs. The resulting preliminary assessments were intended for use in making decisions about package development programs. A computer model called BARIER estimates the package life and subsequent rate of release of selected nuclides. The model accounts for temperature, pressure (and resulting stresses), bulk and localized corrosion, and nuclide retardation by the backfill after water intrusion into the waste form. The assessment model assumes a post-closure, flooded, geologic repository. Calculations indicated that, within the bounds of model assumptions, packages could last for several hundred years. Intact backfills of appropriate design may be capable of nuclide release delay times on the order of 10⁷ yr for uranium, plutonium, and americium. 8 references, 6 figures, 9 tables

  3. High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software

    Science.gov (United States)

    Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.

    2013-08-01

    GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high pressure conditions. The package has been in extensive use at the high pressure beamlines of Advanced Photon Source (APS), Argonne National Laboratory and Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make processing of high pressure SCμXRD easier and more reliable.

  4. Software packages for simulating groundwater flow and the spreading of soluble and insoluble admixtures in aquifers

    International Nuclear Information System (INIS)

    Roshal, A.A.; Klein, I.S.; Svishchov, A.M.

    1993-01-01

    Software packages are described that are designed for solving hydrogeological and environmental problems related to the analysis and prediction of groundwater flow and the spreading of solutes and insolubles in the saturated zones. The software package GWFS (Ground Water Flow Simulation) allows for simulating steady-state and unsteady-state flow in confined, unconfined, and confined-unconfined multi-layer and quasi-3D isotropic and anisotropic aquifer systems. Considered are intra-layer sources and sinks, infiltration, inter-layer leakage, interrelationships with surface reservoirs and streams, interrelationships with drains, and aquifer discharge to surface sources. The MTS (Mass Transport Simulation) package is designed for solving solute transport problems. Taken into account are convective transport, hydrodynamic dispersion and diffusion, and linear equilibrium sorption. The method of characteristics is implemented using the 'particles-in-cells' scheme, in which transport is modeled with the help of tracers. The software package OWFS (Oil-Water Flow Simulation) is designed for the simulation of hydrocarbon (oil-water) migration in aquifers
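
    The 'particles-in-cells' method of characteristics mentioned above moves a cloud of tracer particles with the groundwater velocity, adds a random-walk step for hydrodynamic dispersion, and then bins the particles back onto grid cells to recover concentrations. A hedged one-dimensional Python sketch of that idea is given below; it is not the MTS implementation, and the parameter values are illustrative only.

        import numpy as np

        def transport_step(x, velocity, dispersivity, dt, rng):
            """Advect tracer particles and add a random-walk dispersion step."""
            d = dispersivity * abs(velocity)                 # dispersion coefficient
            return x + velocity * dt + rng.normal(scale=np.sqrt(2.0 * d * dt), size=x.size)

        rng = np.random.default_rng(1)
        particles = np.zeros(5000)                           # instantaneous source at x = 0
        for _ in range(100):
            particles = transport_step(particles, velocity=1.0, dispersivity=0.1, dt=0.1, rng=rng)

        # bin the particles onto cells to obtain a relative concentration profile
        conc, edges = np.histogram(particles, bins=50, range=(0.0, 20.0))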

  5. DISPL: a software package for one and two spatially dimensioned kinetics-diffusion problems. [FORTRAN for IBM computers

    Energy Technology Data Exchange (ETDEWEB)

    Leaf, G K; Minkoff, M; Byrne, G D; Sorensen, D; Bleakney, T; Saltzman, J

    1978-11-01

    DISPL is a software package for solving some second-order nonlinear systems of partial differential equations including parabolic, elliptic, hyperbolic, and some mixed types such as parabolic-elliptic equations. Fairly general nonlinear boundary conditions are allowed as well as interface conditions for problems in inhomogeneous media. The spatial domain is one- or two-dimensional with Cartesian, cylindrical, or spherical (in one dimension only) geometry. The numerical method is based on the use of Galerkin's procedure combined with the use of B-splines in order to reduce the system of PDEs to a system of ODEs. The latter system is then solved with a sophisticated ODE software package. Software features include extensive dump/restart facilities, free format input, moderate printed output capability, dynamic storage allocation, and three graphics packages. 17 figures, 9 tables.
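
    The key numerical idea described above, reducing the PDE system to a system of ODEs that is handed to an ODE solver, is the method of lines. The hedged Python sketch below illustrates the same reduction for a single 1-D diffusion equation, but with simple finite differences standing in for DISPL's B-spline Galerkin discretization; it is an illustration of the concept, not of DISPL itself.

        import numpy as np
        from scipy.integrate import solve_ivp

        # 1-D diffusion u_t = D u_xx on [0, 1] with u = 0 at both ends
        D, n = 0.01, 101
        x = np.linspace(0.0, 1.0, n)
        dx = x[1] - x[0]
        u0 = np.exp(-200.0 * (x - 0.5) ** 2)      # initial Gaussian pulse

        def rhs(t, u):
            """Spatially discretized right-hand side: the resulting ODE system."""
            dudt = np.zeros_like(u)
            dudt[1:-1] = D * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx ** 2
            return dudt                            # boundary values stay at zero

        # the ODE system is then handed to a stiff integrator
        sol = solve_ivp(rhs, (0.0, 1.0), u0, method='BDF', t_eval=[0.0, 0.5, 1.0])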

  6. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by US NRC and EPRI toward formulating guidelines for V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation), (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software), and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity to form three Classes). A V&V guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems

  7. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS™, to demonstrate compliance with Section III, "Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  8. NEAMS Software Licensing, Release, and Distribution: Implications for FY2013 Work Package Planning

    International Nuclear Information System (INIS)

    Bernholdt, David E.

    2012-01-01

    The vision of the NEAMS program is to bring truly predictive modeling and simulation (M&S) capabilities to the nuclear engineering community in order to enable a new approach to the analysis of nuclear systems. NEAMS anticipates issuing in FY 2018 a full release of its computational 'Fermi Toolkit' aimed at advanced reactor and fuel cycles. The NEAMS toolkit involves extensive software development activities, some of which have already been underway for several years; however, the Advanced Modeling and Simulation Office (AMSO), which sponsors the NEAMS program, has not yet issued any official guidance regarding software licensing, release, and distribution policies. This motivated an FY12 task in the Capability Transfer work package to develop and recommend an appropriate set of policies. The current preliminary report is intended to provide awareness of issues with implications for work package planning for FY13. We anticipate a small amount of effort associated with putting into place formal licenses and contributor agreements for NEAMS software that does not already have them. We do not anticipate any additional effort or costs associated with software release procedures or schedules beyond those dictated by the quality expectations for the software. The largest potential costs we anticipate would be associated with the setup and maintenance of shared code repositories for development and early access to NEAMS software products. We also anticipate an opportunity, with modest associated costs, to work with the Radiation Safety Information Computational Center (RSICC) to clarify export control assessment policies for software under development.

  9. Pixelman: a multi-platform data acquisition and processing software package for Medipix2, Timepix and Medipix3 detectors

    International Nuclear Information System (INIS)

    Turecek, D; Holy, T; Jakubek, J; Pospisil, S; Vykydal, Z

    2011-01-01

    The semiconductor pixel detectors Medipix2, Timepix and Medipix3 (256x256 square pixels, 55x55 μm each) are superior imaging devices in terms of spatial resolution, linearity and dynamic range. This makes them suitable for various applications such as radiography, neutronography, micro-tomography and X-ray dynamic defectoscopy. In order to control and manage such complex measurements a multi-platform software package for acquisition and data processing with a Java graphical user interface has been developed. The functionality of the original version of Pixelman package has been upgraded and extended to include the new medipix devices. The software package can be run on Microsoft Windows, Linux and Mac OS X operating systems. The architecture is very flexible and the functionality can be extended by plugins in C++, Java or combinations of both. The software package may be used as a distributed acquisition system using computers with different operating systems over a local network or the Internet.

  10. Pixelman: a multi-platform data acquisition and processing software package for Medipix2, Timepix and Medipix3 detectors

    Energy Technology Data Exchange (ETDEWEB)

    Turecek, D; Holy, T; Jakubek, J; Pospisil, S; Vykydal, Z, E-mail: daniel.turecek@utef.cvut.cz [Institute of Experimental and Applied Physics, Czech Technical University in Prague, Horska 3a/22, 12800 Prague 2 (Czech Republic)

    2011-01-15

    The semiconductor pixel detectors Medipix2, Timepix and Medipix3 (256x256 square pixels, 55x55 μm each) are superior imaging devices in terms of spatial resolution, linearity and dynamic range. This makes them suitable for various applications such as radiography, neutronography, micro-tomography and X-ray dynamic defectoscopy. In order to control and manage such complex measurements a multi-platform software package for acquisition and data processing with a Java graphical user interface has been developed. The functionality of the original version of the Pixelman package has been upgraded and extended to include the new Medipix devices. The software package can be run on Microsoft Windows, Linux and Mac OS X operating systems. The architecture is very flexible and the functionality can be extended by plugins in C++, Java or combinations of both. The software package may be used as a distributed acquisition system using computers with different operating systems over a local network or the Internet.

  11. Evaluation of a software package for automated quality assessment of contrast detail images-comparison with subjective visual assessment

    International Nuclear Information System (INIS)

    Pascoal, A; Lawinski, C P; Honey, I; Blake, P

    2005-01-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance in assessing absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and detector KERMA, which indicates the potential to use the CDRAD analyser software for assessment of relative IQ.

  12. MOlecular MAterials Property Prediction Package (MOMAP) 1.0: a software package for predicting the luminescent properties and mobility of organic functional materials

    Science.gov (United States)

    Niu, Yingli; Li, Wenqiang; Peng, Qian; Geng, Hua; Yi, Yuanping; Wang, Linjun; Nan, Guangjun; Wang, Dong; Shuai, Zhigang

    2018-04-01

    MOlecular MAterials Property Prediction Package (MOMAP) is a software toolkit for molecular materials property prediction. It focuses on luminescent properties and charge mobility properties. This article contains a brief descriptive introduction of key features, theoretical models and algorithms of the software, together with examples that illustrate the performance. First, we present the theoretical models and algorithms for molecular luminescent properties calculation, which includes the excited-state radiative/non-radiative decay rate constant and the optical spectra. Then, a multi-scale simulation approach and its algorithm for the molecular charge mobility are described. This approach is based on hopping model and combines with Kinetic Monte Carlo and molecular dynamics simulations, and it is especially applicable for describing a large category of organic semiconductors, whose inter-molecular electronic coupling is much smaller than intra-molecular charge reorganisation energy.
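
    For the charge-mobility part, hopping models of the kind referred to above are commonly built on Marcus transfer rates fed into a kinetic Monte Carlo or diffusion estimate. The hedged Python sketch below evaluates the textbook Marcus rate and a crude one-dimensional mobility estimate from it; it is not MOMAP's implementation, and the coupling, reorganisation energy and hop distance are placeholder values.

        import numpy as np

        HBAR = 6.582e-16        # eV * s
        KB = 8.617e-5           # eV / K

        def marcus_rate(V, lam, dG, T):
            """Marcus hopping rate between two sites.

            V   : electronic coupling (eV)
            lam : reorganisation energy (eV)
            dG  : site energy difference (eV); zero for equivalent sites
            """
            return (2.0 * np.pi / HBAR) * V ** 2 / np.sqrt(4.0 * np.pi * lam * KB * T) \
                * np.exp(-(dG + lam) ** 2 / (4.0 * lam * KB * T))

        # crude 1-D estimate: hops of length a at rate k give D ~ k * a**2,
        # and the Einstein relation gives mu = e * D / (kB * T)
        T, a = 300.0, 4.0e-8                       # K, hop distance in cm
        k = marcus_rate(V=0.05, lam=0.2, dG=0.0, T=T)
        D = k * a ** 2                             # cm^2 / s
        mobility = D / (KB * T)                    # cm^2 / (V s)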

  13. A software package for patient-specific dosimetry in the locoregional RIT of gliomas using 188Re labelled NIMOTUZUMAB

    International Nuclear Information System (INIS)

    Torres, L.A.; Coca, M.A.; Sanchez, Y.; Cornejo, N.; Catasus, C.; Denaro, M. de

    2008-01-01

    Full text: The locoregional treatment of high-grade gliomas using beta-emitting compounds allows high radiation doses to be delivered to the tumor bed and adjacent brain tissues of patients suffering from these aggressive malignancies. The main goal of this work was to implement patient-specific dosimetry procedures using a voxel-based methodology in order to compute and analyze the three-dimensional dose distributions received by patients undergoing locoregional treatment of gliomas with the 188Re-labeled MAb NIMOTUZUMAB. A software package called TRIDOSE has been developed to perform the image management, volume registration, dose calculations and qualitative and quantitative analysis of the results, including dose-volume histograms and isodose curves. The dosimetric factors at voxel level for 188Re ('S' values) were estimated using two different methods: Monte Carlo simulation of energy transport and deposition, and integration of the dose kernel functions. A quality control module was also implemented in order to test the software using well-known 3D distributions of activities or counts. The TRIDOSE outputs were compared with those of other commercial software, showing relative differences lower than 1.10% for different sphere sizes. The established dosimetric procedures constitute a useful tool to compute the absorbed doses received by patients undergoing radioimmunotherapy of brain tumors with 188Re-NIMOTUZUMAB. (author)
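
    At voxel level, the dose calculation described above reduces to D(target) = sum over source voxels of the cumulated activity in the source times the voxel S-value S(target <- source), i.e. a 3-D convolution of the cumulated-activity map with an S-value kernel. The hedged Python sketch below shows that step; the kernel values are placeholders and not the 188Re S-values used by TRIDOSE.

        import numpy as np
        from scipy.signal import fftconvolve

        def voxel_dose(cumulated_activity, s_kernel):
            """Absorbed-dose map from a cumulated-activity map and a voxel S-value kernel.

            cumulated_activity : 3-D array, e.g. MBq*s per voxel
            s_kernel           : 3-D array of S-values, e.g. Gy per MBq*s
            Implements D(target) = sum_source A~(source) * S(target <- source).
            """
            return fftconvolve(cumulated_activity, s_kernel, mode='same')

        # toy example: a point-like uptake region and a hypothetical 5x5x5 kernel
        activity = np.zeros((64, 64, 64))
        activity[32, 32, 32] = 1.0e3
        kernel = np.zeros((5, 5, 5))
        kernel[1:4, 1:4, 1:4] = 1.0e-6             # neighbouring voxels (placeholder value)
        kernel[2, 2, 2] = 1.0e-4                   # self-dose voxel (placeholder value)
        dose = voxel_dose(activity, kernel)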

  14. Current status and future direction of the MONK software package

    International Nuclear Information System (INIS)

    Smith, Nigel; Armishaw, Malcolm; Cooper, Andrew

    2003-01-01

    The current status of the MONK criticality software package is summarized in terms of recent and current developments and envisaged directions for the future. The areas discussed are physics modeling, geometry modeling, source modeling, nuclear data, validation, supporting tools and customer services. In the future development plan, MONK will continue to focus on meeting the short- and long-term needs of the code user community. (J.P.N.)

  15. Nonlinear analysis of reinforced concrete structures using software package abaqus

    OpenAIRE

    Marković Nemanja; Stojić Dragoslav; Cvetković Radovan

    2014-01-01

    Reinforced concrete (RC) is characterized by significant inhomogeneity resulting from the material characteristics of the concrete, and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity into the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is given of methods such as Concrete Damage Plasticity (CDP), Smeared Concrete Cr...

  16. ACTS: from ATLAS software towards a common track reconstruction software

    Science.gov (United States)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  17. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    Energy Technology Data Exchange (ETDEWEB)

    Ashraf, H.; Bach, K.S.; Hansen, H. [Copenhagen University, Department of Radiology, Gentofte Hospital, Hellerup (Denmark); Hoop, B. de [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Shaker, S.B.; Dirksen, A. [Copenhagen University, Department of Respiratory Medicine, Gentofte Hospital, Hellerup (Denmark); Prokop, M. [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Radboud University Nijmegen, Department of Radiology, Nijmegen (Netherlands); Pedersen, J.H. [Copenhagen University, Department of Cardiothoracic Surgery RT, Rigshospitalet, Copenhagen (Denmark)

    2010-08-15

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)

  18. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    International Nuclear Information System (INIS)

    Ashraf, H.; Bach, K.S.; Hansen, H.; Hoop, B. de; Shaker, S.B.; Dirksen, A.; Prokop, M.; Pedersen, J.H.

    2010-01-01

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)

  19. Advances in the development of the PIXEKLM-TPI software package

    International Nuclear Information System (INIS)

    Uzonyi, I.; Szabo, Gy.

    2005-01-01

    Complete text of publication follows. During the past decade great effort has been devoted to the development of various local analytical methods capable of analyzing small volumes of a sample (in the range of a few μm³) with high lateral and/or depth resolution. Among the Ion Beam Analytical (IBA) methods, Particle-Induced X-ray Emission (PIXE) analysis has been used for qualitative elemental imaging for a long time. Nevertheless, the production of quantitative images is still a challenging and generally unresolved problem. Ryan and his co-workers were the first to develop a software package (GeoPIXE) for on-line quantitative mapping, capable in particular of analyzing thick samples. Some years ago we also started to develop quantitative PIXE imaging software and suggested a different approach for the compensation of matrix effects and sample thickness. It is based on the rapid matrix transform method called Dynamic Analysis, which directly converts the spectrum vector (S) into the concentration vector (C) in terms of the matrix Γ. We modified the earlier version of the PIXEKLM program in order to calculate the Γ matrix for materials of any thickness. Furthermore, we have developed a Windows-based program (True PIXE Imaging, TPI) which calculates elemental distributions on a pixel-by-pixel basis and creates so-called elemental images from them in bitmap form using colour bars. The basic part of the new program package was published in 2005. During the past year much effort has been devoted to developing various new options, such as visualization of spectrum components, in order to make the program more user-friendly and applicable. In the figure below the decomposed PIXE spectrum of an industrial material is visualized. (author)
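
    As a hedged illustration of the Dynamic Analysis step described above, the sketch below applies a transform matrix Γ to every pixel spectrum to obtain elemental maps (C = Γ·S per pixel); the matrix, spectra and dimensions are invented placeholders rather than the PIXEKLM-TPI calibration itself.

      import numpy as np

      # Hypothetical dimensions: spectrum channels and number of elements to map.
      n_channels, n_elements = 1024, 8
      rng = np.random.default_rng(0)

      # Gamma stands in for the Dynamic Analysis transform matrix (elements x channels).
      # In the real package it is derived from fitted line shapes plus matrix and
      # thickness corrections; here it is just random numbers for illustration.
      Gamma = rng.random((n_elements, n_channels)) * 1e-3

      # A map of per-pixel spectra (rows x cols x channels), e.g. accumulated events.
      spectra = rng.poisson(5.0, size=(64, 64, n_channels)).astype(float)

      # Elemental images: apply C = Gamma @ S independently to every pixel spectrum.
      elemental_images = np.einsum('ec,ijc->ije', Gamma, spectra)
      print(elemental_images.shape)  # (64, 64, 8): one concentration map per element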

  20. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    OpenAIRE

    Dr.A.R.Pon Periyasamy; Mrs A.Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products; ideally the software performs well and is free of defects. Software quality metrics are a subset of software metrics that focus on the quality aspects of the product, process, and project. The software defect prediction model helps in early detection of defects and contributes to t...

  1. A PC-based software package for modeling DOE mixed-waste management options

    International Nuclear Information System (INIS)

    Abashian, M.S.; Carney, C.; Schum, K.

    1995-02-01

    The U.S. Department of Energy (DOE) Headquarters and associated contractors have developed an IBM PC-based software package that estimates costs, schedules, and public and occupational health risks for a range of mixed-waste management options. A key application of the software package is the comparison of various waste-treatment options documented in the draft Site Treatment Plans prepared in accordance with the requirements of the Federal Facility Compliance Act of 1992. This automated Systems Analysis Methodology consists of a user interface for configuring complexwide or site-specific waste-management options; calculational algorithms for cost, schedule and risk; and user-selected graphical or tabular output of results. The mixed-waste management activities modeled in the automated Systems Analysis Methodology include waste storage, characterization, handling, transportation, treatment, and disposal. Analyses of treatment options identified in the draft Site Treatment Plans suggest potential cost and schedule savings from consolidation of proposed treatment facilities. This paper presents an overview of the automated Systems Analysis Methodology

  2. The quality and testing PH-SFT infrastructure for the external LHC software packages deployment

    CERN Multimedia

    CERN. Geneva; MENDEZ LORENZO, Patricia; MATO VILA, Pere

    2015-01-01

    The PH-SFT group is responsible for the build, test, and deployment of the set of external software packages used by the LHC experiments. This set comprises ca. 170 packages, including Grid packages and Monte Carlo generators provided in different versions. A complete build structure has been established to guarantee the quality of the packages provided by the group. This structure includes an experimental build and three daily nightly builds, each dedicated to a specific ROOT version: v6.02, v6.04, and the master. While the former build is dedicated to testing new packages, versions and dependencies (essentially for SFT-internal use), the latter three are responsible for deploying to AFS the set of stable and well-tested packages requested by the LHC experiments, so that the experiments can apply their own builds on top. In all cases, a c...

  3. MAPPIX: A software package for off-line micro-pixe single particle aerosol analysis

    International Nuclear Information System (INIS)

    Ceccato, D.

    2009-01-01

    In the framework of a multiannual experiment performed at Baia Terra Nova, Antarctica, size-segregated aerosol samples were collected by using a 12-stage SDI impactor (Hillamo design). Approximately 2800 particles, belonging to the first four supermicrometric SDI stages - 8.39, 4.08, 2.68, 1.66 μm dynamic aerosol diameter cuts - were analyzed at the INFN-LNL micro-PIXE facility, a three-lens Oxford Microprobe (OM) product, installed in the early nineties. Four regions on each of the 12 sub-samples were measured; 60 aerosol particles were detected on average in each of the analyzed regions. The off-line single aerosol particle (SAP) analysis of such a large amount of data required software able to handle the acquired data rapidly, with a simple and fast area selection procedure; the subsequent automated analysis of the PIXE spectra with a specialized code was also needed. The MAPPIX 2.0 software was designed to make the user's work during SAP analysis easier and faster. The package is composed of two separate routines: the first is devoted to data format conversion (OM-LMF file format to MAPPIX format), while the second is devoted to the graphical presentation of micro-PIXE maps and the aerosol particle selection procedure. The MAPPIX data format and software features are discussed, and a short report of the speed performance is presented.

  4. Quantitation of magnetic resonance spectroscopy signals: the jMRUI software package

    Czech Academy of Sciences Publication Activity Database

    Stefan, D.; Di Cesare, F.; Andrasescu, A.; Popa, E.; Lazariev, A.; Vescovo, E.; Štrbák, Oliver; Williams, S.; Starčuk jr., Zenon; Cabanas, M.; van Ormondt, D.; Graveron-Demilly, D.

    2009-01-01

    Vol. 20, No. 10 (2009), 104035:1-9 ISSN 0957-0233. Grant - others: EC 6FP(XE) MRTN-CT-2006-035801. Source of funding: R - EC framework project. Keywords: MR spectroscopy * MRS * MRSI * HRMAS-NMR * jMRUI software package * Java * plug-ins * quantitation. Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering. Impact factor: 1.317, year: 2009

  5. Performance analysis of conceptual waste package designs in salt repositories

    International Nuclear Information System (INIS)

    Jansen, G. Jr.; Raines, G.E.; Kircher, J.F.

    1984-01-01

    A performance analysis of commercial high-level waste and spent fuel conceptual package designs in reference repositories in three salt formations was conducted with the WAPPA waste package code. Expected conditions for temperature, stress, brine composition, radiation level, and brine flow rate were used as boundary conditions to compute expected corrosion of a thick-walled overpack of 1025 wrought steel. In all salt formations corrosion by low Mg salt-dissolution brines typical of intrusion scenarios was too slow to cause the package to fail for thousands of years after burial. In high Mg brines judged typical of thermally migrating brines in bedded salt formations, corrosion rates which would otherwise have caused the packages to fail within a few hundred years were limited by brine availability. All of the brine reaching the package was consumed by reaction with the iron in the overpack, thus preventing further corrosion. Uniform brine distribution over the package surface was an important factor in predicting long package lifetimes for the high Mg brines. 14 references, 15 figures

  6. Strategy and Software Application of Fresh Produce Package Design to Attain Optimal Modified Atmosphere

    Directory of Open Access Journals (Sweden)

    Dong Sun Lee

    2014-01-01

    Modified atmosphere packaging of fresh produce relies on attaining the desired gas concentrations inside the package, which result from product respiration and the package's gas transfer. A systematic package design method to achieve the target modified atmosphere was developed and implemented as software that selects the most appropriate film, microperforations, and/or CO2 scavenger. It incorporates modeling and/or database construction for produce respiration, gas transfer across the plastic film and microperforations, and CO2 absorption by the scavenger. The optimization algorithm first selects the packaging film and/or microperforations that yield the target O2 concentration in response to respiration, and then tunes the CO2 concentration with a CO2 absorber when it rises above its tolerance limit. The optimization method, tested for green pepper, strawberry, and king oyster mushroom packages, was shown to be effective for designing the package, and the results obtained were consistent with the literature and with experimentally measured atmospheres.
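
    The two-step logic of the optimization algorithm can be illustrated with a deliberately simplified, steady-state sketch (film-only gas exchange, constant respiration, fixed CO2/O2 permselectivity); all numbers and the model itself are illustrative assumptions, not the published design equations.

      # Step 1: pick the film permeance that balances O2 influx against respiration at the
      # target O2 level. Step 2: if the resulting steady-state CO2 exceeds its tolerance,
      # size a CO2 scavenger instead of changing the film.

      def required_o2_permeance(resp_o2, weight, area, target_o2, ambient_o2=21.0):
          """Permeance (mL/(m2 h kPa)) so that O2 influx balances respiration at target_o2."""
          return resp_o2 * weight / (area * (ambient_o2 - target_o2))

      def steady_state_co2(resp_co2, weight, area, perm_co2, ambient_co2=0.04):
          """CO2 partial pressure (kPa) inside the package at steady state."""
          return ambient_co2 + resp_co2 * weight / (perm_co2 * area)

      # Hypothetical produce/package data (green-pepper-like, purely illustrative).
      resp_o2, resp_co2 = 10.0, 9.0     # mL/(kg h) O2 uptake and CO2 production
      weight, area = 0.5, 0.06          # kg produce, m2 of film
      target_o2, co2_limit = 5.0, 5.0   # kPa target O2, kPa CO2 tolerance
      beta = 4.0                        # assumed CO2:O2 permeability ratio of the film

      perm_o2 = required_o2_permeance(resp_o2, weight, area, target_o2)
      co2_in = steady_state_co2(resp_co2, weight, area, beta * perm_o2)

      print(f"film O2 permeance needed: {perm_o2:.1f} mL/(m2 h kPa)")
      if co2_in > co2_limit:
          # Rough estimate of the CO2 volume per hour the scavenger must take up.
          excess = resp_co2 * weight * (1 - co2_limit / co2_in)
          print(f"CO2 would reach {co2_in:.1f} kPa; scavenger must absorb ~{excess:.1f} mL/h")
      else:
          print(f"CO2 settles at {co2_in:.1f} kPa, within tolerance")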

  7. METEOR v1.0 - Design and structure of the software package

    International Nuclear Information System (INIS)

    Palomo, E.

    1994-01-01

    This document describes the structure and the separate modules of the software package METEOR for the statistical analysis of meteorological data series. It contains a systematic description of the subroutines of METEOR and of the required format for input and output files. The original version of METEOR was developed by Dr. Elena Palomo, CIEMAT-IER, GIMASE. It is built by linking programs and routines written in FORTRAN 77 and adds the graphical capabilities of GNUPLOT. The toolbox was designed following the criteria of modularity, flexibility and agility. All the input, output and analysis options are structured in three main menus: i) the first is aimed at evaluating the quality of the data set; ii) the second at pre-processing the data; and iii) the third at the statistical analyses and the creation of graphical outputs. The documentation for METEOR consists of three documents written in Spanish: 1) METEOR v1.0: User's guide; 2) METEOR v1.0: A usage example; 3) METEOR v1.0: Design and structure of the software package. (Author)

  8. CT and MR perfusion can discriminate severe cerebral hypoperfusion from perfusion absence: evaluation of different commercial software packages by using digital phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Uwano, Ikuko; Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Advanced Medical Research Center, Morioka (Japan); Christensen, Soren [University of Melbourne, Royal Melbourne Hospital, Departments of Neurology and Radiology, Victoria (Australia); Oestergaard, Leif [Aarhus University Hospital, Department of Neuroradiology, Center for Functionally Integrative Neuroscience, DK, Aarhus C (Denmark); Ogasawara, Kuniaki; Ogawa, Akira [Iwate Medical University, Department of Neurosurgery, Morioka (Japan)

    2012-05-15

    Computed tomography perfusion (CTP) and magnetic resonance perfusion (MRP) are expected to be usable for ancillary tests of brain death by detection of complete absence of cerebral perfusion; however, the detection limit of hypoperfusion has not been determined. Hence, we examined whether commercial software can visualize very low cerebral blood flow (CBF) and cerebral blood volume (CBV) by creating and using digital phantoms. Digital phantoms simulating 0-4% of normal CBF (60 mL/100 g/min) and CBV (4 mL/100 g) were analyzed by ten software packages of CT and MRI manufacturers. Region-of-interest measurements were performed to determine whether there was a significant difference between areas of 0% and areas of 1-4% of normal flow. The CTP software detected hypoperfusion down to 2-3% in CBF and 2% in CBV, while the MRP software detected hypoperfusion down to 1-3% in CBF and 1-4% in CBV, although the lower limits varied among software packages. CTP and MRP can distinguish profound hypoperfusion of <5% from 0% perfusion in digital phantoms, suggesting their potential efficacy for assessing brain death. (orig.)

  9. A study on the performance advancement of test algorithms for defects in semiconductor packages

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Yeol; Kim, Chang Hyun; Yang, Dong Jo; Ko, Myung Soo [Chosun University, Gwangju (Korea, Republic of); You, Sin [Computer Added Mechanical Engineering, Mokpo Science College, Mokpo (Korea, Republic of)

    2002-11-15

    In this study, the classification of artificial flaws in semiconductor packages was performed using pattern recognition technology. For this purpose, an image pattern recognition package including user-made software was developed, and the total procedure, including ultrasonic image acquisition, equalization filtering, binarization, edge detection and classifier design, was handled with a backpropagation neural network. In particular, various weight settings of the backpropagation neural network were compared, as were threshold levels for edge detection in the preprocessing stage ahead of the multi-layer perceptron (backpropagation neural network). The pattern recognition technique was then applied to the problem of classifying defects in semiconductor packages as normal, crack or delamination. According to these results, a recognition rate of 100% could be obtained with the backpropagation neural network.
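
    A hedged sketch of such a pipeline is given below: binarise an image, extract simple edge-based features, and train a backpropagation multi-layer perceptron; the synthetic images, feature choices and classifier settings are placeholders, not the authors' data or exact preprocessing.

      import numpy as np
      from scipy import ndimage
      from sklearn.neural_network import MLPClassifier

      def edge_features(img, threshold=0.5):
          binary = img > threshold                      # binarization step
          edges = ndimage.sobel(binary.astype(float))   # simple edge detection
          return np.array([binary.mean(), np.abs(edges).mean(), np.abs(edges).std()])

      rng = np.random.default_rng(1)
      labels = {0: "normal", 1: "crack", 2: "delamination"}

      # Synthetic 32x32 "ultrasonic images": each class gets a different texture scale.
      X, y = [], []
      for cls in labels:
          for _ in range(30):
              img = ndimage.gaussian_filter(rng.random((32, 32)), sigma=1 + cls)
              X.append(edge_features(img))
              y.append(cls)

      # Backpropagation multi-layer perceptron classifier on the extracted features.
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(X, y)
      print("training accuracy:", clf.score(X, y))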

  10. Cost optimization of load carrying thin-walled precast high performance concrete sandwich panels

    DEFF Research Database (Denmark)

    Hodicky, Kamil; Hansen, Sanne; Hulin, Thomas

    2015-01-01

    The paper describes a procedure to find the structurally and thermally efficient design of load-carrying thin-walled precast High Performance Concrete Sandwich Panels (HPCSP) with an optimal economical solution. A systematic optimization approach is based on the selection of material performances and HPCSP's geometrical parameters, as well as on the material cost function in the HPCSP design. Cost functions are presented for High Performance Concrete (HPC), the insulation layer and reinforcement, and include labour-related costs. The present study reports the economic data corresponding to specific manufacturing … The solution of the optimization problem is performed in the software package Matlab® with the SQPlab package and integrates the processes of HPCSP design, quantity take-off and cost estimation. The proposed optimization process results in complex HPCSP design proposals that achieve minimum cost of HPCSP.
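
    As a hedged illustration of the kind of constrained cost minimisation described above, the sketch below uses SciPy's SLSQP (in place of Matlab/SQPlab) to minimise a panel cost subject to a thermal transmittance limit; all cost coefficients, material properties and bounds are invented for the example.

      import numpy as np
      from scipy.optimize import minimize

      # Design variables: concrete wythe thickness t_c and insulation thickness t_i (m).
      cost_concrete, cost_insulation = 900.0, 120.0   # assumed currency units per m3
      k_concrete, k_insulation = 2.0, 0.035           # W/(m K), typical conductivities
      u_limit = 0.15                                  # W/(m2 K), required transmittance

      def cost(x):
          t_c, t_i = x
          return cost_concrete * 2 * t_c + cost_insulation * t_i   # two HPC wythes + core

      def u_value(x):
          t_c, t_i = x
          return 1.0 / (2 * t_c / k_concrete + t_i / k_insulation)

      res = minimize(
          cost,
          x0=[0.05, 0.20],
          method="SLSQP",
          bounds=[(0.03, 0.12), (0.05, 0.40)],
          constraints=[{"type": "ineq", "fun": lambda x: u_limit - u_value(x)}],
      )
      print(res.x, cost(res.x), u_value(res.x))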

  11. Waste package performance in unsaturated rock

    International Nuclear Information System (INIS)

    Pigford, T.H.; Lee, W.W.-L.

    1989-03-01

    The unsaturated rock and near-atmospheric pressure of the potential nuclear waste repository at Yucca Mountain present new problems of predicting waste package performance. In this paper we present some illustrations of predictions of waste package performance and discuss important data needs. 11 refs., 9 figs., 1 tab

  12. PAINeT: An object-oriented software package for simulations of flow-field, transport coefficients and flux terms in non-equilibrium gas mixture flows

    Science.gov (United States)

    Istomin, V. A.

    2018-05-01

    The software package Planet Atmosphere Investigator of Non-equilibrium Thermodynamics (PAINeT) has been developed for studying the non-equilibrium effects associated with electronic excitation, chemical reactions and ionization. These studies are necessary for modeling processes in shock tubes, in high-enthalpy flows, in nozzles or jet engines, in combustion and explosion processes, and in modern plasma-chemical and laser technologies. The advantages and possibilities of the package implementation are stated. Within the framework of the package implementation, based on kinetic theory approximations (one-temperature and state-to-state approaches), calculations are carried out, and the limits of applicability of a simplified description of shock-heated air flows and any other mixtures chosen by the user are given. Using kinetic theory algorithms, a numerical calculation of the heat fluxes and relaxation terms can be performed, which is necessary for further comparison of engineering simulation with experimental data. The influence of state-to-state distributions over electronic energy levels on the coefficients of thermal conductivity, diffusion, heat fluxes and diffusion velocities of the components of various gas mixtures behind shock waves is studied. Using the software package, the accuracy of different approximations of the kinetic theory of gases is estimated. As an example, a state-resolved ionized atomic mixture of N/N+/O/O+/e- is considered. It is shown that state-resolved diffusion coefficients of neutral and ionized species vary from level to level. Comparing results of engineering applications with those given by PAINeT, recommendations for adequate model selection are proposed.

  13. GERMINATOR: a software package for high-throughput scoring and curve fitting of Arabidopsis seed germination.

    Science.gov (United States)

    Joosen, Ronny V L; Kodde, Jan; Willems, Leo A J; Ligterink, Wilco; van der Plas, Linus H W; Hilhorst, Henk W M

    2010-04-01

    Over the past few decades seed physiology research has contributed to many important scientific discoveries and has provided valuable tools for the production of high quality seeds. An important instrument for this type of research is the accurate quantification of germination; however gathering cumulative germination data is a very laborious task that is often prohibitive to the execution of large experiments. In this paper we present the germinator package: a simple, highly cost-efficient and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The germinator package contains three modules: (i) design of experimental setup with various options to replicate and randomize samples; (ii) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (iii) curve fitting of cumulative germination data and the extraction, recap and visualization of the various germination parameters. The curve-fitting module enables analysis of general cumulative germination data and can be used for all plant species. We show that the automatic scoring system works for Arabidopsis thaliana and Brassica spp. seeds, but is likely to be applicable to other species, as well. In this paper we show the accuracy, reproducibility and flexibility of the germinator package. We have successfully applied it to evaluate natural variation for salt tolerance in a large population of recombinant inbred lines and were able to identify several quantitative trait loci for salt tolerance. Germinator is a low-cost package that allows the monitoring of several thousands of germination tests, several times a day by a single person.
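
    The curve-fitting module described above can be illustrated with a hedged sketch: fitting a four-parameter Hill-type curve to invented cumulative germination counts and reading off summary parameters such as maximum germination and t50. Whether this matches the exact parameterisation used by the germinator package is an assumption.

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(t, gmax, t50, b, y0):
          """Cumulative germination (%) as a function of time t (hours)."""
          return y0 + (gmax - y0) * t**b / (t50**b + t**b)

      # Invented cumulative germination data for one seed batch.
      t = np.array([0, 12, 24, 36, 48, 60, 72, 96, 120], dtype=float)
      germ = np.array([0, 2, 10, 35, 62, 80, 88, 92, 93], dtype=float)

      popt, _ = curve_fit(hill, t, germ, p0=[95, 40, 4, 0], maxfev=10000)
      gmax, t50, b, y0 = popt
      print(f"Gmax = {gmax:.1f} %, t50 = {t50:.1f} h, slope b = {b:.2f}")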

  14. Lenstronomy: Multi-purpose gravitational lens modeling software package

    Science.gov (United States)

    Birrer, Simon; Amara, Adam

    2018-04-01

    Lenstronomy is a multi-purpose open-source gravitational lens modeling python package. Lenstronomy reconstructs the lens mass and surface brightness distributions of strong lensing systems using forward modelling and supports a wide range of analytic lens and light models in arbitrary combination. The software is also able to reconstruct complex extended sources as well as point sources. Lenstronomy is flexible and numerically accurate, with a clear user interface that could be deployed across different platforms. Lenstronomy has been used to derive constraints on dark matter properties in strong lenses, measure the expansion history of the universe with time-delay cosmography, measure cosmic shear with Einstein rings, and decompose quasar and host galaxy light.
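
    The forward-modelling idea can be illustrated with a generic, hedged ray-shooting sketch in plain NumPy (a singular isothermal sphere deflector and a Gaussian source); it deliberately does not use lenstronomy's own API, and the Einstein radius, grid and source parameters are illustrative choices.

      import numpy as np

      theta_E = 1.2                       # Einstein radius in arcsec (assumed)
      grid = np.linspace(-3, 3, 200)
      x, y = np.meshgrid(grid, grid)      # image-plane coordinates (arcsec)

      r = np.hypot(x, y) + 1e-12
      alpha_x, alpha_y = theta_E * x / r, theta_E * y / r   # SIS deflection angles

      # Lens equation: beta = theta - alpha, mapping image-plane pixels to the source plane.
      beta_x, beta_y = x - alpha_x, y - alpha_y

      # A circular Gaussian source evaluated at the ray-traced positions gives the lensed
      # surface brightness (an Einstein-ring-like image for a nearly centred source).
      sigma_src = 0.2
      image = np.exp(-((beta_x - 0.1)**2 + beta_y**2) / (2 * sigma_src**2))
      print(image.shape, image.max())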

  15. The consequences of a new software package for the quantification of gated-SPECT myocardial perfusion studies

    International Nuclear Information System (INIS)

    Veen, Berlinda J. van der; Dibbets-Schneider, Petra; Stokkel, Marcel P.M.; Scholte, Arthur J.

    2010-01-01

    Semiquantitative analysis of myocardial perfusion scintigraphy (MPS) has reduced inter- and intraobserver variability, and enables researchers to compare parameters in the same patient over time, or between groups of patients. There are several software packages available that are designed to process MPS data and quantify parameters. In this study the performances of two systems, quantitative gated SPECT (QGS) and 4D-MSPECT, in the processing of clinical patient data and phantom data were compared. The clinical MPS data of 148 consecutive patients were analysed using QGS and 4D-MSPECT to determine the end-diastolic volume, end-systolic volume and left ventricular ejection fraction. Patients were divided into groups based on gender, body mass index, heart size, stressor type and defect type. The AGATE dynamic heart phantom was used to provide reference values for the left ventricular ejection fraction. Although the correlations were excellent (correlation coefficients 0.886 to 0.980) for all parameters, significant differences (p < 0.001) were found between the systems. Bland-Altman plots indicated that 4D-MSPECT provided overall higher values of all parameters than QGS. These differences between the systems were not significant in patients with a small heart (end-diastolic volume <70 ml). Other clinical factors had no direct influence on the relationship. Additionally, the phantom data indicated good linear responses of both systems. The discrepancies between these software packages were clinically relevant, and influenced by heart size. The possibility of such discrepancies should be taken into account when a new quantitative software system is introduced, or when multiple software systems are used in the same institution. (orig.)
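
    The agreement statistics referred to above can be illustrated with a small, hedged Bland-Altman calculation on invented paired left ventricular ejection fraction (LVEF) values; the numbers below are placeholders, not study data.

      import numpy as np

      lvef_qgs = np.array([55, 62, 48, 70, 35, 58, 66, 42], dtype=float)
      lvef_4dm = np.array([58, 66, 50, 75, 38, 63, 70, 44], dtype=float)

      diff = lvef_4dm - lvef_qgs
      bias = diff.mean()                 # systematic offset between the two packages
      loa = 1.96 * diff.std(ddof=1)      # 95% limits of agreement: bias +/- 1.96 SD

      print(f"bias = {bias:.1f} EF points, limits of agreement = ({bias - loa:.1f}, {bias + loa:.1f})")
      print("correlation:", np.corrcoef(lvef_qgs, lvef_4dm)[0, 1].round(3))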

  16. UES: an optimization software package for power and energy

    International Nuclear Information System (INIS)

    Vohryzek, J.; Havlena, V.; Findejs, J.; Jech, J.

    2004-01-01

    Unified Energy Solutions components are designed to meet specific requirements of electric utilities, industrial power units, and district heating (combined heat and power) plants. The optimization objective is to operate the plant with maximum process efficiency and operational profit under the constraints imposed by technology and environmental impacts. Software applications for advanced control and real-time optimization may provide a low-cost, high-return alternative to expensive boiler retrofits for improving operational profit as well as reducing emissions. The Unified Energy Solutions (UES) software package is a portfolio of advanced control and optimization components running on top of the standard process regulatory and control system. The objective of the UES is to operate the plant with maximum achievable profit (maximum efficiency) under the constraints imposed by technology (lifetime consumption, asset health) and environmental impacts (CO and NOx emissions). Fast responsiveness to varying economic conditions and integration of real-time optimization and operator decision support (off-line) features are critical for operation in a real-time economy. Optimization features target the combustion process, heat and power load allocation to parallel resources, electric power delivery and ancillary services. Optimization criteria include increased boiler thermal efficiency, compliance with emission limits, and economic load allocation of the heat and power generation sources. State-of-the-art advanced control algorithms use model-based predictive control principles and provide superior response in transient states. Individual software modules support open control platforms and communication protocols. UES can be implemented on a wide range of distributed control systems. Typical achievable benefits include heat and power production cost savings, increased effective boiler operation range, optimized flue gas emissions, optimized production capacity utilization, optimized
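
    The economic load-allocation step mentioned above can be illustrated with a deliberately simplified sketch: a linear program that splits a fixed heat demand across parallel boilers at minimum fuel cost. All costs, limits and the demand figure are invented and stand in for the plant models UES would actually use.

      import numpy as np
      from scipy.optimize import linprog

      fuel_cost = np.array([22.0, 25.0, 30.0])     # currency per MWh of heat from each boiler
      lo = np.array([10.0, 5.0, 0.0])              # minimum stable loads (MWth)
      hi = np.array([60.0, 40.0, 30.0])            # maximum loads (MWth)
      demand = 85.0                                # total heat demand (MWth)

      res = linprog(
          c=fuel_cost,
          A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],   # allocated loads must sum to the demand
          bounds=list(zip(lo, hi)),
          method="highs",
      )
      print("allocation (MWth):", res.x.round(1), " cost:", round(res.fun, 1))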

  17. Package-based software development

    NARCIS (Netherlands)

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  18. Using packaged software for solving two differential equation problems that arise in plasma physics

    International Nuclear Information System (INIS)

    Gaffney, P.W.

    1980-01-01

    Experience in using packaged numerical software for solving two related problems that arise in Plasma physics is described. These problems are (i) the solution of the reduced resistive MHD equations and (ii) the solution of the Grad-Shafranov equation
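
    For reference, the Grad-Shafranov equation in its standard axisymmetric textbook form (not quoted from the report) is

      R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial \psi}{\partial R}\right)
        + \frac{\partial^{2}\psi}{\partial Z^{2}}
        = -\,\mu_{0} R^{2}\,\frac{dp}{d\psi} \;-\; F\,\frac{dF}{d\psi},

    where ψ(R, Z) is the poloidal flux function, p(ψ) the plasma pressure and F(ψ) = R B_φ the poloidal current function.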

  19. Counting radon tracks in Makrofol detectors with the 'image reduction and analysis facility' (IRAF) software package

    International Nuclear Information System (INIS)

    Hernandez, F.; Gonzalez-Manrique, S.; Karlsson, L.; Hernandez-Armas, J.; Aparicio, A.

    2007-01-01

    Makrofol detectors are commonly used for long-term radon (²²²Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires counting the nuclear tracks produced by alpha particles on the detecting material. The 'image reduction and analysis facility' (IRAF) software package is a piece of software commonly used in astronomical applications. It allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity that exists between counting stars in a dark sky and counting tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. A detailed description of the proposed semi-automatic method and its performance, in comparison to ocular counting, is given here. In addition, the calibration factor for this procedure, 2.97 ± 0.07 kBq m⁻³ h track⁻¹ cm², has been calculated based on the results obtained from exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m⁻³ and, in two of them, above 400 Bq m⁻³. Further studies should be performed at those schools following the European Union recommendations on radon concentrations in buildings.
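
    As a hedged numerical illustration of how the quoted calibration factor is applied, the snippet below converts a counted track density into an average radon concentration; only the factor 2.97 kBq m⁻³ h track⁻¹ cm² comes from the text, while the track density and exposure time are invented.

      CAL_FACTOR = 2.97          # kBq m^-3 h per (track cm^-2), value quoted in the abstract
      track_density = 95.0       # net tracks per cm^2 counted with IRAF (hypothetical)
      exposure_hours = 90 * 24   # ~3-month exposure (hypothetical)

      exposure = CAL_FACTOR * track_density              # radon exposure in kBq m^-3 h
      concentration = exposure / exposure_hours * 1000   # average concentration in Bq m^-3
      print(f"average radon concentration ~ {concentration:.0f} Bq m^-3")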

  20. ImagePy: an open-source, Python-based and platform-independent software package for bioimage analysis.

    Science.gov (United States)

    Wang, Anliang; Yan, Xiaolong; Wei, Zhijun

    2018-04-27

    This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution concentrates on facilitating extensibility and interoperability of the software by decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open-source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. Contact: wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.

  1. Waste package performance allocation system study report

    International Nuclear Information System (INIS)

    Memory, R.D.

    1994-01-01

    The Waste Package Performance Allocation system study was performed in order to provide a technical basis for the selection of the waste package period of substantially complete containment and its resultant contribution to the overall total system performance. This study began with a reference case based on the current Mined Geologic Disposal System (MGDS) baseline design and added a number of alternative designs. The waste package designs were selected from the designs being considered in detail during Advanced Conceptual Design (ACD). The waste packages considered were multi-barrier packages with a 0.95 cm Alloy 825 inner barrier and a 10, 20, or 45 cm thick carbon steel outer barrier. The waste package capacities varied from 6 to 12 to 21 Pressurized Water Reactor (PWR) fuel assemblies. The vertical borehole and in-drift emplacement modes were also considered, as were thermal loadings of 25, 57, and 114 kW/acre. The repository cost analysis indicated that the 21 PWR in-drift emplacement mode option with the 10 cm and 20 cm outer barrier thicknesses are the least expensive and that the 12 PWR in-drift case has approximately the same cost as the 6 PWR vertical borehole. It was also found that the cost increase from the 10 cm outer barrier waste package to the 20 cm waste package was less per centimeter than the increase from the 20 cm outer barrier waste package to the 45 cm outer barrier waste package. However, the repository cost was nearly linear with the outer barrier thickness for the 21 PWR in-drift case. Finally, corrosion rate estimates are provided and the relationship of repository cost versus waste package lifetime is discussed as is cumulative radionuclide release from the waste package and to the accessible environment for time periods of 10,000 years and 100,000 years

  2. Development of New Low-Cost, High-Performance, PV Module Encapsulant/Packaging Materials: Final Technical Progress Report, 22 October 2002 - 15 November 2007

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, R.

    2008-04-01

    Report on objectives to work with U.S.-based PV module manufacturers (c-Si, a-Si, CIS, other thin films) to develop/qualify new low-cost, high-performance PV module encapsulant/packaging materials, and processes using the packaging materials.

  3. Multi-Language Programming Environments for High Performance Java Computing

    Directory of Open Access Journals (Sweden)

    Vladimir Getov

    1999-01-01

    Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community's reliance on established scientific packages. As a consequence, programmers of high-performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java-to-C Interface (JCI) tool which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed-language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool is complementing other ongoing projects such as IBM's High-Performance Compiler for Java (HPCJ) and IceT's metacomputing environment.

  4. Comparison of computed tomography dose reporting software

    International Nuclear Information System (INIS)

    Abdullah, A.; Sun, Z.; Pongnapang, N.; Ng, K. H.

    2008-01-01

    Computed tomography (CT) dose reporting software facilitates the estimation of doses to patients undergoing CT examinations. In this study, three software packages, i.e. CT-Expo (version 1.5, Medizinische Hochschule, Hannover (Germany)), ImPACT CT Patient Dosimetry Calculator (version 0.99x, Imaging Performance Assessment on Computed Tomography, www.impactscan.org) and WinDose (version 2.1a, Wellhofer Dosimetry, Schwarzenbruck (Germany)), were compared in terms of their calculation algorithms and the resulting dose estimates. Estimations were performed for head, chest, abdominal and pelvic examinations based on the protocols recommended by European guidelines, using single-slice CT (SSCT) (Siemens Somatom Plus 4, Erlangen (Germany)) and multi-slice CT (MSCT) (Siemens Sensation 16, Erlangen (Germany)) for software-based female and male phantoms. The results showed differences in the final doses reported by these software packages, with deviations between their effective dose estimates. Coefficients of variation range from 3.3 to 23.4 % in SSCT and from 10.6 to 43.8 % in MSCT. It is important that researchers state the name of the software used to estimate the various CT dose quantities. Users must also understand the equivalent terminologies between the information obtained from the CT console and the software packages in order to use the software correctly. (authors)
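
    As a hedged illustration of the spread statistic quoted above, the snippet below computes a coefficient of variation for one protocol across the three packages; the dose values are invented placeholders.

      import numpy as np

      # Hypothetical effective doses (mSv) from CT-Expo, ImPACT and WinDose for one protocol.
      effective_dose_mSv = np.array([5.6, 6.4, 7.9])
      cov = effective_dose_mSv.std(ddof=1) / effective_dose_mSv.mean() * 100
      print(f"coefficient of variation = {cov:.1f} %")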

  5. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    Science.gov (United States)

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
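
    As a hedged illustration of the agreement analysis described above, the sketch below computes a two-way random-effects, absolute-agreement intraclass correlation coefficient (Shrout-Fleiss ICC(2,1)) by hand for invented paired portal-perfusion values; in practice a validated statistics package would be preferred.

      import numpy as np

      # rows = patients, columns = software packages A and B (portal perfusion, hypothetical)
      x = np.array([
          [78, 90], [65, 70], [82, 95], [70, 66], [88, 101],
          [74, 80], [69, 75], [91, 97], [60, 58], [77, 85],
      ], dtype=float)

      n, k = x.shape
      grand = x.mean()
      ss_total = ((x - grand) ** 2).sum()
      ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between-patient variation
      ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between-software variation
      ms_rows = ss_rows / (n - 1)
      ms_cols = ss_cols / (k - 1)
      ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))

      icc_2_1 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
      print(f"ICC(2,1) = {icc_2_1:.2f}")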

  6. The software quality control for gamma spectrometry

    International Nuclear Information System (INIS)

    Monte, L.

    1986-01-01

    One of the major problems with which the quality control program of an environmental measurements laboratory is confronted is the evaluation of the performance of software packages for the analysis of gamma-ray spectra. A program of tests for evaluating the performance of the software package (SPECTRAN-F, Canberra Inc.) used by our laboratory is being carried out. In this first paper the results of a preliminary study concerning the evaluation of the performance of the doublet analysis routine are presented.

  7. System software design for the CDF Silicon Vertex Detector

    Energy Technology Data Exchange (ETDEWEB)

    Tkaczyk, S. (Fermi National Accelerator Lab., Batavia, IL (United States)); Bailey, M. (Purdue Univ., Lafayette, IN (United States))

    1991-11-01

    An automated system for testing and performance evaluation of the CDF Silicon Vertex Detector (SVX) data acquisition electronics is described. The SVX data acquisition chain includes the Fastbus Sequencer and the Rabbit Crate Controller and Digitizers. The Sequencer is a programmable device for which we developed a high level assembly language. Diagnostic, calibration and data acquisition programs have been developed. A distributed software package was developed in order to operate the modules. The package includes programs written in assembly and Fortran languages that are executed concurrently on the SVX Sequencer modules and either a microvax or an SSP. Test software was included to assist technical personnel during the production and maintenance of the modules. Details of the design of different components of the package are reported.

  8. System software design for the CDF Silicon Vertex Detector

    International Nuclear Information System (INIS)

    Tkaczyk, S.; Bailey, M.

    1991-11-01

    An automated system for testing and performance evaluation of the CDF Silicon Vertex Detector (SVX) data acquisition electronics is described. The SVX data acquisition chain includes the Fastbus Sequencer and the Rabbit Crate Controller and Digitizers. The Sequencer is a programmable device for which we developed a high level assembly language. Diagnostic, calibration and data acquisition programs have been developed. A distributed software package was developed in order to operate the modules. The package includes programs written in assembly and Fortran languages that are executed concurrently on the SVX Sequencer modules and either a microvax or an SSP. Test software was included to assist technical personnel during the production and maintenance of the modules. Details of the design of different components of the package are reported

  9. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    International Nuclear Information System (INIS)

    Tso, C.F.; Hueggenberg, R.

    2004-01-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: (i) a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages, and to produce an inventory of them; (ii) development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and (iii) evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work.

  10. Evaluation of finite element codes for demonstrating the performance of radioactive material packages in hypothetical accident drop scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Tso, C.F. [Arup (United Kingdom); Hueggenberg, R. [Gesellschaft fuer Nuklear-Behaelter mbH (Germany)

    2004-07-01

    Drop testing and analysis are the two methods for demonstrating the performance of packages in hypothetical drop accident scenarios. The exact purpose of the tests and the analyses, and the relative prominence of the two in the license application, may depend on the Competent Authority and will vary between countries. The Finite Element Method (FEM) is a powerful analysis tool. A reliable finite element (FE) code, when used correctly and appropriately, will allow a package's behaviour to be simulated reliably. With improvements in computing power, and in sophistication and reliability of FE codes, it is likely that FEM calculations will increasingly be used as evidence of drop test performance when seeking Competent Authority approval. What is lacking at the moment, however, is a standardised method of assessing an FE code in order to determine whether it is sufficiently reliable or pessimistic. To this end, the project Evaluation of Codes for Analysing the Drop Test Performance of Radioactive Material Transport Containers, funded by the European Commission Directorate-General XVII (now Directorate-General for Energy and Transport) and jointly performed by Arup and Gesellschaft fuer Nuklear-Behaelter mbH, was carried out in 1998. The work consisted of three components: (i) a survey of existing finite element software, with a view to finding codes that may be capable of analysing the drop test performance of radioactive material packages, and to produce an inventory of them; (ii) development of a set of benchmark problems to evaluate software used for analysing the drop test performance of packages; and (iii) evaluation of the finite element codes by testing them against the benchmarks. This paper presents a summary of this work.

  11. FRAMES Software System: Linking to the Statistical Package R

    Energy Technology Data Exchange (ETDEWEB)

    Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.

    2006-12-11

    This document provides requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black box testing) actually functions as designed, and QA/QC confirms that the software meets the client’s needs.

  12. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  13. Assessment of microelectronics packaging for high temperature, high reliability applications

    Energy Technology Data Exchange (ETDEWEB)

    Uribe, F.

    1997-04-01

    This report details characterization and development activities in electronic packaging for high temperature applications. This project was conducted through a Department of Energy sponsored Cooperative Research and Development Agreement between Sandia National Laboratories and General Motors. Even though the target application of this collaborative effort is an automotive electronic throttle control system which would be located in the engine compartment, results of this work are directly applicable to Sandia's national security mission. The component count associated with the throttle control dictates the use of high density packaging not offered by conventional surface mount. An enabling packaging technology was selected and thermal models defined which characterized the thermal and mechanical response of the throttle control module. These models were used to optimize thick film multichip module design, characterize the thermal signatures of the electronic components inside the module, and to determine the temperature field and resulting thermal stresses under conditions that may be encountered during the operational life of the throttle control module. Because the need to use unpackaged devices limits the level of testing that can be performed either at the wafer level or as individual dice, an approach to assure a high level of reliability of the unpackaged components was formulated. Component assembly and interconnect technologies were also evaluated and characterized for high temperature applications. Electrical, mechanical and chemical characterizations of enabling die and component attach technologies were performed. Additionally, studies were conducted to assess the performance and reliability of gold and aluminum wire bonding to thick film conductor inks. Kinetic models were developed and validated to estimate wire bond reliability.

  14. Telescoping Solar Array Concept for Achieving High Packaging Efficiency

    Science.gov (United States)

    Mikulas, Martin; Pappa, Richard; Warren, Jay; Rose, Geoff

    2015-01-01

    Lightweight, high-efficiency solar arrays are required for future deep space missions using high-power Solar Electric Propulsion (SEP). Structural performance metrics for state-of-the art 30-50 kW flexible blanket arrays recently demonstrated in ground tests are approximately 40 kW/cu m packaging efficiency, 150 W/kg specific power, 0.1 Hz deployed stiffness, and 0.2 g deployed strength. Much larger arrays with up to a megawatt or more of power and improved packaging and specific power are of interest to mission planners for minimizing launch and life cycle costs of Mars exploration. A new concept referred to as the Compact Telescoping Array (CTA) with 60 kW/cu m packaging efficiency at 1 MW of power is described herein. Performance metrics as a function of array size and corresponding power level are derived analytically and validated by finite element analysis. Feasible CTA packaging and deployment approaches are also described. The CTA was developed, in part, to serve as a NASA reference solar array concept against which other proposed designs of 50-1000 kW arrays for future high-power SEP missions could be compared.

  15. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.; Alshayeb, M.; Mahmoud, S. A.

    2011-01-01

    Enhancing, modifying or adapting software to new requirements increases its internal complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence
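
    A hedged sketch of the clustering idea behind package-level refactoring is shown below: classes are grouped by the similarity of their dependency profiles using hierarchical clustering, and the resulting clusters suggest candidate packages. The class names and dependency matrix are invented, and the abstract does not specify which clustering technique the authors used.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      classes = ["Order", "Invoice", "Payment", "User", "Session", "AuthToken"]
      # rows = classes, columns = "depends on" flags against a shared set of modules
      deps = np.array([
          [1, 1, 1, 0, 0],
          [1, 1, 0, 0, 0],
          [1, 0, 1, 0, 0],
          [0, 0, 0, 1, 1],
          [0, 0, 0, 1, 1],
          [0, 0, 0, 1, 0],
      ])

      dist = pdist(deps, metric="jaccard")      # dissimilarity of dependency profiles
      tree = linkage(dist, method="average")    # agglomerative clustering
      packages = fcluster(tree, t=2, criterion="maxclust")

      for cls, pkg in zip(classes, packages):
          print(f"{cls:10s} -> package {pkg}")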

  16. User’s Manual for the Simulation of Energy Consumption and Emissions from Rail Traffic Software Package

    DEFF Research Database (Denmark)

    Cordiero, Tiago M.; Lindgreen, Erik Bjørn Grønning; Sorenson, Spencer C

    2005-01-01

    The ARTEMIS rail emissions model was implemented in a Microsoft Excel software package that includes data from the GISCO database on railway traffic. This report is the user’s manual for the aforementioned software that includes information on how to run the program and an overview on how...... of Excel Macros (Visual Basic) and database sheets included in one Excel file...

  17. [Development of analysis software package for the two kinds of Japanese fluoro-d-glucose-positron emission tomography guideline].

    Science.gov (United States)

    Matsumoto, Keiichi; Endo, Keigo

    2013-06-01

    Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy was examined for numerous applications of PETquact. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing the two PET sinograms obtained from PETquact and from the report. PETquact is suited to analysis under the two kinds of Japanese guideline, and it shows excellent capability for performance measurements and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.

  18. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Dinter, T.; Rozanov, A.V.; Wolanin, A.; Bracher, A.; Burrows, J.P.

    2017-01-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which takes into account accurately inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean–atmosphere radiative transfer solver presented by Rozanov et al. we have implemented the simulation of the rotational Raman scattering, vibrational Raman scattering, chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with those published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package along with a detailed User's Guide is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: (http://www.iup.physik.uni-bremen.de). - Highlights: • A new version of the software package SCIATRAN is presented. • Inelastic scattering in water and atmosphere is implemented in SCIATRAN. • Raman scattering and fluorescence can be included in radiative transfer calculations. • Comparisons to other radiative transfer models show excellent agreement. • Comparisons to observations show consistent results.

  19. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.

  20. Integrated software package for nuclear material safeguards in a MOX fuel fabrication facility

    International Nuclear Information System (INIS)

    Schreiber, H.J.; Piana, M.; Moussalli, G.; Saukkonen, H.

    2000-01-01

    Since computerized data processing was introduced to Safeguards at large bulk handling facilities, a large number of individual software applications have been developed for nuclear material Safeguards implementation. Facility inventory and flow data are provided in computerized format for performing stratification, sample size calculation and selection of samples for destructive and non-destructive assay. Data are collected from nuclear measurement systems running in attended or unattended mode and, more recently, from remotely controlled monitoring systems. Data sets from various sources have to be evaluated for Safeguards purposes, such as raw data, processed data and conclusions drawn from data evaluation results. They are reported in computerized format at the International Atomic Energy Agency headquarters, and feedback from the Agency's mainframe computer system is used to prepare and support Safeguards inspection activities. The integration of all such data originating from various sources cannot be ensured without the existence of a common data format and a database system. This paper describes the fundamental relations between data streams, individual data processing tools, data evaluation results and requirements for an integrated software solution to facilitate nuclear material Safeguards at a bulk handling facility. The paper also explains the basis for designing a software package to manage data streams from various data sources and for incorporating diverse data processing tools that until now have been used independently from each other and under different computer operating systems. (author)

  1. Determination of stress-strain state of the wooden church log walls with software package

    Directory of Open Access Journals (Sweden)

    Chulkova Anastasia

    2016-01-01

    The restoration of architectural monuments is going on all over the world today. The main aim of restoration is to restore the stable functioning of building structures in their normal state. In this article, we use specialized software to determine the bearing capacity of the log walls of the Church of the Transfiguration on Kizhi Island. As the research results show, determination of the stress-strain state with a software package is necessary for the bearing-capacity computation, in addition to field tests.

  2. Thermal analysis of Yucca Mountain commercial high-level waste packages

    International Nuclear Information System (INIS)

    Altenhofen, M.K.; Eslinger, P.W.

    1992-10-01

    The thermal performance of commercial high-level waste packages was evaluated on a preliminary basis for the candidate Yucca Mountain repository site. The purpose of this study is to provide an estimate for waste package component temperatures as a function of isolation time in tuff. Several recommendations are made concerning the additional information and modeling needed to evaluate the thermal performance of the Yucca Mountain repository system

  3. The EQ3/6 software package for geochemical modeling: Current status

    International Nuclear Information System (INIS)

    Wolery, T.J.; Jackson, K.J.; Bourcier, W.L.; Bruton, C.J.; Viani, B.E.; Knauss, K.G.; Delany, J.M.

    1988-07-01

    EQ3/6 is a software package for modeling chemical and mineralogic interactions in aqueous geochemical systems. The major components of the package are EQ3NR (a speciation-solubility code), EQ6 (a reaction path code), EQLIB (a supporting library), and a supporting thermodynamic data base. EQ3NR calculates aqueous speciation and saturation indices from analytical data. It can also be used to calculate compositions of buffer solutions for use in laboratory experiments. EQ6 computes reaction path models of both equilibrium step processes and kinetic reaction processes. These models can be computed for closed systems and relatively simple open systems. EQ3/6 is useful in making purely theoretical calculations, in designing, interpreting, and extrapolating laboratory experiments, and in testing and developing submodels and supporting data used in these codes. The thermodynamic data base supports calculations over the range 0-300 °C. 60 refs., 2 figs

  4. The EQ3/6 software package for geochemical modeling: Current status

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.; Jackson, K.J.; Bourcier, W.L.; Bruton, C.J.; Viani, B.E.; Knauss, K.G.; Delany, J.M.

    1988-07-01

    EQ3/6 is a software package for modeling chemical and mineralogic interactions in aqueous geochemical systems. The major components of the package are EQ3NR (a speciation-solubility code), EQ6 (a reaction path code), EQLIB (a supporting library), and a supporting thermodynamic data base. EQ3NR calculates aqueous speciation and saturation indices from analytical data. It can also be used to calculate compositions of buffer solutions for use in laboratory experiments. EQ6 computes reaction path models of both equilibrium step processes and kinetic reaction processes. These models can be computed for closed systems and relatively simple open systems. EQ3/6 is useful in making purely theoretical calculations, in designing, interpreting, and extrapolating laboratory experiments, and in testing and developing submodels and supporting data used in these codes. The thermodynamic data base supports calculations over the range 0-300 °C. 60 refs., 2 figs.

  5. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Energy Technology Data Exchange (ETDEWEB)

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component to large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find that single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.

  6. ldr: An R Software Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    Kofi Placid Adragni

    2014-11-01

    Full Text Available In regression settings, a sufficient dimension reduction (SDR) method seeks the core information in a p-vector predictor that completely captures its relationship with a response. The reduced predictor may reside in a lower dimension d < p, improving ability to visualize data and predict future observations, and mitigating dimensionality issues when carrying out further analysis. We introduce ldr, a new R software package that implements three recently proposed likelihood-based methods for SDR: covariance reduction, likelihood acquired directions, and principal fitted components. All three methods reduce the dimensionality of the data by projection into lower dimensional subspaces. The package also implements a variable screening method built upon principal fitted components which makes use of flexible basis functions to capture the dependencies between the predictors and the response. Examples are given to demonstrate likelihood-based SDR analyses using ldr, including estimation of the dimension of reduction subspaces and selection of basis functions. The ldr package provides a framework that we hope to grow into a comprehensive library of likelihood-based SDR methodologies.

  7. An Object-Oriented Serial DSMC Simulation Package

    Science.gov (United States)

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure written in C++, is implemented in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.

  8. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    Science.gov (United States)

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
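
    The abstract notes that SIMPATIQCO learns, for each QC metric, the range indicating adequate system performance from uploaded data using robust statistics. One common robust choice is the median plus or minus a multiple of the (scaled) median absolute deviation; the sketch below only illustrates that general idea and is not SIMPATIQCO's actual algorithm. The metric history values are made-up example inputs.

        # Illustrative robust control band for a QC metric (not SIMPATIQCO code).
        import statistics

        def robust_band(values, k=3.0):
            """Return (low, high) as median +/- k * scaled MAD."""
            med = statistics.median(values)
            mad = statistics.median(abs(v - med) for v in values)
            spread = 1.4826 * mad  # scale MAD to be comparable to a standard deviation
            return med - k * spread, med + k * spread

        history = [21.8, 22.1, 22.0, 21.9, 22.3, 22.0, 21.7, 22.2]  # e.g. peak widths (s)
        low, high = robust_band(history)
        new_run = 24.5
        print("outlier" if not (low <= new_run <= high) else "ok")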

  9. Deployment of the CMS software on the WLCG Grid

    International Nuclear Information System (INIS)

    Behrenhoff, W; Wissing, C; Kim, B; Blyweert, S; D'Hondt, J; Maes, J; Maes, M; Mulders, P Van; Villella, I; Vanelderen, L

    2011-01-01

    The CMS Experiment is taking high energy collision data at CERN. The computing infrastructure used to analyse the data is distributed around the world in a tiered structure. In order to use the 7 Tier-1 sites, the 50 Tier-2 sites and a still growing number of about 30 Tier-3 sites, the CMS software has to be available at those sites. Except for a very few sites, the deployment and removal of CMS software are managed centrally. Since the deployment team has no local accounts at the remote sites, all installation tasks have to be sent as Grid jobs. Via a VOMS role, the job has high priority in the batch system and gains write privileges to the software area. Due to the lack of interactive access, the installation jobs must be very robust against possible failures, in order not to leave a broken software installation. The CMS software is packaged in RPMs that are installed in the software area independent of the host OS. The apt-get tool is used to resolve package dependencies. This paper reports on recent deployment experiences and the achieved performance.

  10. A user's guide to the GoldSim/BLT-MS integrated software package:a low-level radioactive waste disposal performance assessment model

    International Nuclear Information System (INIS)

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-01-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years experience in the assessment of radioactive waste disposal and at the time of this publication is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software, which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference users guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low

  11. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  12. Software Packages to Support Electrical Engineering Virtual Lab

    Directory of Open Access Journals (Sweden)

    Manuel Travassos Valdez

    2012-03-01

    Full Text Available The use of Virtual Reality Systems (VRS), as a learning aid, encourages the creation of tools that allow users/students to simulate educational environments on a computer. This article presents a way of building a VRS system with Software Packages to support Electrical Engineering Virtual Laboratories, to be used in the near future in teaching the curriculum unit of Circuit Theory. The steps required for the construction of such a project are presented in this paper. The simulation is still under construction and is intended to use a three-dimensional virtual environment of an electrical measurement laboratory, which will allow users/students to experiment with and test the modeled equipment. Therefore, there are still no links available for further examination. The result may demonstrate the future potential of Virtual Reality Systems as an efficient and cost-effective learning system.

  13. Advanced Modular Software Performance Monitoring

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The LHCb software is based on the Gaudi framework, on top of which are built several large and complex software applications. The LHCb experiment is now in the active phase of collecting and analyzing data and significant performance problems arise in the Gaudi based software beginning from High Level Trigger (HLT) programs and ending with data analysis frameworks (DaVinci). It’s not easy to find hot spots in the code - only special tools can help to understand where CPU or memory usage is not reasonable. There exist many performance analyzing tools, but the main problem is that they show reports in terms of class and function names and such information usually is not very useful - the majority of algorithm developers use the Gaudi framework abstractions and usually do not know about functions which lie at the lower level. We will show a new approach which adds to performance reports a higher abstraction level based on knowledge of framework architecture and run-time object properties. A set of profiling to...

  14. Advanced modular software performance monitoring

    CERN Document Server

    Mazurov, A

    2012-01-01

    The LHCb software is based on the Gaudi framework, on top of which are built several large and complex software applications. As the LHCb experiment is now in the active phase of collecting and analyzing data, performance problems arise in various parts of the software, from the High Level Trigger (HLT) programs to data analysis frameworks. It is not easy to find hotspots in the code - only specialized tools can help to understand where CPU or memory usage are not reasonable. There exist many performance analyzing tools, but the main problem is that they show reports in terms of class and function names and such information usually is not very useful - the majority of algorithm developers use the Gaudi framework abstractions and usually do not know about functions which lie at the lower level. We will show a new approach which adds to performance reports a higher abstraction level based on knowledge of framework architecture and run-time object properties. A set of profiling tools (based on Intel VTune Amplif...

  15. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools.

    Science.gov (United States)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier

    2017-11-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients the model accurately predicted (R2 = 0.9, standard error (Se) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R2 = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
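
    As a rough illustration of the kind of quantity such migration models estimate, the sketch below computes the fraction of a chemical that has migrated out of a polymer film of thickness L after time t, using the standard short-time approximation for diffusion out of a slab into food acting as a perfect sink (a Crank-type textbook formula). This is a generic sketch under those stated assumptions, not the optimised high-throughput model developed in the paper; D, L and t are made-up example inputs.

        # Generic short-time estimate of the migrated fraction from a polymer film
        # into food acting as a perfect sink (illustration only, not the paper's model).
        import math

        def migrated_fraction(D, L, t):
            """D: diffusion coefficient (m^2/s), L: film thickness (m), t: time (s).
            Short-time approximation; capped at 1 (total transfer)."""
            frac = 2.0 * math.sqrt(D * t / (math.pi * L**2))
            return min(frac, 1.0)

        # Example: D = 1e-15 m^2/s, 100 micrometre film, 10 days of contact.
        print(migrated_fraction(1e-15, 100e-6, 10 * 24 * 3600))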

  16. GENES - a software package for analysis in experimental statistics and quantitative genetics

    Directory of Open Access Journals (Sweden)

    Cosme Damião Cruz

    2013-06-01

    Full Text Available GENES is a software package used for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation to analyze biological phenomena and is fundamental for the decision-making process and predictions of success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated with the programs MS Word, MS Excel and Paint, ensuring simplicity and effectiveness in data import and in the export of results, figures and data. It is also compatible with the free software R and Matlab, through the supply of useful scripts available for complementary analyses in different areas, including genome-wide selection, prediction of breeding values and use of neural networks in genetic improvement.

  17. Application of systems engineering to determine performance requirements for repository waste packages

    International Nuclear Information System (INIS)

    Aitken, E.A.; Stimmell, G.L.

    1987-01-01

    The waste package for a nuclear waste repository in salt must contribute substantially to the performance objectives defined by the Salt Repository Project (SRP) general requirements document governing disposal of high-level waste. The waste package is one of the engineered barriers providing containment. In establishing the performance requirements for a project focused on design and fabrication of the waste package, the systems engineering methodology has been used to translate the hierarchy requirements for the repository system to specific performance requirements for design and fabrication of the waste package, a subsystem of the repository. This activity is ongoing and requires a methodology that provides traceability and is capable of iteration as baseline requirements are refined or changed. The purpose of this summary is to describe the methodology being used and the way it can be applied to similar activities in the nuclear industry

  18. AMMOS: A Software Platform to Assist in silico Screening

    Directory of Open Access Journals (Sweden)

    Lagorce D.

    2009-12-01

    Full Text Available Three software packages based on the common platform of AMMOS (Automated Molecular Mechanics Optimization tool for in silico Screening) for assisting virtual ligand screening purposes have recently been developed. DG-AMMOS allows generation of 3D conformations of small molecules using distance geometry and molecular mechanics optimization. AMMOS_SmallMol is a package for structural refinement of compound collections that can be used prior to docking experiments. AMMOS_ProtLig is a package for energy minimization of protein-ligand complexes. It performs an automatic procedure for molecular mechanics minimization at different levels of flexibility - from rigid to fully flexible structures of both the ligand and the receptor. The packages have been tested on small molecules with high structural diversity and on protein binding sites of completely different geometries and physicochemical properties. The platform is developed as open source software and can be used in a broad range of in silico drug design studies.

  19. EQ3/6, a software package for geochemical modeling of aqueous systems: Package overview and installation guide (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.

    1992-01-01

    EQ3/6 is a software package for geochemical modeling of aqueous systems. This report describes version 7.0. The major components of the package include: EQ3NR, a speciation-solubility code; EQ6, a reaction path code which models water/rock interaction or fluid mixing in either a pure reaction progress mode or a time mode; EQPT, a data file preprocessor; EQLIB, a supporting software library; and five supporting thermodynamic data files. The software deals with the concepts of thermodynamic equilibrium, thermodynamic disequilibrium, and reaction kinetics. The five supporting data files contain both standard state and activity coefficient-related data. Three support the use of the Davies or B-dot equations for the activity coefficients; the other two support the use of Pitzer's equations. The temperature range of the thermodynamic data on the data files varies from 25 °C only to 0-300 °C. EQPT takes a formatted data file (a data0 file) and writes an unformatted near-equivalent called a data1 file, which is actually the form read by EQ3NR and EQ6. EQ3NR is useful for analyzing groundwater chemistry data, calculating solubility limits, and determining whether certain reactions are in states of partial equilibrium or disequilibrium. It is also required to initialize an EQ6 calculation. EQ6 models the consequences of reacting an aqueous solution with a set of reactants which react irreversibly. It can also model fluid mixing and the consequences of changes in temperature. This code operates both in a pure reaction progress frame and in a time frame

  20. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    Science.gov (United States)

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
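
    To illustrate the core computation that a library such as BEAGLE accelerates, the sketch below evaluates the likelihood of a single alignment site on a small rooted tree with the Felsenstein pruning algorithm under the Jukes-Cantor model. This is a plain Python illustration of the underlying calculation, not a use of the BEAGLE API; the tree, branch lengths and observed bases are made-up example inputs.

        # Felsenstein pruning for one site under Jukes-Cantor (illustration only;
        # this is the computation BEAGLE-like libraries accelerate, not BEAGLE's API).
        import math

        BASES = "ACGT"

        def jc_prob(i, j, t):
            """Jukes-Cantor transition probability for branch length t (subs/site)."""
            e = math.exp(-4.0 * t / 3.0)
            return 0.25 + 0.75 * e if i == j else 0.25 - 0.25 * e

        def partials(node):
            """Return P(data below node | node state) for each of the four states."""
            if isinstance(node, str):                      # leaf: observed base
                return [1.0 if b == node else 0.0 for b in BASES]
            out = [1.0] * 4
            for child, t in node:                          # internal: list of (child, branch length)
                child_p = partials(child)
                for i in range(4):
                    out[i] *= sum(jc_prob(BASES[i], BASES[j], t) * child_p[j] for j in range(4))
            return out

        # ((A:0.1, C:0.1):0.05, G:0.2) -- a three-taxon example tree.
        tree = [([("A", 0.1), ("C", 0.1)], 0.05), ("G", 0.2)]
        root = partials(tree)
        site_likelihood = sum(0.25 * p for p in root)      # uniform root base frequencies
        print(site_likelihood)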

  1. Main real time software for high-energy physics experiments

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1985-01-01

    The general problems of organization of software complexes, as well as development of typical algorithms and packages of applied programs for real time systems used in experiments with charged particle accelerators are discussed. It is noted that numerous qualitatively different real time tasks are solved by parallel programming of the processes of data acquisition, equipment control, data exchange with remote terminals, data express processing and accumulation, operator's instruction interpretation, generation and buffering of resulting files for data output and information processing which is realized on the basis of multicomputer system utilization. Further development of software for experiments is associated with improving the algorithms for automatic recognition and analysis of events with complex topology and standardization of applied program packages

  2. PHYLUCE is a software package for the analysis of conserved genomic loci.

    Science.gov (United States)

    Faircloth, Brant C

    2016-03-01

    Targeted enrichment of conserved and ultraconserved genomic elements allows universal collection of phylogenomic data from hundreds of species at multiple time scales (<300 Ma). Prior to downstream inference, data from these types of targeted enrichment studies must undergo preprocessing to assemble contigs from sequence data; identify targeted, enriched loci from the off-target background data; align enriched contigs representing conserved loci to one another; and prepare and manipulate these alignments for subsequent phylogenomic inference. PHYLUCE is an efficient and easy-to-install software package that accomplishes these tasks across hundreds of taxa and thousands of enriched loci. PHYLUCE is written for Python 2.7. PHYLUCE is supported on OSX and Linux (RedHat/CentOS) operating systems. PHYLUCE source code is distributed under a BSD-style license from https://www.github.com/faircloth-lab/phyluce/ PHYLUCE is also available as a package (https://binstar.org/faircloth-lab/phyluce) for the Anaconda Python distribution that installs all dependencies, and users can request a PHYLUCE instance on iPlant Atmosphere (tag: phyluce). The software manual and a tutorial are available from http://phyluce.readthedocs.org/en/latest/ and test data are available from doi: 10.6084/m9.figshare.1284521. brant@faircloth-lab.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks. (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
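
    Since the abstract refers to common hydrological objective functions such as the Nash-Sutcliffe Efficiency (NSE) and the Kling-Gupta Efficiency (KGE), a minimal sketch of how these two scores are computed from paired observed and simulated series is given below. It is a plain Python illustration of the standard formulas, not code from the visCOS R package; the observed and simulated values are made-up example inputs.

        # Standard NSE and KGE formulas (illustration only; visCOS itself is an R package).
        import math

        def nse(obs, sim):
            mean_obs = sum(obs) / len(obs)
            num = sum((o - s) ** 2 for o, s in zip(obs, sim))
            den = sum((o - mean_obs) ** 2 for o in obs)
            return 1.0 - num / den

        def kge(obs, sim):
            n = len(obs)
            mo, ms = sum(obs) / n, sum(sim) / n
            so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
            ss = math.sqrt(sum((s - ms) ** 2 for s in sim) / n)
            r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (n * so * ss)
            alpha, beta = ss / so, ms / mo          # variability ratio and bias ratio
            return 1.0 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

        obs = [1.2, 3.4, 2.8, 5.1, 4.0]
        sim = [1.0, 3.0, 3.1, 4.8, 4.4]
        print(nse(obs, sim), kge(obs, sim))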

  4. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools

    DEFF Research Database (Denmark)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei

    2017-01-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioriti...... to uncertainty and dramatically decreased model performance (R2 = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches....

  5. Packaging Technologies for High Temperature Electronics and Sensors

    Science.gov (United States)

    Chen, Liangyu; Hunter, Gary W.; Neudeck, Philip G.; Beheim, Glenn M.; Spry, David J.; Meredith, Roger D.

    2013-01-01

    This paper reviews ceramic substrates and thick-film metallization based packaging technologies in development for 500 °C silicon carbide (SiC) electronics and sensors. Prototype high temperature ceramic chip-level packages and printed circuit boards (PCBs) based on ceramic substrates of aluminum oxide (Al2O3) and aluminum nitride (AlN) have been designed and fabricated. These ceramic substrate-based chip-level packages with gold (Au) thick-film metallization have been electrically characterized at temperatures up to 550 °C. A 96% alumina based edge connector for a PCB level subsystem interconnection has also been demonstrated recently. The 96% alumina packaging system composed of chip-level packages and PCBs has been tested with high temperature SiC devices at 500 °C for over 10,000 hours. In addition to tests in a laboratory environment, a SiC JFET with a packaging system composed of a 96% alumina chip-level package and an alumina printed circuit board mounted on a data acquisition circuit board was launched as a part of the MISSE-7 suite to the International Space Station via a Shuttle mission. This packaged SiC transistor was successfully tested in orbit for eighteen months. A spark-plug type sensor package designed for high temperature SiC capacitive pressure sensors was developed. This sensor package combines the high temperature interconnection system with a commercial high temperature high pressure stainless steel seal gland (electrical feed-through). Test results of a packaged high temperature capacitive pressure sensor at 500 °C are also discussed. In addition to the pressure sensor package, efforts for packaging high temperature SiC diode-based gas chemical sensors are in process.

  6. Savannah River Plant Californium-252 Shuffler software manual

    International Nuclear Information System (INIS)

    Johnson, S.S.; Crane, T.W.; Eccleston, G.W.

    1979-03-01

    A software manual for operating the Savannah River Plant Shuffler nondestructive assay instrument is presented. The procedures for starting up the instrument, making assays, calibrating, and checking the performance of the hardware units are described. A list of the error messages with an explanation of the circumstances prompting the message and possible corrective measures is given. A summary of the software package is included showing the names and contents of the files and subroutines. The procedure for modifying the software package is outlined

  7. Draft Technical Position Subtask 1.1: waste package performance after repository closure. Volume 1

    International Nuclear Information System (INIS)

    Davis, M.S.; Schweitzer, D.G.

    1983-08-01

    This document provides guidance to the DOE on the issues and information necessary for the NRC to evaluate waste package performance after repository closure. Minimal performance objectives of the waste package are required by proposed 10 CFR 60. This Draft Technical Position describes the various options available to the DOE for compliance and discusses advantages and disadvantages of various choices. Examples are discussed dealing with demonstrability, predictability and reasonable assurance. The types of performance are considered. The document summarizes presently identified high priority issues needed to evaluate waste package performance after repository closure. 20 references, 7 tables

  8. Implementation of the INSPECT software package for statistical calculation in nuclear material accountability

    International Nuclear Information System (INIS)

    Marzo, M.A.S.

    1986-01-01

    The INSPECT software package was developed in the Pacific Northwest Laboratory for statistical calculations in nuclear material accountability. The programs apply the inspection and evaluation methodology described in Part of the Safeguards Technical Manual. In this paper the implementation of INSPECT at the Safeguards Division of CNEN, and the main characteristics of INSPECT are described. The potential applications of INSPECT to the nuclear material accountability is presented. (Author) [pt

  9. Documentation package for the RFID temperature monitoring system (Model 9977 packages at NTS)

    International Nuclear Information System (INIS)

    Chen, K.; Tsai, H.

    2009-01-01

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval to a maximum of five years is based on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory to develop a Radio-Frequency Identification (RFID) temperature monitoring system for use by the facility personnel at DAF/NTS. The RFID temperature monitoring system, depicted in the figure below, consists of the Mk-1 RFId tags, a reader, and a control computer mounted on a mobile platform that can operate as a stand-alone system, or it can be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and be certified for operation of the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS. The documentation package will be used for training and certification. The table of contents are: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The

  10. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies...... of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...

  11. The Future of Software Engineering for High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Pope, G [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-16

    DOE ASCR requested that from May through mid-July 2015 a study group identify issues and recommend solutions from a software engineering perspective transitioning into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write up done as if the author was a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest scoring topic areas were software engineering and testing resources; the lowest scoring area was compliance to DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one liner has also been added to each topic to allow future risk tracking and mitigation.

  12. Calculation of chemical equilibrium between aqueous solution and minerals: the EQ3/6 software package

    International Nuclear Information System (INIS)

    Wolery, T.J.

    1979-01-01

    The newly developed EQ3/6 software package computes equilibrium models of aqueous geochemical systems. The package contains two principal programs: EQ3 performs distribution-of-species calculations for natural water compositions; EQ6 uses the results of EQ3 to predict the consequences of heating and cooling aqueous solutions and of irreversible reaction in rock-water systems. The programs are valuable for studying such phenomena as the formation of ore bodies, scaling and plugging in geothermal development, and the long-term disposal of nuclear waste. EQ3 and EQ6 are compared with such well-known geochemical codes as SOLMNEQ, WATEQ, REDEQL, MINEQL, and PATHI. The data base allows calculations in the temperature interval 0 to 350 °C, at either 1 atm-steam saturation pressures or a constant 500 bars. The activity coefficient approximations for aqueous solutes limit modeling to solutions of ionic strength less than about one molal. The mathematical derivations and numerical techniques used in EQ6 are presented in detail. The program uses the Newton-Raphson method to solve the governing equations of chemical equilibrium for a system of specified elemental composition at fixed temperature and pressure. Convergence is aided by optimizing starting estimates and by under-relaxation techniques. The minerals present in the stable phase assemblage are found by several empirical methods. Reaction path models may be generated by using this approach in conjunction with finite differences. This method is analogous to applying high-order predictor-corrector methods to integrate a corresponding set of ordinary differential equations, but avoids propagation of error (drift). 8 figures, 9 tables
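
    The Newton-Raphson iteration with under-relaxation described above can be illustrated on a toy speciation problem: solving the charge balance of a weak-acid solution for the hydrogen-ion concentration. The sketch below is a minimal, hypothetical example of a damped Newton iteration, not EQ3/6 code; Ka, Kw and the total acid concentration are made-up example inputs.

        # Toy damped Newton-Raphson speciation solve (illustration only, not EQ3/6 code).
        # Charge balance for a weak acid HA in water: [H+] = [OH-] + [A-]
        #   f(h) = h - Kw/h - Ka*C_T/(Ka + h) = 0
        import math

        def solve_h(C_T, Ka, Kw=1e-14, h0=1e-4, damping=0.5, tol=1e-12, max_iter=200):
            h = h0
            for _ in range(max_iter):
                f = h - Kw / h - Ka * C_T / (Ka + h)
                df = 1.0 + Kw / h**2 + Ka * C_T / (Ka + h) ** 2
                step = f / df
                # Under-relaxation: damp the Newton step and keep [H+] positive.
                h_new = h - damping * step
                if h_new <= 0.0:
                    h_new = h / 2.0
                if abs(h_new - h) < tol:
                    return h_new
                h = h_new
            return h

        # 0.01 mol/L acetic-acid-like solution, Ka = 1.8e-5: pH should be near 3.4.
        h = solve_h(C_T=0.01, Ka=1.8e-5)
        print(-math.log10(h))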

  13. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected

  14. Safety analysis report for packaging (onsite) transuranic performance demonstration program sample packaging

    International Nuclear Information System (INIS)

    Mccoy, J.C.

    1997-01-01

    The Transuranic Performance Demonstration Program (TPDP) sample packaging is used to transport highway route controlled quantities of weapons grade (WG) plutonium samples from the Plutonium Finishing Plant (PFP) to the Waste Receiving and Processing (WRAP) facility and back. The purpose of these shipments is to test the nondestructive assay equipment in the WRAP facility as part of the Nondestructive Waste Assay PDP. The PDP is part of the U. S. Department of Energy (DOE) National TRU Program managed by the U. S. Department of Energy, Carlsbad Area Office, Carlsbad, New Mexico. Details of this program are found in CAO-94-1045, Performance Demonstration Program Plan for Nondestructive Assay for the TRU Waste Characterization Program (CAO 1994); INEL-96/0129, Design of Benign Matrix Drums for the Non-Destructive Assay Performance Demonstration Program for the National TRU Program (INEL 1996a); and INEL-96/0245, Design of Phase 1 Radioactive Working Reference Materials for the Nondestructive Assay Performance Demonstration Program for the National TRU Program (INEL 1996b). Other program documentation is maintained by the national TRU program and each DOE site participating in the program. This safety analysis report for packaging (SARP) provides the analyses and evaluations necessary to demonstrate that the TRU PDP sample packaging meets the onsite transportation safety requirements of WHC-CM-2-14, Hazardous Material Packaging and Shipping, for an onsite Transportation Hazard Indicator (THI) 2 packaging. This SARP, however, does not include evaluation of any operations within the PFP or WRAP facilities, including handling, maintenance, storage, or operating requirements, except as they apply directly to transportation between the gate of PFP and the gate of the WRAP facility. All other activities are subject to the requirements of the facility safety analysis reports (FSAR) of the PFP or WRAP facility and requirements of the PDP

  15. pyLIMA: An Open-source Package for Microlensing Modeling. I. Presentation of the Software and Analysis of Single-lens Models

    Science.gov (United States)

    Bachelet, E.; Norbury, M.; Bozza, V.; Street, R.

    2017-11-01

    Microlensing is a unique tool, capable of detecting the “cold” planets between ˜1 and 10 au from their host stars and even unbound “free-floating” planets. This regime has been poorly sampled to date owing to the limitations of alternative planet-finding methods, but a watershed in discoveries is anticipated in the near future thanks to the planned microlensing surveys of WFIRST-AFTA and Euclid's Extended Mission. Of the many challenges inherent in these missions, the modeling of microlensing events will be of primary importance, yet it is often time-consuming, complex, and perceived as a daunting barrier to participation in the field. The large scale of future survey data products will require thorough but efficient modeling software, but, unlike other areas of exoplanet research, microlensing currently lacks a publicly available, well-documented package to conduct this type of analysis. We present version 1.0 of the python Lightcurve Identification and Microlensing Analysis (pyLIMA). This software is written in Python and uses existing packages as much as possible to make it widely accessible. In this paper, we describe the overall architecture of the software and the core modules for modeling single-lens events. To verify the performance of this software, we use it to model both real data sets from events published in the literature and generated test data produced using pyLIMA's simulation module. The results demonstrate that pyLIMA is an efficient tool for microlensing modeling. We will expand pyLIMA to consider more complex phenomena in the following papers.
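
    The single-lens (point-source, point-lens) models that pyLIMA fits in this first paper follow the standard Paczynski magnification curve, which the short sketch below evaluates for an example event. This is a plain implementation of the textbook formula for illustration, not pyLIMA code; t0, u0 and tE are made-up example parameters.

        # Paczynski point-source point-lens magnification (illustration, not pyLIMA code).
        import math

        def pspl_magnification(t, t0, u0, tE):
            """t0: time of peak, u0: impact parameter (Einstein radii), tE: Einstein time."""
            u = math.sqrt(u0**2 + ((t - t0) / tE) ** 2)
            return (u**2 + 2.0) / (u * math.sqrt(u**2 + 4.0))

        # Example event: peak at t0 = 150 d, u0 = 0.1, tE = 30 d.
        for t in range(100, 201, 10):
            print(t, round(pspl_magnification(t, t0=150.0, u0=0.1, tE=30.0), 3))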

  16. EQ3/6, a software package for geochemical modeling of aqueous systems: Package overview and installation guide (Version 7.0)

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.

    1992-09-14

    EQ3/6 is a software package for geochemical modeling of aqueous systems. This report describes version 7.0. The major components of the package include: EQ3NR, a speciation-solubility code; EQ6, a reaction path code which models water/rock interaction or fluid mixing in either a pure reaction progress mode or a time mode; EQPT, a data file preprocessor; EQLIB, a supporting software library; and five supporting thermodynamic data files. The software deals with the concepts of thermodynamic equilibrium, thermodynamic disequilibrium, and reaction kinetics. The five supporting data files contain both standard state and activity coefficient-related data. Three support the use of the Davies or B-dot equations for the activity coefficients; the other two support the use of Pitzer's equations. The temperature range of the thermodynamic data on the data files varies from 25 °C only to 0-300 °C. EQPT takes a formatted data file (a data0 file) and writes an unformatted near-equivalent called a data1 file, which is actually the form read by EQ3NR and EQ6. EQ3NR is useful for analyzing groundwater chemistry data, calculating solubility limits, and determining whether certain reactions are in states of partial equilibrium or disequilibrium. It is also required to initialize an EQ6 calculation. EQ6 models the consequences of reacting an aqueous solution with a set of reactants which react irreversibly. It can also model fluid mixing and the consequences of changes in temperature. This code operates both in a pure reaction progress frame and in a time frame.

  17. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Directory of Open Access Journals (Sweden)

    Wang Kai

    2011-05-01

    Full Text Available Background: Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Findings: Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: (1) the interaction of SNPs within it in parallel, and (2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions: GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
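
    The fragment-based parallelization strategy described above (within-fragment pairs plus between-fragment pairs) can be sketched on the CPU side with Python's multiprocessing; the snippet below only illustrates how SNP indices might be partitioned and pair lists dispatched to worker processes. The interaction_score() function is a hypothetical placeholder, and none of this is GENIE's actual CUDA/CPU implementation.

        # Sketch of fragment-wise pairwise interaction testing (not GENIE's implementation).
        from itertools import combinations
        from multiprocessing import Pool

        def interaction_score(pair):
            """Hypothetical placeholder for a SNP-SNP interaction test statistic."""
            i, j = pair
            return (i, j, 0.0)  # real code would test the genotypes of SNPs i and j

        def pair_list(n_snps, fragment_size):
            frags = [range(s, min(s + fragment_size, n_snps))
                     for s in range(0, n_snps, fragment_size)]
            pairs = []
            for a, frag_a in enumerate(frags):
                pairs.extend(combinations(frag_a, 2))              # within-fragment pairs
                for frag_b in frags[a + 1:]:                       # between-fragment pairs
                    pairs.extend((i, j) for i in frag_a for j in frag_b)
            return pairs

        if __name__ == "__main__":
            with Pool(processes=4) as pool:
                results = pool.map(interaction_score, pair_list(n_snps=200, fragment_size=50))
            print(len(results))  # 200*199/2 = 19900 pairs tested exactly once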

  18. An open-source software package for multivariate modeling and clustering: applications to air quality management.

    Science.gov (United States)

    Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong

    2015-09-01

    This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.

  19. Antecedents and Moderators of Software Professionals’ Performance

    Directory of Open Access Journals (Sweden)

    Shiva Prasad H. C.

    2014-02-01

    Full Text Available Software professionals’ (SPs') performance is often understood narrowly in terms of input–output productivity. This study approaches performance from a broader perspective and examines whether the emotional intelligence competencies (EICs) of SPs, the leadership style of team leaders, social capital among team members, and human resource management (HRM) practices of software firms affect performance of SPs. It also tests whether the value of and opportunities for knowledge sharing moderate such relationships. Data were collected from 441 Indian SPs in a questionnaire survey. Fifty-five team leaders assessed the performance of SPs, and SPs assessed the other constructs. Results revealed that EICs, transformational leadership style, social capital, and HRM practices positively affect performance. EICs are the most important predictors of performance. Under high (low) value of and high (low) opportunities for knowledge sharing, the antecedents influencing performance are strengthened (attenuated) or nullified. The value of and opportunities for knowledge sharing are quasi-moderators. These findings have significant implications for organizing effective work teams.

  20. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    International Nuclear Information System (INIS)

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-01-01

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  1. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Energy Technology Data Exchange (ETDEWEB)

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.
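
    The following is not the STEMsalabim code (which is a parallel C++ package); it is a bare-bones Python/NumPy sketch of the multislice idea it implements, with the specimen replaced by random weak phase gratings and the probe scan distributed over a process pool to mimic parallelisation over independent scan points. All numerical values are made up.

```python
"""Minimal multislice sketch: propagate a focused probe through phase slices
and integrate high-angle intensity as a stand-in for an annular detector."""
import numpy as np
from concurrent.futures import ProcessPoolExecutor

N, DX = 128, 0.2            # grid points per side, Angstrom per pixel
WAVELEN, DZ = 0.0251, 2.0   # ~200 kV electron wavelength (A), slice thickness (A)
N_SLICES = 10

rng = np.random.default_rng(1)
SLICES = np.exp(1j * 0.05 * rng.standard_normal((N_SLICES, N, N)))  # weak phase objects

k = np.fft.fftfreq(N, d=DX)
K2 = k[:, None] ** 2 + k[None, :] ** 2
PROP = np.exp(-1j * np.pi * WAVELEN * DZ * K2)        # Fresnel propagator per slice

def stem_intensity(pos):
    """Multislice propagation of a Gaussian probe at scan position `pos`;
    returns the fraction of intensity scattered beyond 1/A (toy ADF signal)."""
    x = (np.arange(N) - pos[0]) * DX
    y = (np.arange(N) - pos[1]) * DX
    r2 = x[:, None] ** 2 + y[None, :] ** 2
    psi = np.exp(-r2 / (2 * 1.5 ** 2)).astype(complex)  # ad hoc probe width
    psi /= np.sqrt((np.abs(psi) ** 2).sum())
    for t in SLICES:                                     # multislice loop
        psi = np.fft.ifft2(PROP * np.fft.fft2(t * psi))
    psik = np.abs(np.fft.fft2(psi)) ** 2
    return float(psik[np.sqrt(K2) > 1.0].sum() / psik.sum())

if __name__ == "__main__":
    scan = [(ix, iy) for ix in range(32, 96, 16) for iy in range(32, 96, 16)]
    with ProcessPoolExecutor() as pool:                  # one scan point per task
        image = list(pool.map(stem_intensity, scan))
    print(np.round(np.reshape(image, (4, 4)), 4))
```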

  2. Comparative evaluations of the Monte Carlo-based light propagation simulation packages for optical imaging

    Directory of Open Access Journals (Sweden)

    Lin Wang

    2018-01-01

    Full Text Available Monte Carlo simulation of light propagation in turbid media has been studied for years. A number of software packages have been developed to address this problem. However, it is hard to compare these simulation packages, especially for tissues with complex heterogeneous structures. Here, we first designed a group of mesh datasets generated by the Iso2Mesh software, and used them to cross-validate the accuracy and to evaluate the performance of four Monte Carlo-based simulation packages, including Monte Carlo model of steady-state light transport in multi-layered tissues (MCML), tetrahedron-based inhomogeneous Monte Carlo optical simulator (TIMOS), Molecular Optical Simulation Environment (MOSE), and Mesh-based Monte Carlo (MMC). The performance of each package was evaluated based on the designed mesh datasets. The merits and demerits of each package were also discussed. Comparative results showed that the TIMOS package provided the best performance, proving to be a reliable, efficient, and stable MC simulation package for users.
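
    For readers unfamiliar with what these packages compute, the toy kernel below sketches the common core: photons random-walk through a homogeneous semi-infinite medium, absorption is handled by weight reduction, and scattering is (here, for simplicity) isotropic. It is not MCML, TIMOS, MOSE or MMC; the optical properties are arbitrary examples.

```python
"""Toy Monte Carlo photon-transport kernel (isotropic scattering, weight method)."""
import numpy as np

rng = np.random.default_rng(42)
MU_A, MU_S = 0.1, 10.0        # absorption / scattering coefficients (1/mm), invented
MU_T = MU_A + MU_S
N_PHOTONS = 20_000

def run_photon():
    """Return (weight absorbed inside the medium, weight escaping the surface)."""
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])        # launched straight down (+z into tissue)
    w, absorbed = 1.0, 0.0
    while w > 1e-4:
        step = -np.log(rng.random()) / MU_T      # free path length
        pos = pos + step * direction
        if pos[2] < 0.0:                         # escaped through the surface
            return absorbed, w
        absorbed += w * MU_A / MU_T              # deposit a fraction of the weight
        w *= MU_S / MU_T
        # isotropic scattering: draw a new direction uniformly on the sphere
        cos_t = 2.0 * rng.random() - 1.0
        phi = 2.0 * np.pi * rng.random()
        sin_t = np.sqrt(1.0 - cos_t ** 2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return absorbed + w, 0.0                     # terminate residual weight

totals = np.array([run_photon() for _ in range(N_PHOTONS)])
print("mean absorbed fraction :", totals[:, 0].mean())
print("mean escaped fraction  :", totals[:, 1].mean())
```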

  3. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  4. GPS Software Packages Deliver Positioning Solutions

    Science.gov (United States)

    2010-01-01

    "To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."

  5. WGCNA: an R package for weighted correlation network analysis.

    Science.gov (United States)

    Langfelder, Peter; Horvath, Steve

    2008-12-29

    Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
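
    The core construction described above (soft-thresholded adjacency, then module detection by clustering a network dissimilarity) can be illustrated compactly. The sketch below is Python/NumPy rather than the WGCNA R package, uses simulated expression data, and omits soft-threshold selection, topological overlap and eigengene calculations.

```python
"""Sketch of the weighted-correlation-network idea: adjacency = |cor|^beta,
then modules from average-linkage clustering of 1 - adjacency."""
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
n_samples, n_genes, beta = 30, 200, 6

# simulate two co-expressed gene modules plus background noise
base = rng.standard_normal((n_samples, 2))
expr = rng.standard_normal((n_samples, n_genes))
expr[:, :60] += 2.0 * base[:, [0]]
expr[:, 60:120] += 2.0 * base[:, [1]]

corr = np.corrcoef(expr, rowvar=False)            # gene-gene correlation
adj = np.abs(corr) ** beta                        # soft-thresholded adjacency
np.fill_diagonal(adj, 0.0)

dissim = 1.0 - adj                                # network dissimilarity
np.fill_diagonal(dissim, 0.0)
tree = linkage(squareform(dissim, checks=False), method="average")
modules = fcluster(tree, t=4, criterion="maxclust")

for m in np.unique(modules):
    print(f"module {m}: {np.sum(modules == m)} genes")
```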

  6. Performance-oriented packagings for hazardous materials: Resource guide

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    This document provides recommendations to US Department of Energy (DOE) shippers regarding packagings that meet performance-oriented packaging requirements implemented by the US Department of Transportation (DOT) in rulemaking HM-181 (December 21, 1990) and subsequent actions. The packagings described in this document are certified by their vendors to comply with requirements for Packing Group I, II, or III hazardous materials packaging. The intent of this document is to share information between DOE and contractors and at all DOE facilities.

  7. Performance-oriented packagings for hazardous materials: Resource guide

    International Nuclear Information System (INIS)

    1993-09-01

    This document provides recommendations to US Department of Energy (DOE) shippers regarding packagings that meet performance-oriented packaging requirements implemented by the US Department of Transportation (DOT) in rulemaking HM-181 (December 21, 1990) and subsequent actions. The packagings described in this document are certified by their vendors to comply with requirements for Packing Group I, II, or III hazardous materials packaging. The intent of this document is to share information between DOE and contractors and at all DOE facilities

  8. ATLAS software configuration and build tool optimisation

    Science.gov (United States)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance that was optimised within several approaches: reduction of the number of reads of requirements files that are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; introduction of package level build parallelism, i.e., parallelise the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT commands optimisation in general that made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
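
    Package-level build parallelism of the kind described (build a package as soon as all of its dependencies are built) can be sketched in a few lines. This is not CMT; the dependency graph and the "build" action are invented.

```python
"""Sketch of dependency-aware package-level build parallelism."""
import time
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

# package -> packages it depends on (hypothetical mini project)
DEPS = {
    "Core": [], "Utils": [], "Event": ["Core"],
    "Tracking": ["Core", "Utils"], "Analysis": ["Event", "Tracking"],
}

def build(pkg):
    time.sleep(0.1)                 # stand-in for compiling the package
    return pkg

def parallel_build(deps, workers=4):
    remaining = {p: set(d) for p, d in deps.items()}
    done, running = set(), {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while remaining or running:
            # launch every package whose dependencies are all satisfied
            for p in [p for p, d in remaining.items() if d <= done]:
                running[pool.submit(build, p)] = p
                del remaining[p]
            finished, _ = wait(list(running), return_when=FIRST_COMPLETED)
            for fut in finished:
                done.add(running.pop(fut))
                print("built", fut.result())
    return done

parallel_build(DEPS)
```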

  9. Integrated software system for low level waste management

    International Nuclear Information System (INIS)

    Worku, G.

    1995-01-01

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications

  10. Compensation packages: a strategic tool for employees’ performance and retention

    Directory of Open Access Journals (Sweden)

    Omotayo Adewale OSIBANJO

    2014-02-01

    Full Text Available The rate at which employees in private universities in Nigeria jump from one university to the other is becoming more disturbing, and this could be a result of the compensation packages different universities use to attract competent employees. The aim of this study was to examine the effect of compensation packages on employees’ job performance and retention in a selected private University in Ogun State, South-West Nigeria. A model was developed and tested using one hundred and eleven valid questionnaires which were completed by academic and non-academic staff of the university. The collected data were carefully analyzed using simple percentage supported by structural equation modelling to test the hypotheses and relationships that may exist among the variables under consideration. The results showed a strong relationship between compensation packages and employees’ performance and retention. The summary of the findings indicates that there is a strong correlation between the tested dependent and independent variables (salary, bonus, incentives, allowances, and fringe benefits). However, management and decision makers should endeavour to review compensation packages at various levels in order to earn employees’ satisfaction and prevent high labour turnover among the members of staff.

  11. Performance and Reliability of Bonded Interfaces for High-Temperature Packaging (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Devoto, D.

    2014-11-01

    The thermal performance and reliability of sintered-silver is being evaluated for power electronics packaging applications. This will be experimentally accomplished by the synthesis of large-area bonded interfaces between metalized substrates that will be subsequently subjected to thermal cycles. A finite element model of crack initiation and propagation in these bonded interfaces will allow for the interpretation of degradation rates by a crack-velocity (V)-stress intensity factor (K) analysis. The experiment is outlined, and the modeling approach is discussed.

  12. Image analysis software versus direct anthropometry for breast measurements.

    Science.gov (United States)

    Quieregatto, Paulo Rogério; Hochman, Bernardo; Furtado, Fabianne; Machado, Aline Fernanda Perez; Sabino Neto, Miguel; Ferreira, Lydia Masako

    2014-10-01

    To compare breast measurements performed using the software packages ImageTool®, AutoCAD® and Adobe Photoshop® with direct anthropometric measurements. Points were marked on the breasts and arms of 40 volunteer women aged between 18 and 60 years. When connecting the points, seven linear segments and one angular measurement on each half of the body, and one medial segment common to both body halves were defined. The volunteers were photographed in a standardized manner. Photogrammetric measurements were performed by three independent observers using the three software packages and compared to direct anthropometric measurements made with calipers and a protractor. Measurements obtained with AutoCAD® were the most reproducible and those made with ImageTool® were the most similar to direct anthropometry, while measurements with Adobe Photoshop® showed the largest differences. Except for angular measurements, significant differences were found between measurements of line segments made using the three software packages and those obtained by direct anthropometry. AutoCAD® provided the highest precision and intermediate accuracy; ImageTool® had the highest accuracy and lowest precision; and Adobe Photoshop® showed intermediate precision and the worst accuracy among the three software packages.

  13. On integrating modeling software for application to total-system performance assessment

    International Nuclear Information System (INIS)

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

    We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere

  14. HDclassif : An R Package for Model-Based Clustering and Discriminant Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Laurent Berge

    2012-01-01

    Full Text Available This paper presents the R package HDclassif which is devoted to the clustering and the discriminant analysis of high-dimensional data. The classification methods proposed in the package result from a new parametrization of the Gaussian mixture model which combines the idea of dimension reduction and model constraints on the covariance matrices. The supervised classification method using this parametrization is called high dimensional discriminant analysis (HDDA). In a similar manner, the associated clustering method is called high dimensional data clustering (HDDC) and uses the expectation-maximization algorithm for inference. In order to correctly fit the data, both methods estimate the specific subspace and the intrinsic dimension of the groups. Due to the constraints on the covariance matrices, the number of parameters to estimate is significantly lower than in other model-based methods, and this allows the methods to be stable and efficient in high dimensions. Two introductory examples illustrated with R code allow the user to discover the hdda and hddc functions. Experiments on simulated and real datasets also compare HDDC and HDDA with existing classification methods on high-dimensional datasets. HDclassif is free software distributed under the General Public License, as part of the R software project.
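
    The HDDA/HDDC parametrization is specific to the R package; as a loosely related illustration of the general idea of model-based clustering with EM and constrained covariance models, here is a generic scikit-learn Gaussian-mixture sketch on simulated high-dimensional data (diagonal covariances stand in for a parsimonious model; this is not the HDDC subspace parametrization).

```python
"""Generic model-based clustering via EM on simulated 50-dimensional data."""
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# two groups living in a 50-dimensional space, shifted by 1.5 in every coordinate
X = np.vstack([
    rng.standard_normal((100, 50)),
    rng.standard_normal((100, 50)) + 1.5,
])

gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0)
labels = gmm.fit_predict(X)
print("cluster sizes:", np.bincount(labels))
print("BIC:", round(gmm.bic(X), 1))
```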

  15. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
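
    The kind of accuracy/precision metric an LFQbench-style comparison reports can be illustrated simply: for proteins spiked at a known ratio between samples A and B, accuracy is the deviation of the median observed log-ratio from the expected one and precision is the spread of the observed log-ratios. The sketch below uses simulated intensities, not LFQbench itself or real SWATH data.

```python
"""Toy accuracy/precision computation for a known-ratio hybrid proteome sample."""
import numpy as np

rng = np.random.default_rng(3)
expected_log2_ratio = 1.0                       # species spiked at 2:1 (A:B), assumed

# simulated protein intensities for 300 proteins in samples A and B
true_b = rng.lognormal(mean=10, sigma=1, size=300)
obs_a = true_b * 2 ** expected_log2_ratio * rng.lognormal(0, 0.25, 300)
obs_b = true_b * rng.lognormal(0, 0.25, 300)

log_ratios = np.log2(obs_a / obs_b)
accuracy = np.median(log_ratios) - expected_log2_ratio   # systematic deviation
precision = np.std(log_ratios)                           # scatter of the log-ratios

print(f"median log2 ratio : {np.median(log_ratios):.3f} (expected {expected_log2_ratio})")
print(f"accuracy (bias)   : {accuracy:.3f}")
print(f"precision (sd)    : {precision:.3f}")
```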

  16. Repository environmental parameters and models/methodologies relevant to assessing the performance of high-level waste packages in basalt, tuff, and salt

    Energy Technology Data Exchange (ETDEWEB)

    Claiborne, H.C.; Croff, A.G.; Griess, J.C.; Smith, F.J.

    1987-09-01

    This document provides specifications for models/methodologies that could be employed in determining postclosure repository environmental parameters relevant to the performance of high-level waste packages for the Basalt Waste Isolation Project (BWIP) at Richland, Washington, the tuff at Yucca Mountain by the Nevada Test Site, and the bedded salt in Deaf Smith County, Texas. Guidance is provided on the identity of the relevant repository environmental parameters, the models/methodologies employed to determine the parameters, and the input data base for the models/methodologies. Supporting studies included are an analysis of potential waste package failure modes leading to identification of the relevant repository environmental parameters, an evaluation of the credible range of the repository environmental parameters, and a summary of the review of existing models/methodologies currently employed in determining repository environmental parameters relevant to waste package performance. 327 refs., 26 figs., 19 tabs.

  17. Repository environmental parameters and models/methodologies relevant to assessing the performance of high-level waste packages in basalt, tuff, and salt

    International Nuclear Information System (INIS)

    Claiborne, H.C.; Croff, A.G.; Griess, J.C.; Smith, F.J.

    1987-09-01

    This document provides specifications for models/methodologies that could be employed in determining postclosure repository environmental parameters relevant to the performance of high-level waste packages for the Basalt Waste Isolation Project (BWIP) at Richland, Washington, the tuff at Yucca Mountain by the Nevada Test Site, and the bedded salt in Deaf Smith County, Texas. Guidance is provided on the identity of the relevant repository environmental parameters, the models/methodologies employed to determine the parameters, and the input data base for the models/methodologies. Supporting studies included are an analysis of potential waste package failure modes leading to identification of the relevant repository environmental parameters, an evaluation of the credible range of the repository environmental parameters, and a summary of the review of existing models/methodologies currently employed in determining repository environmental parameters relevant to waste package performance. 327 refs., 26 figs., 19 tabs

  18. Uncertainty evaluation methods for waste package performance assessment

    International Nuclear Information System (INIS)

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

    This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide "substantially complete containment" (SCC) during the containment period. The term "SCC" is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the "containment" requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs
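
    The first of the four methods, the probability-distribution approach, is essentially uncertainty propagation. A minimal Monte Carlo illustration is given below; the "performance model" (wall thickness divided by corrosion rate) and all distributions are invented for illustration only.

```python
"""Toy Monte Carlo propagation of input uncertainties to an output distribution."""
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# hypothetical inputs: corrosion rate (mm/yr, lognormal) and wall thickness (mm, normal)
corrosion_rate = rng.lognormal(mean=np.log(0.002), sigma=0.5, size=N)
wall_thickness = rng.normal(loc=100.0, scale=5.0, size=N)

lifetime = wall_thickness / corrosion_rate            # years until penetration (toy model)

print("median lifetime (yr):", round(np.median(lifetime)))
print("5th percentile  (yr):", round(np.percentile(lifetime, 5)))
print("P(lifetime < 10,000 yr):", np.mean(lifetime < 10_000))
```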

  19. Quantitation of magnetic resonance spectroscopy signals: the jMRUI software package

    International Nuclear Information System (INIS)

    Stefan, D; Andrasescu, A; Cesare, F Di; Popa, E; Lazariev, A; Graveron-Demilly, D; Vescovo, E; Williams, S; Strbak, O; Starcuk, Z; Cabanas, M; Van Ormondt, D

    2009-01-01

    The software package jMRUI with Java-based graphical user interface enables user-friendly time-domain analysis of magnetic resonance spectroscopy (MRS) and spectroscopic imaging (MRSI) and HRMAS-NMR signals. Version 3.x has been distributed in more than 1200 groups or hospitals worldwide. The new version 4.x is a plug-in platform enabling the users to add their own algorithms. Moreover, it offers new functionalities compared to versions 3.x. The quantum-mechanical simulator based on NMR-SCOPE, the quantitation algorithm QUEST and the main MRSI functionalities are described. Quantitation results of signals obtained in vivo from a mouse and a human brain are given

  20. EPILAB: a software package for studies on the prediction of epileptic seizures.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Feldwisch-Drentrup, H; Valderrama, M; Costa, R P; Alvarado-Rojas, C; Nikolopoulos, S; Le Van Quyen, M; Timmer, J; Schelter, B; Dourado, A

    2011-09-15

    A Matlab®-based software package, EPILAB, was developed for supporting researchers in performing studies on the prediction of epileptic seizures. It provides an intuitive and convenient graphical user interface. Fundamental concepts that are crucial for epileptic seizure prediction studies were implemented. This includes, for example, the development and statistical validation of prediction methodologies in long-term continuous recordings. Seizure prediction is usually based on electroencephalography (EEG) and electrocardiography (ECG) signals. EPILAB is able to process both EEG and ECG data stored in different formats. More than 35 time and frequency domain measures (features) can be extracted based on univariate and multivariate data analysis. These features can be post-processed and used for prediction purposes. The predictions may be conducted based on optimized thresholds or by applying classifications methods such as artificial neural networks, cellular neuronal networks, and support vector machines. EPILAB proved to be an efficient tool for seizure prediction, and aims to be a way to communicate, evaluate, and compare results and data among the seizure prediction community. Copyright © 2011 Elsevier B.V. All rights reserved.
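
    EPILAB itself is MATLAB-based; purely as an illustration of the workflow it supports (extract features from EEG segments, then train a classifier to separate preictal from interictal periods), here is a small Python sketch on synthetic signals with a single spectral feature set and a support vector machine.

```python
"""Synthetic EEG-like segments -> band-power features -> SVM classification."""
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
FS, SEG_LEN, N_SEG = 256, 256 * 5, 200       # 5-second segments at 256 Hz (assumed)

def band_power(seg, lo, hi):
    f, pxx = welch(seg, fs=FS, nperseg=512)
    mask = (f >= lo) & (f < hi)
    return pxx[mask].sum()                   # crude band-power proxy

# synthetic data: "preictal" segments get a stronger 20 Hz component
X, y = [], []
for i in range(N_SEG):
    label = i % 2
    t = np.arange(SEG_LEN) / FS
    seg = rng.standard_normal(SEG_LEN) + (0.8 if label else 0.2) * np.sin(2 * np.pi * 20 * t)
    X.append([band_power(seg, 4, 8), band_power(seg, 8, 15), band_power(seg, 15, 30)])
    y.append(label)

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```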

  1. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Science.gov (United States)

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package along with a detailed User's Guide is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  2. Counting radon tracks in Makrofol detectors with the 'image reduction and analysis facility' (IRAF) software package

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, F. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain)]. E-mail: fimerall@ull.es; Gonzalez-Manrique, S. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Karlsson, L. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Hernandez-Armas, J. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Aparicio, A. [Instituto de Astrofisica de Canarias, 38200 La Laguna, Tenerife (Spain); Departamento de Astrofisica, Universidad de La Laguna. Avenida. Astrofisico Francisco Sanchez s/n, 38071 La Laguna, Tenerife (Spain)

    2007-03-15

    Makrofol detectors are commonly used for long-term radon (²²²Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires the counting of the nuclear tracks produced by alpha particles on the detecting material. The 'image reduction and analysis facility' (IRAF) software package is a piece of software commonly used in astronomical applications. It allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity that exists between counting stars in a dark sky and counting tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. The proposed semi-automatic method and its performance, in comparison with ocular counting, are described in detail here. In addition, the calibration factor for this procedure, 2.97 ± 0.07 kBq m⁻³ h track⁻¹ cm², has been calculated based on the results obtained from exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m⁻³ and, in two of them, above 400 Bq m⁻³. Further studies should be performed at those schools following the European Union recommendations about radon concentrations in
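
    Assuming the quoted calibration factor converts a counted track density (tracks per cm²) into a time-integrated radon exposure (kBq m⁻³ h), a back-of-the-envelope conversion looks like the following; the track density and exposure time are invented examples, not values from the study.

```python
"""Example use of a track-density calibration factor (unit interpretation assumed)."""
CAL_FACTOR = 2.97          # kBq m^-3 h per (track cm^-2), value quoted in the abstract
track_density = 85.0       # counted tracks per cm^2 on a Makrofol detector (example)
exposure_hours = 90 * 24   # detector exposed for ~3 months (example)

integrated_exposure = CAL_FACTOR * track_density          # kBq m^-3 h
mean_concentration = integrated_exposure / exposure_hours # kBq m^-3

print(f"integrated exposure : {integrated_exposure:.1f} kBq m^-3 h")
print(f"mean radon level    : {1000 * mean_concentration:.0f} Bq m^-3")
```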

  3. Performance-Oriented packaging: A guide to identifying, procuring, and using

    International Nuclear Information System (INIS)

    O'Brien, J.H.

    1992-09-01

    This document guides users through the process of correctly identifying, obtaining, and using performance-oriented packaging. Almost all hazardous material shipments can be made in commercially available performance-oriented packaging. To cover the remaining shipments requiring specially designed packaging, a design guide is being developed. The design guide is scheduled to be issued 1 year after this procurement guide

  4. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  5. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  6. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  7. SISYPHUS: A high performance seismic inversion factory

    Science.gov (United States)

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In the recent years the massively parallel high performance computers became the standard instruments for solving the forward and inverse problems in seismology. The respective software packages dedicated to forward and inverse waveform modelling specially designed for such computers (SPECFEM3D, SES3D) became mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset performance benefits provided by even the most powerful modern supercomputers. Furthermore, a typical system architecture of modern supercomputing platforms is oriented towards the maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for the modern massively parallel high performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database represents a hierarchical structure with

  8. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.

  9. Performance evaluation of the zero-multipole summation method in modern molecular dynamics software.

    Science.gov (United States)

    Sakuraba, Shun; Fukuda, Ikuo

    2018-05-04

    The zero-multipole summation method (ZMM) is a cutoff-based method for calculating electrostatic interactions in molecular dynamics simulations, utilizing an electrostatic neutralization principle as a physical basis. Since the accuracy of the ZMM has been shown to be sufficient in previous studies, it is highly desirable to clarify its practical performance. In this paper, the performance of the ZMM is compared with that of the smooth particle mesh Ewald method (SPME), where both methods are implemented in the molecular dynamics software package GROMACS. Extensive performance comparisons against a highly optimized, parameter-tuned SPME implementation are performed for various-sized water systems and two protein-water systems. We analyze in detail the dependence of the performance on the potential parameters and the number of CPU cores. Even though the ZMM uses a larger cutoff distance than the SPME does, the performance of the ZMM is comparable to or better than that of the SPME. This is because the ZMM does not require a time-consuming electrostatic convolution and because the ZMM gains short neighbor-list distances due to the smooth damping feature of the pairwise potential function near the cutoff length. We found, in particular, that the ZMM with quadrupole or octupole cancellation and no damping factor is an excellent candidate for the fast calculation of electrostatic interactions. © 2018 Wiley Periodicals, Inc.

  10. SEJV2 software package for radiation monitoring system of WWER 440 NPP

    International Nuclear Information System (INIS)

    Kapisovsky, V.; Jancik, O.; Kubik, I.; Bena, J.

    1993-01-01

    The main part of the radiation monitoring system at a WWER-440 (213 reactor type) nuclear power plant is the centralized 400-channel monitoring system 'SEJVAL' servicing twin reactor units. The SEJV2 software package is described developed to run on a PC with an IFS2 interface to the SEJVAL radiation monitoring system. It provides enhanced data presentation, record keeping and report generation, thus improving the efficiency of the health physics shift. The system was for the first time implemented at the Jaslovske Bohunice V-2 nuclear power plant with encouraging results. (Z.S.) 3 refs

  11. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    Science.gov (United States)

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness of fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
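
    The statistical core of such assays is the single-hit Poisson model: at an input of n cells per well, the probability of a positive well is 1 - exp(-theta*n), and theta (infectious units per cell) is estimated by maximum likelihood. The sketch below is not the SLDAssay package; the assay layout and counts are invented, and it omits the bias correction and exact intervals the package provides.

```python
"""Maximum-likelihood fit of the single-hit Poisson model for a limiting dilution assay."""
import numpy as np
from scipy.optimize import minimize_scalar

cells_per_well = np.array([1e6, 2e5, 4e4, 8e3, 1.6e3])   # serial 5-fold dilution (example)
wells = np.array([6, 6, 6, 6, 6])                        # replicate wells per dilution
positive = np.array([6, 5, 3, 1, 0])                     # observed positive wells

def neg_log_lik(log_theta):
    p = 1.0 - np.exp(-np.exp(log_theta) * cells_per_well)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(positive * np.log(p) + (wells - positive) * np.log(1 - p))

res = minimize_scalar(neg_log_lik, bounds=(-25, 0), method="bounded")
theta = np.exp(res.x)
print(f"MLE: {theta:.3e} infectious units per cell (~1 per {1/theta:,.0f} cells)")
```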

  12. A Review of Predictive Software for the Design of Community Microgrids

    Directory of Open Access Journals (Sweden)

    Mina Rahimian

    2018-01-01

    Full Text Available This paper discusses adding a spatial dimension to the design of community microgrid projects in the interest of expanding the existing discourse related to energy performance optimization measures. A multidimensional vision for designing community microgrids with higher energy performance is considered, leveraging urban form (superstructure) to understand how it impacts the performance of the system’s distributed energy resources and loads (infrastructure). This vision engages the design sector in the technical conversation of developing community microgrids, leading to energy efficient designs of microgrid-connected communities well before their construction. A new generation of computational modeling and simulation tools that address this interaction is required. In order to position the research, this paper presents a survey of existing software packages, belonging to two distinct categories of modeling, simulation, and evaluation of community microgrids: energy infrastructure modeling and urban superstructure energy modeling. Results of this software survey identify a lack of software tools and simulation packages that simultaneously address the necessary interaction between the superstructure and infrastructure of community microgrids, despite the importance of studying this interaction. The conclusions describe how a proposed experimental software prototype may fill an existing gap in current related software packages.

  13. Fragman: an R package for fragment analysis.

    Science.gov (United States)

    Covarrubias-Pazaran, Giovanny; Diaz-Garcia, Luis; Schlautman, Brandon; Salazar, Walter; Zalapa, Juan

    2016-04-21

    Determination of microsatellite lengths or other DNA fragment types is an important initial component of many genetic studies such as mutation detection, linkage and quantitative trait loci (QTL) mapping, genetic diversity, pedigree analysis, and detection of heterozygosity. A handful of commercial and freely available software programs exist for fragment analysis; however, most of them are platform dependent and lack high-throughput applicability. We present the R package Fragman to serve as a freely available and platform-independent resource for automatic scoring of DNA fragment lengths in diversity panels and biparental populations. The program analyzes DNA fragment lengths generated in Applied Biosystems® (ABI) either manually or automatically by providing panels or bins. The package contains additional tools for converting the allele calls to GenAlEx, JoinMap® and OneMap software formats mainly used for genetic diversity and generating linkage maps in plant and animal populations. Easy plotting functions and multiplexing friendly capabilities are some of the strengths of this R package. Fragment analysis using a unique set of cranberry (Vaccinium macrocarpon) genotypes based on microsatellite markers is used to highlight the capabilities of Fragman. Fragman is a valuable new tool for genetic analysis. The package produces equivalent results to other popular software for fragment analysis while possessing unique advantages and the possibility of automation for high-throughput experiments by exploiting the power of R.
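
    Panel- or bin-based allele calling of the sort described can be illustrated with a toy example: detected peak sizes from a capillary run are matched to the nearest expected bin for each marker within a tolerance. This is not the Fragman R package; the marker names, bins and peak sizes are made up.

```python
"""Toy bin-based allele calling for microsatellite peak sizes."""
PANEL = {                       # marker -> expected allele bins (base pairs), hypothetical
    "VM_SSR01": [148, 152, 156, 160],
    "VM_SSR02": [201, 205, 209],
}
TOLERANCE = 1.0                 # max distance (bp) between a peak and its bin

detected_peaks = {"VM_SSR01": [151.7, 159.6], "VM_SSR02": [204.8]}

def call_alleles(peaks, panel, tol):
    calls = {}
    for marker, sizes in peaks.items():
        bins = panel[marker]
        calls[marker] = [
            min(bins, key=lambda b: abs(b - s))
            for s in sizes
            if min(abs(b - s) for b in bins) <= tol
        ]
    return calls

print(call_alleles(detected_peaks, PANEL, TOLERANCE))
# -> {'VM_SSR01': [152, 160], 'VM_SSR02': [205]}
```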

  14. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    Science.gov (United States)

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Science.gov (United States)

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  16. Radioactive material package test standards and performance requirements - public perception

    International Nuclear Information System (INIS)

    Pope, R.B.; Shappert, L.B.; Rawl, R.R.

    1992-01-01

    This paper addresses issues related to the public perception of the regulatory test standards and performance requirements for packaging and transporting radioactive material. Specifically, it addresses the adequacy of the package performance standards and testing for Type B packages, which are those packages designed for transporting the most hazardous quantities and forms of radioactive material. Type B packages are designed to withstand accident conditions in transport. To improve public perception, the public needs to better understand: (a) the regulatory standards and requirements themselves, (b) the extensive history underlying their development, and (c) the soundness of the technical foundation. The public needs to be fully informed on studies, tests, and analyses that have been carried out worldwide and form the basis of the regulatory standards and requirements. This paper provides specific information aimed at improving the public perception of package test standards

  17. Peer Review of the Waste Package Material Performance Interim Report

    International Nuclear Information System (INIS)

    J. A. Beavers; T. M. Devine, Jr.; G. S. Frankel; R. H. Jones; R. G. Kelly; R. M. Latanision; J. H. Payer

    2001-01-01

    At the request of the U.S. Department of Energy, Bechtel SAIC Company, LLC, formed the Waste Package Materials Performance Peer Review Panel (the Panel) to review the technical basis for evaluating the long-term performance of waste package materials in a proposed repository at Yucca Mountain, Nevada. This is the interim report of the Panel; a final report will be issued in February 2002. In its work to date, the Panel has identified important issues regarding waste package materials performance. In the remainder of its work, the Panel will address approaches and plans to resolve these issues. In its review to date, the Panel has not found a technical basis to conclude that the waste package materials are unsuitable for long-term containment at the proposed Yucca Mountain Repository. Nevertheless, significant technical issues remain unsettled and, primarily because of the extremely long life required for the waste packages, there will always be some uncertainty in the assessment. A significant base of scientific and engineering knowledge for assessing materials performance does exist and, therefore, the likelihood is great that uncertainty about the long-term performance can be substantially reduced through further experiments and analysis

  18. A role for relational databases in high energy physics software systems

    International Nuclear Information System (INIS)

    Lauer, R.; Slaughter, A.J.; Wolin, E.

    1987-01-01

    This paper presents the design and initial implementation of software which uses a relational database management system for storage and retrieval of real and Monte Carlo generated events from a charm and beauty spectrometer with a vertex detector. The purpose of the software is to graphically display and interactively manipulate the events, fit tracks and vertices and calculate physics quantities. The INGRES database forms the core of the system, while the DI3000 graphics package is used to plot the events. The paper introduces relational database concepts and their applicability to high energy physics data. It also evaluates the environment provided by INGRES, particularly its usefulness in code development and its Fortran interface. Specifics of the database design we have chosen are detailed as well. (orig.)

  19. Network meta-analysis using R: a review of currently available automated packages.

    Directory of Open Access Journals (Sweden)

    Binod Neupane

    Full Text Available Network meta-analysis (NMA), a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously, has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.

  20. The CASA Software Package

    Science.gov (United States)

    Petry, Dirk

    2018-03-01

    CASA is the standard science data analysis package for ALMA and VLA but it can also be used for the analysis of data from other observatories. In this talk, I will give an overview of the structure and features of CASA, who develops it, and the present status and plans, and then show typical analysis workflows for ALMA data with special emphasis on the handling of single dish data and its combination with interferometric data.

  1. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to be increased significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines, only if one is willing to change the paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers to optimize existing software running on current and future hardware. Low level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We will report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from global level, up to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...
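
    As a very small illustration of function-level monitoring in the spirit described above (wall-clock timing and call counting per function, rather than the hardware counters that the report obtains through perfmon2), a decorator can accumulate per-function statistics:

```python
"""Per-function call counting and wall-clock timing via a decorator."""
import time
from collections import defaultdict
from functools import wraps

STATS = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

def monitored(fn):
    """Decorator that accumulates call counts and elapsed time per function."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            STATS[fn.__qualname__]["calls"] += 1
            STATS[fn.__qualname__]["seconds"] += time.perf_counter() - t0
    return wrapper

@monitored
def simulate_event(n=50_000):            # stand-in for a unit of HEP processing
    return sum(i * i for i in range(n))

for _ in range(20):
    simulate_event()

for name, s in STATS.items():
    print(f"{name}: {s['calls']} calls, {s['seconds']*1e3:.1f} ms total")
```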

  2. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and has studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. (Auth.)

  3. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model’s likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  4. Criteria for the selection of ERP software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The implementation of an ERP software package is an important investment for an organization, and one that also carries a high degree of risk. Selecting the most appropriate software is a necessary condition for a successful implementation. This paper describes the major aspects of software selection in general and the relevant criteria in the case of ERP software.

  5. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction

    Directory of Open Access Journals (Sweden)

    Jon Hill

    2014-03-01

    Full Text Available Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology and in conservation and biodiversity. No easy to use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  6. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction.

    Science.gov (United States)

    Hill, Jon; Davis, Katie E

    2014-01-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology and in conservation and biodiversity. No easy to use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.
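
    The pipeline idea described in both records above can be sketched as a list of processing steps applied in order; the function and variable names below are hypothetical and do not reproduce the Supertree Toolkit's XML-based data model.

```python
# Hypothetical sketch of a flexible processing pipeline over a set of source
# trees; each "tree" is reduced to its taxon list for brevity. The real
# Supertree Toolkit 2 stores trees and metadata together in an XML file.
SYNONYMS = {"Canis_familiaris": "Canis_lupus"}

def standardise_names(trees):
    return [[SYNONYMS.get(taxon, taxon) for taxon in tree] for tree in trees]

def delete_taxa(trees, unwanted):
    return [[taxon for taxon in tree if taxon not in unwanted] for tree in trees]

def ensure_overlap(trees, min_shared=2):
    reference = set(trees[0])
    return [tree for tree in trees if len(reference & set(tree)) >= min_shared]

pipeline = [
    standardise_names,
    lambda trees: delete_taxa(trees, unwanted={"Indet_sp"}),
    ensure_overlap,
]

trees = [["Homo_sapiens", "Canis_familiaris", "Indet_sp"],
         ["Homo_sapiens", "Canis_lupus", "Felis_catus"]]
for step in pipeline:
    trees = step(trees)
print(trees)
```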

  7. High-energy physics software parallelization using database techniques

    International Nuclear Information System (INIS)

    Argante, E.; Van der Stok, P.D.V.; Willers, I.

    1997-01-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. By basing CoCa on the database transaction paradigm, the complexity induced by the parallelization is for a large part transparent to the programmer, resulting in a higher level of abstraction than the native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with the performance of native PVM and MPI. (orig.)
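
    A loose illustration of the transaction idea is sketched below: each event is an atomic unit of work that is committed on success and re-queued on failure. This is only an analogy to the paradigm described above, not the CoCa programming model itself.

```python
# Toy sketch of transaction-style event processing: each event either commits
# its result or is aborted and retried. Illustration only, not the CoCa API.
from queue import Queue
import random

def process(event):
    if random.random() < 0.2:           # simulate a transient failure
        raise RuntimeError("aborted")
    return event ** 2

work, results = Queue(), []
for event in range(10):
    work.put(event)

while not work.empty():
    event = work.get()
    try:
        results.append(process(event))  # "commit" the result
    except RuntimeError:
        work.put(event)                 # "abort": the transaction is retried

print(sorted(results))
```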

  8. Performance and Reliability of Bonded Interfaces for High-Temperature Packaging. Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    DeVoto, Douglas [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-04-01

    Current generation automotive power electronics packages utilize silicon devices and lead-free solder alloys. To meet stringent technical targets for 2020 and beyond (for cost, power density, specific power, efficiency and reliability), wide-bandgap devices are being considered since they offer advantages such as operation at higher frequencies, voltages, and temperatures. Traditional power electronics packages must be redesigned to utilize the full potential of wide-bandgap devices, and the die- and substrate-attach layers are key areas where new material development and validation is required. Present solder alloys do not meet the performance requirements for these new package designs while also meeting cost and hazardous substance restrictions. Sintered silver (Ag) promises to meet the needs for die- and substrate-attach interfaces but synthesis optimization and reliability evaluation must be completed. Sintered Ag material was proposed as an alternative solution in power electronics packages almost 20 years ago. However, synthesis pressure requirements of up to 40 MPa caused higher complexity in the production process and more stringent flatness specifications for the substrates. Recently, several manufacturers have developed sintered Ag materials that require lower (3-5 MPa) or even no bonding pressures. Degradation mechanisms for these sintered Ag materials are not well known and need to be addressed. We are addressing these aspects to some extent in this project. We are developing generalized (i.e., independent of geometry) stress intensity factor versus cycles-to-failure relations for sintered Ag. Because sintered Ag is a relatively new material for automotive power electronics, the industry currently does not have a good understanding of recommended synthesis parameters or expected reliability under prescribed conditions. It is an important deliverable of this project to transfer findings to industry to eliminate barriers to using sintered Ag as a viable and

  9. Selection of software for mechanical engineering undergraduates

    International Nuclear Information System (INIS)

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S.

    2016-01-01

    A major problem with the undergraduate mechanical course is the limited exposure of students to software packages coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages for the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with the FEA; analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  10. Selection of software for mechanical engineering undergraduates

    Energy Technology Data Exchange (ETDEWEB)

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S., E-mail: ablicblau@swin.edu.au [Swinburne University of Technology, Faculty of Science Engineering and Technology, PO Box 218 Hawthorn, Victoria, Australia, 3122 (Australia)

    2016-07-12

    A major problem with the undergraduate mechanical course is the limited exposure of students to software packages coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages for the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with the FEA; analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
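
    One of the tasks listed in both records above, solving simultaneous non-linear equations, can be handled by most of the suggested tools; a minimal sketch using SciPy is shown below as one possible choice.

```python
# Minimal example of solving two simultaneous non-linear equations with SciPy;
# any comparable undergraduate-level package would serve the same purpose.
from scipy.optimize import fsolve

def equations(v):
    x, y = v
    return [x**2 + y**2 - 4.0,   # circle of radius 2
            x - y - 1.0]         # straight line

solution = fsolve(equations, x0=[1.0, 0.0])
print(solution)  # one intersection point of the circle and the line
```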

  11. High Performance Microaccelerometer with Wafer-level Hermetic Packaged Sensing Element and Continuous-time BiCMOS Interface Circuit

    International Nuclear Information System (INIS)

    Ko, Hyoungho; Park, Sangjun; Paik, Seung-Joon; Choi, Byoung-doo; Park, Yonghwa; Lee, Sangmin; Kim, Sungwook; Lee, Sang Chul; Lee, Ahra; Yoo, Kwangho; Lim, Jaesang; Cho, Dong-il

    2006-01-01

    A microaccelerometer with highly reliable, wafer-level packaged MEMS sensing element and fully differential, continuous time, low noise, BiCMOS interface circuit is fabricated. The MEMS sensing element is fabricated on a (111)-oriented SOI wafer by using the SBM (Sacrificial/Bulk Micromachining) process. To protect the silicon structure of the sensing element and enhance the reliability, a wafer level hermetic packaging process is performed by using a silicon-glass anodic bonding process. The interface circuit is fabricated using 0.8 μm BiCMOS process. The capacitance change of the MEMS sensing element is amplified by the continuous-time, fully-differential transconductance input amplifier. A chopper-stabilization architecture is adopted to reduce low-frequency noise including 1/f noise. The fabricated microaccelerometer has the total noise equivalent acceleration of 0.89 μg/√Hz, the bias instability of 490 μg, the input range of ±10 g, and the output nonlinearity of ±0.5 %FSO

  12. Automated load balancing in the ATLAS high-performance storage software

    CERN Document Server

    Le Goff, Fabrice; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment collects proton-proton collision events delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects, transports and eventually records event data from the detector at several gigabytes per second. The data are recorded on transient storage before being delivered to permanent storage. The transient storage consists of high-performance direct-attached storage servers accounting for about 500 hard drives. The transient storage operates dedicated software in the form of a distributed multi-threaded application. The workload includes both CPU-demanding and IO-oriented tasks. This paper presents the original application threading model for this particular workload, discussing the load-sharing strategy among the available CPU cores. The limitations of this strategy were reached in 2016 due to changes in the trigger configuration involving a new data distribution pattern. We then describe a novel data-driven load-sharing strategy, designed to automatical...
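
    The general idea of a data-driven load-sharing strategy, assigning each incoming piece of work to the least-loaded worker rather than following a fixed pattern, can be sketched as follows; this toy example does not reflect the ATLAS TDAQ implementation.

```python
# Toy illustration of data-driven load sharing: each incoming data chunk goes
# to the worker with the least accumulated work. Illustration only.
import heapq
import random

workers = [(0.0, w) for w in range(4)]       # (accumulated load, worker id)
heapq.heapify(workers)
assignment = {w: [] for _, w in workers}

for chunk_id in range(20):
    size = random.uniform(0.5, 2.0)          # size of the incoming data chunk
    load, worker = heapq.heappop(workers)    # currently least-loaded worker
    assignment[worker].append(chunk_id)
    heapq.heappush(workers, (load + size, worker))

for worker, chunks in assignment.items():
    print(f"worker {worker}: {len(chunks)} chunks")
```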

  13. Determination of phthalates released from paper packaging materials by solid-phase extraction-high-performance liquid chromatography.

    Science.gov (United States)

    Gao, Xin; Yang, Bofeng; Tang, Zhixu; Luo, Xin; Wang, Fengmei; Xu, Hui; Cai, Xue

    2014-01-01

    A solid phase extraction (SPE) high-performance liquid chromatography (HPLC) method was developed for the simultaneous determination of 10 phthalic acid esters (dimethyl phthalate, diethyl phthalate, dipropyl phthalate, benzylbutyl phthalate, diisobutyl phthalate, dicyclohexyl phthalate, diamyl phthalate, di-n-hexyl phthalate, di-n-octyl phthalate and di-2-ethylhexyl phthalate) released from food paper packaging materials. Distilled water, 3% acetic acid (w/v), 10% ethanol (v/v) and 95% ethanol (v/v) were used in place of the different types of food to simulate the migration of the 10 phthalic acid esters from food paper packaging materials; the phthalic acid esters in the four food simulants were enriched and purified by a C18 SPE column and nitrogen blowing, and quantified by HPLC with a diode array detector. The chromatographic conditions and extraction conditions were optimized and all 10 of the phthalic acid esters had a maximum absorbance at 224 nm. The method showed limits of detection in the range of 6.0-23.8 ng/mL; the correlation coefficients were greater than 0.9999 in all cases, recovery values ranged between 71.27 and 106.97% at spiking levels of 30, 60 and 90 ng/mL and relative standard deviation values ranged from 0.86 to 8.00%. The method was considered to be simple, fast and reliable for a study on the migration of these 10 phthalic acid esters from food paper packaging materials into food.

  14. SeisFlows-Flexible waveform inversion software

    Science.gov (United States)

    Modrak, Ryan T.; Borisov, Dmitry; Lefebvre, Matthieu; Tromp, Jeroen

    2018-06-01

    SeisFlows is an open source Python package that provides a customizable waveform inversion workflow and framework for research in oil and gas exploration, earthquake tomography, medical imaging, and other areas. New methods can be rapidly prototyped in SeisFlows by inheriting from default inversion or migration classes, and code can be tested on 2D examples before application to more expensive 3D problems. Wave simulations must be performed using an external software package such as SPECFEM3D. The ability to interface with external solvers lends flexibility, and the choice of SPECFEM3D as a default option provides optional GPU acceleration and other useful capabilities. Through support for massively parallel solvers and interfaces for high-performance computing (HPC) systems, inversions with thousands of seismic traces and billions of model parameters can be performed. So far, SeisFlows has run on clusters managed by the Department of Defense, Chevron Corp., Total S.A., Princeton University, and the University of Alaska, Fairbanks.
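
    The "inherit from a default class and override one step" style of prototyping mentioned above can be sketched as follows; the class and method names are hypothetical and are not the actual SeisFlows API.

```python
# Sketch of prototyping a new method by subclassing a default inversion class
# and overriding a single step. Names here are hypothetical, not SeisFlows'.
class DefaultInversion:
    def compute_direction(self, gradient):
        return [-g for g in gradient]            # steepest descent

    def iterate(self, model, gradient, step=0.1):
        direction = self.compute_direction(gradient)
        return [m + step * d for m, d in zip(model, direction)]

class PreconditionedInversion(DefaultInversion):
    """New method obtained by overriding one step of the default class."""
    def __init__(self, preconditioner):
        self.preconditioner = preconditioner

    def compute_direction(self, gradient):
        return [-p * g for p, g in zip(self.preconditioner, gradient)]

model = [1.0, 2.0, 3.0]
gradient = [0.5, -0.2, 0.1]
print(PreconditionedInversion([2.0, 1.0, 0.5]).iterate(model, gradient))
```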

  15. TopView - ATLAS top physics analysis package

    CERN Document Server

    Shibata, A

    2007-01-01

    TopView is a common analysis package which is widely used in the ATLAS top physics working group. The package is fully based on the official ATLAS software Athena and EventView and plays a central role in the collaborative analysis model. It is a functional package which accounts for a broad range of issues in implementing physics analysis. As well as being a modular framework suitable as a common workplace for collaborators, TopView implements numerous analysis tools including a complete top-antitop reconstruction and single top reconstruction. The package is currently used to produce common ntuples from Monte Carlo production, and future use cases are under rapid development. In this paper, the design and ideas behind TopView and the performance of the analyses implemented in the package are presented, with detailed documentation of the contents and instructions for using the package.

  16. Dynamic modelling and PID loop control of an oil-injected screw compressor package

    Science.gov (United States)

    Poli, G. W.; Milligan, W. J.; McKenna, P.

    2017-08-01

    A significant amount of time is spent tuning the PID (Proportional, Integral and Derivative) control loops of a screw compressor package due to the unique characteristics of the system. Common mistakes incurred during the tuning of a PID control loop include improper PID algorithm selection and unsuitable tuning parameters of the system resulting in erratic and inefficient operation. This paper details the design and development of software that aims to dynamically model the operation of a single stage oil injected screw compressor package deployed in upstream oil and gas applications. The developed software will be used to assess and accurately tune PID control loops present on the screw compressor package employed in controlling the oil pressures, temperatures and gas pressures, in a bid to improve control of the operation of the screw compressor package. Other applications of the modelling software will include its use as an evaluation tool that can estimate compressor package performance during start up, shutdown and emergency shutdown processes. The paper first details the study into the fundamental operational characteristics of each of the components present on the API 619 screw compressor package and then discusses the creation of a dynamic screw compressor model within the MATLAB/Simulink software suite. The paper concludes by verifying and assessing the accuracy of the created compressor model using data collected from physical screw compressor packages.
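
    For reference, a discrete PID loop of the kind being tuned here can be sketched in a few lines; the gains and the first-order plant below are illustrative stand-ins for, e.g., an oil-pressure loop, not parameters taken from the compressor model.

```python
# Minimal discrete PID loop driving a first-order lag plant toward a pressure
# setpoint; gains and plant constants are illustrative only.
def pid_step(error, state, kp, ki, kd, dt):
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

setpoint, pressure = 5.0, 0.0          # bar
state, dt = (0.0, 0.0), 0.1            # (integrator, previous error), time step
for _ in range(200):
    u, state = pid_step(setpoint - pressure, state, kp=1.2, ki=0.8, kd=0.05, dt=dt)
    pressure += dt * (u - pressure)    # first-order plant response
print(round(pressure, 3))              # should settle near the 5.0 bar setpoint
```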

  17. A process control software package for the SRS

    International Nuclear Information System (INIS)

    Atkins, V.R.; Poole, D.E.; Rawlinson, W.R.

    1980-03-01

    The development of software to give high level access from application programs for monitoring and control of the Daresbury Synchrotron Radiation Source on a network-wide basis is described. The design and implementation of the control system database, a special supervisor call and 'executive'-type task handling of all process input/output services for the 7/32 (which runs under 05/32-MT), and process control 'device driver' software for the 7/16 (run under L5/16-MT) are included. (UK)

  18. A multi-center study benchmarks software tools for label-free proteome quantification

    Science.gov (United States)

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  19. Opensource Software for MLR-Modelling of Solar Collectors

    DEFF Research Database (Denmark)

    Bacher, Peder; Perers, Bengt

    2011-01-01

    A first research version of a software package for multiple linear regression (MLR) modeling and analysis of solar collectors is now in operation, following ideas originating all the way from Walletun et al. (1986) and Perers (1987 and 1993). The tool has been implemented in the free and open-source program R (http://www.r-project.org/). Applications of the software package include: visual validation, resampling and conversion of data, collector performance testing analysis according to the European Standard EN 12975 (Fischer et al., 2004), statistical validation of results...

  20. New software for improving performance in wind farm operations

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Mark [Ekho for Wind (Canada)

    2011-07-01

    The performance of wind farms depends on multiple field and business systems. This makes operational planning difficult because so much data resides in separate systems, with duplication of data and no way of gathering all relevant data together in one place. The aim of this paper is to present a new software package, Ekho for Wind, which helps improve performance in wind farm operations by providing features such as high-level views, performance analysis, downtime tracking, quality data management and forecast generation. The software provides operational intelligence that offers incentives for continuous improvement. Ekho for Wind can bring such benefits as maximization of generation, increased lifetime of assets, minimization of costs and increased profitability. This presentation introduced a new software package for improving the performance of wind farms and the lifetime of assets, resulting in significant payback.

  1. DSISoft—a MATLAB VSP data processing package

    Science.gov (United States)

    Beaty, K. S.; Perron, G.; Kay, I.; Adam, E.

    2002-05-01

    DSISoft is a public domain vertical seismic profile processing software package developed at the Geological Survey of Canada. DSISoft runs under MATLAB version 5.0 and above and hence is portable between computer operating systems supported by MATLAB (i.e. Unix, Windows, Macintosh, Linux). The package includes modules for reading and writing various standard seismic data formats, and for data editing, sorting, filtering, and other basic processing. The processing sequence can be scripted, allowing batch processing and easy documentation. A structured format has been developed to ensure future additions to the package are compatible with existing modules. Interactive modules have been created using MATLAB's graphical user interface builder for displaying seismic data, picking first break times, examining frequency spectra, doing f-k filtering, and plotting the trace header information. DSISoft's modular design facilitates the incorporation of new processing algorithms as they are developed. This paper gives an overview of the scope of the software and serves as a guide for the addition of new modules.

  2. APFELgrid: a high performance tool for parton density determinations

    CERN Document Server

    Bertone, Valerio; Hartland, Nathan P.

    We present a new software package designed to reduce the computational burden of hadron collider measurements in Parton Distribution Function (PDF) fits. The APFELgrid package converts interpolated weight tables provided by APPLgrid files into a more efficient format for PDF fitting by combining them with PDF and $\alpha_s$ evolution factors provided by APFEL. This combination significantly reduces the number of operations required to calculate hadronic observables in PDF fits and simplifies the structure of the calculation into a readily optimised scalar product. We demonstrate that our technique can lead to a substantial speed improvement when compared to existing methods without any reduction in numerical accuracy.
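
    The reduction to a scalar product can be illustrated with a toy NumPy example: the grid weights and evolution factors are combined once before the fit, after which each observable in the fit's inner loop is a single dot product. The array shapes are illustrative and the real FK-table structure is richer than this sketch.

```python
# Toy illustration of precomputing a combined grid so that an observable
# becomes a plain scalar product with the PDF values at the fitting scale.
import numpy as np

rng = np.random.default_rng(0)
n_flav, n_x = 13, 50                            # illustrative grid dimensions

applgrid_weights = rng.random((n_flav, n_x))    # per-observable weight table
evolution = rng.random((n_flav, n_x))           # PDF/alpha_s evolution factors
combined = (applgrid_weights * evolution).ravel()   # done once, before the fit

pdf_at_fit_scale = rng.random(n_flav * n_x)     # changes at every fit iteration
observable = combined @ pdf_at_fit_scale        # fast inner loop: one dot product
print(observable)
```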

  3. HTSSIP: An R package for analysis of high throughput sequencing data from nucleic acid stable isotope probing (SIP) experiments.

    Directory of Open Access Journals (Sweden)

    Nicholas D Youngblut

    Full Text Available Combining high throughput sequencing with stable isotope probing (HTS-SIP) is a powerful method for mapping in situ metabolic processes to thousands of microbial taxa. However, accurately mapping metabolic processes to taxa is complex and challenging. Multiple HTS-SIP data analysis methods have been developed, including high-resolution stable isotope probing (HR-SIP), multi-window high-resolution stable isotope probing (MW-HR-SIP), quantitative stable isotope probing (qSIP), and ΔBD. Currently, there is no publicly available software designed specifically for analyzing HTS-SIP data. To address this shortfall, we have developed the HTSSIP R package, an open-source, cross-platform toolset for conducting HTS-SIP analyses in a straightforward and easily reproducible manner. The HTSSIP package, along with full documentation and examples, is available from CRAN at https://cran.r-project.org/web/packages/HTSSIP/index.html and Github at https://github.com/buckleylab/HTSSIP.

  4. A software package to process an INIS magnetic tape on the VAX computer

    International Nuclear Information System (INIS)

    Omar, A.A.; Mohamed, F.A.

    1991-01-01

    This paper presents a software package whose function is to process, on VAX computers, the magnetic tapes distributed by the International Atomic Energy Agency. These tapes contain abstracts of papers in the different branches of the nuclear field and are supplied by the International Nuclear Information System (INIS). This paper has two goals. First, it gives a procedure for processing any foreign magnetic tape on VAX computers. Second, it solves the problem of reading the INIS tapes on a non-IBM computer, thus allowing specialists to benefit from the large amount of information contained in these tapes. 11 fig

  5. Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems

    Science.gov (United States)

    Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.

    2011-01-01

    The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.

  6. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    NREL maintains a variety of application environment modules for use on the Peregrine high-performance computing system, including software applications listed by name and research area/discipline, and software libraries available for linking and loading.

  7. Progress in waste package and engineered barrier system performance assessment and design

    International Nuclear Information System (INIS)

    Van Luik, A.; Stahl, D.; Harrison, D.

    1993-01-01

    As part of the U.S. Department of Energy's evaluation of site suitability for a potential high-level radioactive waste repository, long-term interactions between the engineered barrier system and the site must be determined. This requires a waste-package/engineered-system design, a description of the environment around the emplacement zone, and models that simulate the operative processes describing these engineered/natural systems interactions. Candidate designs are being evaluated, including a more robust, multi-barrier waste package and a drift emplacement mode. The tools for evaluating designs and emplacement modes are the waste-package/engineered-system performance assessment codes currently available to the project. For assessments that support site suitability, environmental impact, or licensing decisions, more capable codes are needed. Code capability requirements are being written, and existing codes are to be evaluated against those requirements. Recommendations are being made to focus waste-package/engineered-system code development

  8. Ignominy: a tool for software dependency and metric analysis with examples from large HEP packages

    International Nuclear Information System (INIS)

    Tuura, L.A.; Taylor, L.

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. The authors also use it to evaluate and study the quality of external packages they plan to make use of. The authors describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. The authors also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects

  9. A high performance micro-pressure sensor based on a double-ended quartz tuning fork and silicon diaphragm in atmospheric packaging

    International Nuclear Information System (INIS)

    Cheng, Rongjun; Li, Cun; Zhao, Yulong; Li, Bo; Tian, Bian

    2015-01-01

    A resonant micro-pressure sensor based on a double-ended quartz tuning fork (DEQTF) and bossed silicon diaphragm in atmospheric packaging is presented. To achieve vacuum-free packaging with a high quality factor, the DEQTF is designed to resonate in an anti-phase vibration mode in a plane that is under the effect of slide-film damping. The feasibility is demonstrated by theoretical analysis and a finite element simulation. The dimensions of the DEQTF and diaphragm are optimized in accordance with the principles of improving sensitivity and minimizing energy dissipation. The sensor chip is fabricated using quartz and silicon micromachining technologies, and simply packaged in a stainless steel shell with standard atmosphere. The experimental setup is established for the calibration, where an additional sensor prototype without a pressure port is introduced as a frequency reference. By detecting the frequency difference of the tested sensor and reference sensor, the influences of environmental factors such as temperature and shocks on measuring accuracy are eliminated effectively. Under the action of a self-excitation circuit, static performance is obtained. The sensitivity of the sensor is 299 kHz kPa⁻¹ in the operating range of 0–10 kPa at room temperature. Testing results show a nonlinearity of 0.0278%FS, a hysteresis of 0.0207%FS and a repeatability of 0.0375%FS. The results indicate that the proposed sensor has favorable features, which provides a cost-effective and high-performance approach for low pressure measurement. (paper)

  10. A comparison of software programs to determine curie content

    International Nuclear Information System (INIS)

    Hansen, C.J.; Miller, C.C.

    1995-01-01

    Commercial nuclear power plants have used various methods to determine the curie content of radwaste packages to comply with shipping and disposal regulations. Several computer software packages are available which can determine the curie content of a package based on the geometry of the package and the dose rate of the package provided a given source spectrum. This paper will compare three of the more commonly used software packages. A brief review of the selection and use of software programs at Diablo Canyon Power Plant for radwaste and radioactive material shipments will be provided. These software packages are the PAKRAD program by Bechtel (which utilizes EPRI DOSCON data), RAMSHP by WMG and MICROSHIELD by Grove Engineering. A comparison of the software packages in the calculation of curie content for a box of dry active waste and a cartridge filter will be presented. A summary of program limitations will also be provided
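
    As a crude illustration of the underlying physics (not of any of the packages compared above), a point-source approximation relates a measured dose rate to activity via a gamma-ray dose constant, A = D d^2 / Gamma. The constant below is an approximate literature value for Co-60, and the sketch is in no way a substitute for the validated codes discussed in the paper.

```python
# Crude point-source estimate of activity from a measured dose rate,
# A = D * d^2 / Gamma. The gamma-ray constant is an approximate value for
# Co-60 and the result is illustrative only, not a shipping calculation.
GAMMA_CO60 = 1.32  # R*m^2 / (h*Ci), approximate gamma-ray constant for Co-60

def activity_from_dose_rate(dose_rate_r_per_h, distance_m, gamma=GAMMA_CO60):
    """Estimate activity in curies from a dose rate measured at a distance."""
    return dose_rate_r_per_h * distance_m**2 / gamma

# 20 mR/h measured at 1 m from a package assumed to act like a Co-60 point source.
print(round(activity_from_dose_rate(0.020, 1.0), 4), "Ci")
```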

  11. DaMiRseq-an R/Bioconductor package for data mining of RNA-Seq data: normalization, feature selection and classification.

    Science.gov (United States)

    Chiesa, Mattia; Colombo, Gualtiero I; Piacentini, Luca

    2018-04-15

    RNA-Seq is becoming the technique of choice for high-throughput transcriptome profiling, which, besides class comparison for differential expression, promises to be an effective and powerful tool for biomarker discovery. However, a systematic analysis of high-dimensional genomic data is a demanding task for such a purpose. DaMiRseq offers an organized, flexible and convenient framework to remove noise and bias, select the most informative features and perform accurate classification. DaMiRseq is developed for the R environment (R ≥ 3.4) and is released under GPL (≥2) License. The package runs on Windows, Linux and Macintosh operating systems and is freely available to non-commercial users at the Bioconductor open-source, open-development software project repository (https://bioconductor.org/packages/DaMiRseq/). In compliance with Bioconductor standards, the authors ensure stable package maintenance through software and documentation updates. luca.piacentini@ccfm.it. Supplementary data are available at Bioinformatics online.

  12. Investigations into High Temperature Components and Packaging

    Energy Technology Data Exchange (ETDEWEB)

    Marlino, L.D.; Seiber, L.E.; Scudiere, M.B.; Chinthavali, M.S.; McCluskey, F.P.

    2007-12-31

    The purpose of this report is to document the work that was performed at the Oak Ridge National Laboratory (ORNL) in support of the development of high temperature power electronics and components with monies remaining from the Semikron High Temperature Inverter Project managed by the National Energy Technology Laboratory (NETL). High temperature electronic components are needed to allow inverters to operate in more extreme operating conditions as required in advanced traction drive applications. The trend to try to eliminate secondary cooling loops and utilize the internal combustion (IC) cooling system, which operates with approximately 105 C water/ethylene glycol coolant at the output of the radiator, is necessary to further reduce vehicle costs and weight. The activity documented in this report includes development and testing of high temperature components, activities in support of high temperature testing, an assessment of several component packaging methods, and how elevated operating temperatures would impact their reliability. This report is organized with testing of new high temperature capacitors in Section 2 and testing of new 150 C junction temperature trench insulated gate bipolar transistor (IGBTs) in Section 3. Section 4 addresses some operational OPAL-GT information, which was necessary for developing module level tests. Section 5 summarizes calibration of equipment needed for the high temperature testing. Section 6 details some additional work that was funded on silicon carbide (SiC) device testing for high temperature use, and Section 7 is the complete text of a report funded from this effort summarizing packaging methods and their reliability issues for use in high temperature power electronics. Components were tested to evaluate the performance characteristics of the component at different operating temperatures. The temperature of the component is determined by the ambient temperature (i.e., temperature surrounding the device) plus the

  13. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    Science.gov (United States)

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to-date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlations coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with

  14. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    Directory of Open Access Journals (Sweden)

    Philipp Thomas

    Full Text Available The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA, which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to-date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlations coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network

  15. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    Science.gov (United States)

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen’s system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to-date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlations coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA’s performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with
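
    For orientation, the van Kampen ansatz underlying the system size expansion used by iNA can be stated compactly; the following is the standard textbook form rather than an excerpt from the records above.

```latex
% Van Kampen ansatz: molecule numbers n_i split into a macroscopic part and
% fluctuations of order the square root of the system size \Omega.
n_i = \Omega\,\phi_i + \Omega^{1/2}\,\epsilon_i
```

    To leading order the $\phi_i$ obey the deterministic rate equations, while the $\epsilon_i$ obey a linear Fokker-Planck equation whose solution is Gaussian; this is the Linear Noise Approximation referred to above.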

  16. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available Hadoop MapReduce is the programming model for designing automatically scalable distributed computing applications. It provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are difficult to migrate to a MapReduce private cloud, owing to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of existing systems, we design a framework in this work, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface through which users can fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this work focuses on multiuser workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to fairly share the jobs among machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework greatly improves time performance compared with the original package.
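
    The contrast between FIFO and a fairer "job-sharing" style assignment can be sketched as follows: jobs from several users are interleaved and each job goes to the currently least-loaded machine. This toy example illustrates the scheduling idea only, not the MC-Framework implementation.

```python
# Toy sketch of fair "job-sharing" assignment: interleave jobs across users so
# no single user's burst monopolises the queue, then give each job to the
# least-loaded machine. Illustration only, not the MC-Framework scheduler.
import heapq
from itertools import zip_longest

jobs_by_user = {
    "userA": [("A1", 5), ("A2", 5), ("A3", 5)],
    "userB": [("B1", 1)],
    "userC": [("C1", 2), ("C2", 2)],
}

# Round-robin across users instead of strict FIFO submission order.
interleaved = [job for group in zip_longest(*jobs_by_user.values())
               for job in group if job is not None]

machines = [(0, m) for m in range(2)]        # (accumulated load, machine id)
heapq.heapify(machines)
for name, cost in interleaved:
    load, m = heapq.heappop(machines)        # least-loaded machine gets the job
    print(f"{name} -> machine {m} (starts at t={load})")
    heapq.heappush(machines, (load + cost, m))
```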

  17. Spent Fuel Transportation Package Performance Study - Experimental Design Challenges

    International Nuclear Information System (INIS)

    Snyder, A. M.; Murphy, A. J.; Sprung, J. L.; Ammerman, D. J.; Lopez, C.

    2003-01-01

    Numerous studies of spent nuclear fuel transportation accident risks have been performed since the late seventies that considered shipping container design and performance. Based in part on these studies, NRC has concluded that the level of protection provided by spent nuclear fuel transportation package designs under accident conditions is adequate. [1] Furthermore, actual spent nuclear fuel transport experience showcases a safety record that is exceptional and unparalleled when compared to other hazardous materials transportation shipments. There has never been a known or suspected release of the radioactive contents from an NRC-certified spent nuclear fuel cask as a result of a transportation accident. In 1999 the United States Nuclear Regulatory Commission (NRC) initiated a study, the Package Performance Study, to demonstrate the performance of spent fuel and spent fuel packages during severe transportation accidents. NRC is not studying or testing its current regulations, as the rigorous regulatory accident conditions specified in 10 CFR Part 71 are adequate to ensure safe packaging and use. As part of this study, NRC currently plans on using detailed modeling followed by experimental testing to increase public confidence in the safety of spent nuclear fuel shipments. One of the aspects of this confirmatory research study is the commitment to solicit and consider public comment during the scoping phase and experimental design planning phase of this research

  18. High coherence plane breaking packaging for superconducting qubits

    Science.gov (United States)

    Bronn, Nicholas T.; Adiga, Vivekananda P.; Olivadese, Salvatore B.; Wu, Xian; Chow, Jerry M.; Pappas, David P.

    2018-04-01

    We demonstrate a pogo pin package for a superconducting quantum processor specifically designed with a nontrivial layout topology (e.g., a center qubit that cannot be accessed from the sides of the chip). Two experiments on two nominally identical superconducting quantum processors in pogo packages, which use commercially available parts and require modest machining tolerances, are performed at low temperature (10 mK) in a dilution refrigerator and both found to behave comparably to processors in standard planar packages with wirebonds where control and readout signals come in from the edges. Single- and two-qubit gate errors are also characterized via randomized benchmarking, exhibiting similar error rates as in standard packages, opening the possibility of integrating pogo pin packaging with extensible qubit architectures.

  19. Interim performance specifications for conceptual waste-package designs for geologic isolation in salt repositories

    International Nuclear Information System (INIS)

    1983-06-01

    The interim performance specifications and data requirements presented apply to conceptual waste package designs for all waste forms which will be isolated in salt geologic repositories. The waste package performance specifications and data requirements respond to the waste package performance criteria. Subject areas treated include: containment and controlled release, operational period safety, criticality control, identification, and waste package performance testing requirements. This document was generated for use in the development of conceptual waste package designs in salt. It will be revised as additional data, analyses, and regulatory requirements become available

  20. The equation of state package FEOS for high energy density matter

    Science.gov (United States)

    Faik, Steffen; Tauschwitz, Anna; Iosilevskiy, Igor

    2018-06-01

    Adequate equation of state (EOS) data is of high interest in the growing field of high energy density physics and especially essential for hydrodynamic simulation codes. The semi-analytical method used in the newly developed Frankfurt equation of state (FEOS) package provides easy and fast access to the EOS of - in principle - arbitrary materials. The code is based on the well known QEOS model (More et al., 1988; Young and Corey, 1995) and is a further development of the MPQeos code (Kemp and Meyer-ter Vehn, 1988; Kemp and Meyer-ter Vehn, 1998) from Max-Planck-Institut für Quantenoptik (MPQ) in Garching, Germany. The list of features includes the calculation of homogeneous mixtures of chemical elements and the description of the liquid-vapor two-phase region with or without a Maxwell construction. Full flexibility of the package is assured by its structure: a program library provides the EOS with an interface designed for Fortran or C/C++ codes. Two additional software tools allow for the generation of EOS tables in different file output formats and for the calculation and visualization of isolines and Hugoniot shock adiabats. As an example, the EOS of fused silica (SiO2) is calculated and compared to experimental data and other EOS codes.

  1. Validation of geotechnical software for repository performance assessment

    International Nuclear Information System (INIS)

    LeGore, T.; Hoover, J.D.; Khaleel, R.; Thornton, E.C.; Anantatmula, R.P.; Lanigan, D.C.

    1989-01-01

    An important step in the characterization of a high-level nuclear waste repository is to demonstrate that geotechnical software used in performance assessment correctly models the physical processes of interest. There is another type of validation, called software validation. It is based on meeting the requirements of specifications documents (e.g. IEEE specifications) and does not directly address the correctness of the specifications. The process of comparing physical experimental results with the predicted results should incorporate an objective measure of the level of confidence regarding correctness. This paper reports on a methodology developed to allow the experimental uncertainties to be explicitly included in the comparison process. The methodology also allows objective confidence levels to be associated with the software. In the event of a poor comparison, the method also lays the foundation for improving the software
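
    A comparison that keeps the experimental uncertainty explicit can be sketched with a normalised-deviation criterion; the metric and the threshold below are illustrative choices, not the specific methodology of the paper.

```python
# Sketch of a validation comparison with explicit experimental uncertainty:
# prediction and measurement agree when |predicted - measured| / sigma stays
# below a chosen threshold. Metric and threshold are illustrative only.
def agreement(predicted, measured, sigma, threshold=2.0):
    z = abs(predicted - measured) / sigma
    return z, z <= threshold

cases = [
    (10.2, 10.0, 0.3),   # (model prediction, measurement, 1-sigma uncertainty)
    (8.0, 10.0, 0.5),
]
for pred, meas, sig in cases:
    z, ok = agreement(pred, meas, sig)
    print(f"pred={pred} meas={meas}+/-{sig}: z={z:.2f} -> {'pass' if ok else 'fail'}")
```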

  2. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by the University of Alabama in Huntsville (UAH) and to provide versions of the software in a Macintosh- and Windows-compatible format. Appendix 1, the Science Requirements Document (SRD) Users Manual, is attached.

  3. ATLAS Software Installation on Supercomputers

    CERN Document Server

    Undrus, Alexander; The ATLAS collaboration

    2018-01-01

    PowerPC and high performance computers (HPC) are important resources for computing in the ATLAS experiment. The future LHC data processing will require more resources than Grid computing, currently using approximately 100,000 cores at well over 100 sites, can provide. Supercomputers are extremely powerful as they use the resources of hundreds of thousands of CPUs joined together. However, their architectures have different instruction sets. ATLAS binary software distributions for x86 chipsets do not fit these architectures, as emulation of these chipsets results in huge performance loss. This presentation describes the methodology of ATLAS software installation from source code on supercomputers. The installation procedure includes downloading the ATLAS code base as well as the source of about 50 external packages, such as ROOT and Geant4, followed by compilation, and rigorous unit and integration testing. The presentation reports the application of this procedure at Titan HPC and Summit PowerPC at Oak Ridge Computin...

  4. DOE progress in assessing the long term performance of waste package materials

    International Nuclear Information System (INIS)

    Berusch, A.; Gause, E.

    1987-01-01

    Under the Nuclear Waste Policy Act of 1982 (NWPA)[1], the US Dept. of Energy (DOE) is conducting activities to select and characterize candidate sites suitable for the construction and operation of a geologic repository for the disposal of high-level nuclear wastes. DOE is funding three first repository projects: Basalt Waste Isolation Project, BWIP; Nevada Nuclear Waste Isolation Project, NNWSI; and Salt Repository Project Office, SRPO. It is essential in the licensing process that DOE demonstrate to the NRC that the long-term performance of the materials and design will be in compliance with the requirements of 10 CFR 60.113 on substantially complete containment within the waste packages for 300 to 1000 years and a controlled release rate from the engineered barrier system (EBS) for 10,000 years of 1 part in 10⁵ per year for radionuclides present in defined quantities 100 years after permanent closure. Obviously, the time spans involved make it impractical to base the assessment of the long term performance of waste package materials on real time, prototypical testing. The assessment of performance will be implemented by the use of models that are supported by real time field and laboratory tests, monitoring, and natural analog studies. Each of the repository projects is developing a plan for demonstrating long-term waste package material performance depending on the particular materials and the package-perturbed, time-dependent environment under which the materials must function. An overview of progress in each of these activities for each of the projects is provided in the following

  5. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    Science.gov (United States)

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages.

  6. Development and Evaluation of an Open-Source Software Package “CGITA” for Quantifying Tumor Heterogeneity with Molecular Images

    Directory of Open Access Journals (Sweden)

    Yu-Hua Dean Fang

    2014-01-01

    Background. The quantification of tumor heterogeneity with molecular images, by analyzing the local or global variation in the spatial arrangements of pixel intensity with texture analysis, possesses great clinical potential for treatment planning and prognosis. To address the lack of software available in the public domain for computing tumor heterogeneity, we developed a software package, namely the Chang-Gung Image Texture Analysis (CGITA) toolbox, and provide it to the research community as a free, open-source project. Methods. With a user-friendly graphical interface, CGITA provides users with an easy way to compute more than seventy heterogeneity indices. To test and demonstrate the usefulness of CGITA, we used a small cohort of eighteen locally advanced oral cavity (ORC) cancer patients treated with definitive radiotherapies. Results. In our case study of ORC data, we found that more than ten of the currently implemented heterogeneity indices outperformed SUVmean for outcome prediction in the ROC analysis, with a higher area under the curve (AUC): heterogeneity indices provide an AUC of up to 0.9, compared with 0.6 and 0.52 for SUVmean and TLG, respectively. Conclusions. CGITA is a free and open-source software package to quantify tumor heterogeneity from molecular images. CGITA is available for free for academic use at http://code.google.com/p/cgita.
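
    As an illustration of the kind of heterogeneity-versus-outcome comparison described above, the following Python sketch computes a simple histogram-entropy heterogeneity index per lesion and compares its ROC AUC with that of SUVmean. It is not CGITA's implementation; the cohort, index definition and outcome labels are synthetic and hypothetical.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      def histogram_entropy(voxels, bins=64):
          """Shannon entropy of the intensity histogram inside a tumour ROI."""
          counts, _ = np.histogram(voxels, bins=bins)
          p = counts / counts.sum()
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))

      rng = np.random.default_rng(0)
      outcomes = rng.integers(0, 2, size=18)            # hypothetical binary outcomes
      # Hypothetical lesions: the two outcome groups get different SUV spreads.
      lesions = [rng.gamma(shape=2.0 + 3.0 * y, scale=1.5, size=500) for y in outcomes]

      entropy_index = np.array([histogram_entropy(v) for v in lesions])
      suv_mean = np.array([v.mean() for v in lesions])

      print("AUC (entropy index):", roc_auc_score(outcomes, entropy_index))
      print("AUC (SUVmean):      ", roc_auc_score(outcomes, suv_mean))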

  7. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  8. White LED with High Package Extraction Efficiency

    International Nuclear Information System (INIS)

    Yi Zheng; Stough, Matthew

    2008-01-01

    The goal of this project is to develop a high efficiency phosphor converting (white) Light Emitting Diode (pcLED) 1-Watt package through an increase in package extraction efficiency. A transparent/translucent monolithic phosphor is proposed to replace the powdered phosphor to reduce the scattering caused by phosphor particles. Additionally, a multi-layer thin film selectively reflecting filter is proposed between blue LED die and phosphor layer to recover inward yellow emission. At the end of the project we expect to recycle approximately 50% of the unrecovered backward light in current package construction, and develop a pcLED device with 80 lm/We using our technology improvements and commercially available chip/package source. The success of the project will benefit luminous efficacy of white LEDs by increasing package extraction efficiency. In most phosphor-converting white LEDs, the white color is obtained by combining a blue LED die (or chip) with a powdered phosphor layer. The phosphor partially absorbs the blue light from the LED die and converts it into a broad green-yellow emission. The mixture of the transmitted blue light and green-yellow light emerging gives white light. There are two major drawbacks for current pcLEDs in terms of package extraction efficiency. The first is light scattering caused by phosphor particles. When the blue photons from the chip strike the phosphor particles, some blue light will be scattered by phosphor particles. Converted yellow emission photons are also scattered. A portion of scattered light is in the backward direction toward the die. The amount of this backward light varies and depends in part on the particle size of phosphors. The other drawback is that yellow emission from phosphor powders is isotropic. Although some backward light can be recovered by the reflector in current LED packages, there is still a portion of backward light that will be absorbed inside the package and further converted to heat. Heat generated

  9. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Science.gov (United States)

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.
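
    The forward/back-projection pair at the heart of stage (4) drives the iterative reconstruction. The toy Python sketch below shows the MLEM update x_{k+1} = x_k / (A^T 1) * A^T (y / (A x_k)) on a random stand-in system matrix; it is only an illustration of the principle, not NiftyPET code or its API, and the scanner geometry and data here are synthetic.

      import numpy as np

      rng = np.random.default_rng(1)
      n_voxels, n_bins = 64, 256
      # Random sparse-ish "system matrix" standing in for the real scanner geometry.
      A = rng.random((n_bins, n_voxels)) * (rng.random((n_bins, n_voxels)) < 0.1)
      x_true = rng.random(n_voxels) * 50.0
      y = rng.poisson(A @ x_true)                  # noisy sinogram counts

      x = np.ones(n_voxels)                        # uniform initial image
      sens = A.T @ np.ones(n_bins)                 # sensitivity image A^T 1
      for _ in range(50):
          proj = A @ x                             # forward projection
          ratio = np.divide(y, proj, out=np.zeros_like(proj), where=proj > 0)
          x *= (A.T @ ratio) / np.maximum(sens, 1e-12)   # back projection + update

      print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))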

  10. High-performance mass storage system for workstations

    Science.gov (United States)

    Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.

    1993-01-01

    media, and the tapes are used as backup media. The storage system is managed by the IEEE mass storage reference model-based UniTree software package. UniTree software will keep track of all files in the system, will automatically migrate the lesser used files to archive media, and will stage the files when needed by the system. The user can access the files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys will significantly boost the system I/O performance and reduce the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., realtime data acquisition with a signal and image processing requirement, long-term data archiving and distribution, and image analysis and enhancement).

  11. Supporting Early Math--Rationales and Requirements for High Quality Software

    Science.gov (United States)

    Haake, Magnus; Husain, Layla; Gulz, Agneta

    2015-01-01

    There is substantial evidence that preschoolers' performance in early math is highly correlated with math performance throughout school as well as academic skills in general. One way to help children attain early math skills is by using targeted educational software and the paper discusses potential gains of using such software to support early math…

  12. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  13. Modified Hazard Ranking System/Hazard Ranking System for sites with mixed radioactive and hazardous wastes: Software documentation

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Peloquin, R.A.; Hawley, K.A.

    1986-11-01

    The mHRS/HRS software package was developed by the Pacific Northwest Laboratory (PNL) under contract with the Department of Energy (DOE) to provide a uniform method for DOE facilities to use in performing their Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Phase I Modified Hazard Ranking System or Hazard Ranking System evaluations. The program is designed to remove the tedium and potential for error associated with performing hand calculations and interpreting information in tables and reference books when performing an evaluation. The software package is designed to operate on a microcomputer (IBM PC, PC/XT, or PC/AT, or a compatible system) using either a dual floppy disk drive or a hard disk storage system. It is written in the dBASE III language and operates using the dBASE III system. Although the mHRS/HRS software package was developed for use at DOE facilities, it has direct applicability to performing CERCLA Phase I evaluations for any facility contaminated by hazardous waste. The software can perform evaluations using either the modified hazard ranking system methodology developed by DOE/PNL, the hazard ranking system methodology developed by EPA/MITRE Corp., or a combination of the two. This document is a companion manual to the mHRS/HRS user manual. It is intended for the programmer who must maintain the software package and for those interested in the computer implementation. This manual documents the system logic, computer programs, and data files that comprise the package. Hardware and software implementation requirements are discussed. In addition, hand calculations of three sample situations (problems) with associated computer runs used for the verification of program calculations are included.

  14. Modified Hazard Ranking System/Hazard Ranking System for sites with mixed radioactive and hazardous wastes: Software documentation

    International Nuclear Information System (INIS)

    Stenner, R.D.; Peloquin, R.A.; Hawley, K.A.

    1986-11-01

    The mHRS/HRS software package was developed by the Pacific Northwest Laboratory (PNL) under contract with the Department of Energy (DOE) to provide a uniform method for DOE facilities to use in performing their Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Phase I Modified Hazard Ranking System or Hazard Ranking System evaluations. The program is designed to remove the tedium and potential for error associated with performing hand calculations and interpreting information in tables and reference books when performing an evaluation. The software package is designed to operate on a microcomputer (IBM PC, PC/XT, or PC/AT, or a compatible system) using either a dual floppy disk drive or a hard disk storage system. It is written in the dBASE III language and operates using the dBASE III system. Although the mHRS/HRS software package was developed for use at DOE facilities, it has direct applicability to performing CERCLA Phase I evaluations for any facility contaminated by hazardous waste. The software can perform evaluations using either the modified hazard ranking system methodology developed by DOE/PNL, the hazard ranking system methodology developed by EPA/MITRE Corp., or a combination of the two. This document is a companion manual to the mHRS/HRS user manual. It is intended for the programmer who must maintain the software package and for those interested in the computer implementation. This manual documents the system logic, computer programs, and data files that comprise the package. Hardware and software implementation requirements are discussed. In addition, hand calculations of three sample situations (problems) with associated computer runs used for the verification of program calculations are included.

  15. MVPACK: a package for the computer-aided design of multivariable control systems

    International Nuclear Information System (INIS)

    Mensah, S.

    1984-01-01

    The design and analysis of high-performance controllers for large, complex plants requires a collection of interactive, powerful computer software. MVPACK, an open-ended package for the computer aided design of control systems, has been developed in the Reactor Control Branch of the Chalk River Nuclear Laboratories. The package is fully interactive, and includes a comprehensive state-of-the-art mathematical library to support development of complex multivariable control algorithms. Coded in RATFOR, MVPACK operates with a flexible data structure which makes efficient use of minicomputer resources and provides a standard framework for program generation. A built-in help mechanism further simplifies use of the package. This report provides the technical description of the package. It reviews the specifications used in the design and implementation of the package. The database structure, the supporting libraries and the design and analysis modules of MVPACK are described. The report includes several application examples to illustrate the capability of the package. Experience with MVPACK shows that the package provides a synergistic environment for control and regulation systems design, and that it is a unique tool in the training of control system engineers.

  16. A customizable software for fast reduction and analysis of large X-ray scattering data sets: applications of the new DPDAK package to small-angle X-ray scattering and grazing-incidence small-angle X-ray scattering.

    Science.gov (United States)

    Benecke, Gunthard; Wagermaier, Wolfgang; Li, Chenghao; Schwartzkopf, Matthias; Flucke, Gero; Hoerth, Rebecca; Zizak, Ivo; Burghammer, Manfred; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Trebbin, Martin; Förster, Stephan; Paris, Oskar; Roth, Stephan V; Fratzl, Peter

    2014-10-01

    X-ray scattering experiments at synchrotron sources are characterized by large and constantly increasing amounts of data. The great number of files generated during a synchrotron experiment is often a limiting factor in the analysis of the data, since appropriate software is rarely available to perform fast and tailored data processing. Furthermore, it is often necessary to perform online data reduction and analysis during the experiment in order to interactively optimize experimental design. This article presents an open-source software package developed to process large amounts of data from synchrotron scattering experiments. These data reduction processes involve calibration and correction of raw data, one- or two-dimensional integration, as well as fitting and further analysis of the data, including the extraction of certain parameters. The software, DPDAK (directly programmable data analysis kit), is based on a plug-in structure and allows individual extension in accordance with the requirements of the user. The article demonstrates the use of DPDAK for on- and offline analysis of scanning small-angle X-ray scattering (SAXS) data on biological samples and microfluidic systems, as well as for a comprehensive analysis of grazing-incidence SAXS data. In addition to a comparison with existing software packages, the structure of DPDAK and the possibilities and limitations are discussed.
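
    A central reduction step mentioned above is the one-dimensional integration of a two-dimensional detector frame. The following Python sketch shows a minimal radial integration with NumPy on a synthetic ring pattern; the beam centre, pixel geometry and image are hypothetical, and this is not DPDAK's plug-in API.

      import numpy as np

      def radial_profile(image, center):
          """Average intensity in integer-radius bins around the beam centre."""
          y, x = np.indices(image.shape)
          r = np.hypot(x - center[0], y - center[1]).astype(int)
          sums = np.bincount(r.ravel(), weights=image.ravel())
          counts = np.bincount(r.ravel())
          return sums / np.maximum(counts, 1)

      # Synthetic detector frame: isotropic scattering ring plus background noise.
      ny, nx, cx, cy = 512, 512, 256.0, 256.0
      yy, xx = np.indices((ny, nx))
      rr = np.hypot(xx - cx, yy - cy)
      image = np.exp(-((rr - 120.0) ** 2) / 50.0) + 0.05 * np.random.default_rng(2).random((ny, nx))

      profile = radial_profile(image, (cx, cy))
      print("peak radius (pixels):", int(np.argmax(profile)))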

  17. Software needs engineering - a position paper

    OpenAIRE

    GRIMSON, JANE BARCLAY

    2000-01-01

    When the general press refers to `software' in its headlines, then this is often not to relate a success story, but to expand on yet another `software-risk-turned-problem-story'. For many people, the term `software' evokes the image of an application package running either on a PC or some similar stand-alone usage. Over 70% of all software, however, is not developed in the traditional software houses as part of the creation of such packages. Much of this software comes in the fo...

  18. Petroleum software profiles

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including, among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd.

  19. Software for Managing Personal Files.

    Science.gov (United States)

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  20. ATACseqQC: a Bioconductor package for post-alignment quality assessment of ATAC-seq data.

    Science.gov (United States)

    Ou, Jianhong; Liu, Haibo; Yu, Jun; Kelliher, Michelle A; Castilla, Lucio H; Lawson, Nathan D; Zhu, Lihua Julie

    2018-03-01

    ATAC-seq (Assays for Transposase-Accessible Chromatin using sequencing) is a recently developed technique for genome-wide analysis of chromatin accessibility. Compared to earlier methods for assaying chromatin accessibility, ATAC-seq is faster and easier to perform, does not require cross-linking, has higher signal to noise ratio, and can be performed on small cell numbers. However, to ensure a successful ATAC-seq experiment, step-by-step quality assurance processes, including both wet lab quality control and in silico quality assessment, are essential. While several tools have been developed or adopted for assessing read quality, identifying nucleosome occupancy and accessible regions from ATAC-seq data, none of the tools provide a comprehensive set of functionalities for preprocessing and quality assessment of aligned ATAC-seq datasets. We have developed a Bioconductor package, ATACseqQC, for easily generating various diagnostic plots to help researchers quickly assess the quality of their ATAC-seq data. In addition, this package contains functions to preprocess aligned ATAC-seq data for subsequent peak calling. Here we demonstrate the utilities of our package using 25 publicly available ATAC-seq datasets from four studies. We also provide guidelines on what the diagnostic plots should look like for an ideal ATAC-seq dataset. This software package has been used successfully for preprocessing and assessing several in-house and public ATAC-seq datasets. Diagnostic plots generated by this package will facilitate the quality assessment of ATAC-seq data, and help researchers to evaluate their own ATAC-seq experiments as well as select high-quality ATAC-seq datasets from public repositories such as GEO to avoid generating hypotheses or drawing conclusions from low-quality ATAC-seq experiments. The software, source code, and documentation are freely available as a Bioconductor package at https://bioconductor.org/packages/release/bioc/html/ATACseqQC.html .

  1. Mirion--a software package for automatic processing of mass spectrometric images.

    Science.gov (United States)

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the Life Sciences. In recent years, the development of new instruments employing ion sources that are tailored for spatial scanning allowed the acquisition of large data sets. A subsequent data processing, however, is still a bottleneck in the analytical process, as a manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and easier than plain numbers. Image generation, thus, is a time-consuming and complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.

  2. The Caviar software package for the astrometric reduction of Cassini ISS images: description and examples

    Science.gov (United States)

    Cooper, N. J.; Lainey, V.; Meunier, L.-E.; Murray, C. D.; Zhang, Q.-F.; Baillie, K.; Evans, M. W.; Thuillot, W.; Vienne, A.

    2018-02-01

    Aims: Caviar is a software package designed for the astrometric measurement of natural satellite positions in images taken using the Imaging Science Subsystem (ISS) of the Cassini spacecraft. Aspects of the structure, functionality, and use of the software are described, and examples are provided. The integrity of the software is demonstrated by generating new measurements of the positions of selected major satellites of Saturn, 2013-2016, along with their observed minus computed (O-C) residuals relative to published ephemerides. Methods: Satellite positions were estimated by fitting a model to the imaged limbs of the target satellites. Corrections to the nominal spacecraft pointing were computed using background star positions based on the UCAC5 and Tycho2 star catalogues. UCAC5 is currently used in preference to Gaia-DR1 because of the availability of proper motion information in UCAC5. Results: The Caviar package is available for free download. A total of 256 new astrometric observations of the Saturnian moons Mimas (44), Tethys (58), Dione (55), Rhea (33), Iapetus (63), and Hyperion (3) have been made, in addition to opportunistic detections of Pandora (20), Enceladus (4), Janus (2), and Helene (5), giving an overall total of 287 new detections. Mean observed-minus-computed residuals for the main moons relative to the JPL SAT375 ephemeris were -0.66 ± 1.30 pixels in the line direction and 0.05 ± 1.47 pixels in the sample direction. Mean residuals relative to the IMCCE NOE-6-2015-MAIN-coorb2 ephemeris were -0.34 ± 0.91 pixels in the line direction and 0.15 ± 1.65 pixels in the sample direction. The reduced astrometric data are provided in the form of satellite positions for each image. The reference star positions are included in order to allow reprocessing at some later date using improved star catalogues, such as later releases of Gaia, without the need to re-estimate the imaged star positions. The Caviar software is available for free download from: ftp://ftp.imcce.fr/pub/softwares

  3. A cross-validation package driving Netica with python

    Science.gov (United States)

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

    Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross validation is a technique to avoid overfitting resulting from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation, and its implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
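
    The underlying idea, train-versus-test skill diverging as model complexity outgrows the data, is generic. The Python sketch below illustrates it with k-fold cross-validation of polynomial fits of increasing degree; a polynomial stands in for a Bayesian network here, the data are synthetic, and nothing below uses CVNetica or Netica itself.

      import numpy as np
      from sklearn.model_selection import KFold

      rng = np.random.default_rng(3)
      x = np.sort(rng.uniform(-1, 1, 40))
      y = np.sin(3 * x) + 0.3 * rng.standard_normal(x.size)

      for degree in (1, 3, 9, 15):                  # increasing "complexity"
          train_err, test_err = [], []
          for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(x):
              coeffs = np.polyfit(x[train_idx], y[train_idx], degree)
              train_err.append(np.mean((np.polyval(coeffs, x[train_idx]) - y[train_idx]) ** 2))
              test_err.append(np.mean((np.polyval(coeffs, x[test_idx]) - y[test_idx]) ** 2))
          print(f"degree {degree:2d}: train MSE {np.mean(train_err):.3f}, "
                f"test MSE {np.mean(test_err):.3f}")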

  4. Camac Software for TJ-I and TJ-IU

    International Nuclear Information System (INIS)

    Milligen, B. Ph. van.

    1994-01-01

    A user-friendly software package for control of CAMAC data acquisition modules for the TJ-I and TJ-IU experiments at the Asociación CIEMAT para Fusión has been developed. The CAMAC control software operates in synchronization with the pre-existing VME-based data-acquisition system. The control software controls the setup of the CAMAC modules and manages the data flow from the taking to the storage of data. Data file management is performed largely automatically. Further, user software is provided for viewing and analysing the data.

  5. Photons, photosynthesis, and high-performance computing: challenges, progress, and promise of modeling metabolism in green algae

    International Nuclear Information System (INIS)

    Chang, C H; Graf, P; Alber, D M; Kim, K; Murray, G; Posewitz, M; Seibert, M

    2008-01-01

    The complexity associated with biological metabolism considered at a kinetic level presents a challenge to quantitative modeling. In particular, the relatively sparse knowledge of parameters for enzymes with known kinetic responses is problematic. The possible space of these parameters is of high-dimension, and sampling of such a space typifies a combinatorial explosion of possible dynamic states. However, with sufficient quantitative transcriptomics, proteomics, and metabolomics data at hand, these challenges could be met by high-performance software with sampling, fitting, and optimization capabilities. With this in mind, we present the High-Performance Systems Biology Toolkit HiPer SBTK, an evolving software package to simulate, fit, and optimize metabolite concentrations and fluxes within the space of rate and binding parameters associated with detailed enzyme kinetic models. We present our chosen modeling paradigm for the formulation of metabolic pathway models, the means to address the challenge of representing such models in a precise and persistent fashion using the standardized Systems Biology Markup Language, and our second-generation model of H2-associated Chlamydomonas metabolism. Processing of such models for hierarchically parallelized simulation and optimization, job specification by the user through a GUI interface, software capabilities and initial scaling data, and the mapping of the computation to biological questions is also discussed. Moreover, we present near-term future software and model development goals
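
    The fit-and-optimize problem described above, estimating enzyme kinetic parameters so that a model reproduces measured metabolite time courses, can be illustrated on a toy scale. The Python sketch below fits Vmax and Km of a single Michaelis-Menten step to noisy synthetic substrate data with SciPy; the model, data and parameter values are hypothetical, and this is not the HiPer SBTK code.

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import least_squares

      def substrate_timecourse(params, t, s0=10.0):
          """Integrate ds/dt = -Vmax * s / (Km + s) and sample it at times t."""
          vmax, km = params
          sol = solve_ivp(lambda _, s: -vmax * s / (km + s), (t[0], t[-1]), [s0], t_eval=t)
          return sol.y[0]

      t = np.linspace(0.0, 20.0, 40)
      true_params = (1.2, 3.5)
      rng = np.random.default_rng(4)
      data = substrate_timecourse(true_params, t) + 0.1 * rng.standard_normal(t.size)

      fit = least_squares(lambda p: substrate_timecourse(p, t) - data, x0=(0.5, 1.0),
                          bounds=([0.01, 0.1], [10.0, 50.0]))
      print("estimated (Vmax, Km):", fit.x)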

  6. Communication Software Performance for Linux Clusters with Mesh Connections

    Energy Technology Data Exchange (ETDEWEB)

    Jie Chen; William Watson

    2003-09-01

    Recent progress in copper-based commodity Gigabit Ethernet interconnects enables constructing clusters to achieve extremely high I/O bandwidth at low cost with mesh connections. However, the TCP/IP protocol stack cannot match the improved performance of Gigabit Ethernet networks, especially in the case of multiple interconnects on a single host. In this paper, we evaluate and compare the performance characteristics of TCP/IP and M-VIA software that is an implementation of VIA. In particular, we focus on the performance of the software systems for a mesh communication architecture and demonstrate the feasibility of using multiple Gigabit Ethernet cards on one host to achieve aggregated bandwidth and latency that are not only better than what TCP provides but also compare favorably to some of the special purpose high-speed networks. In addition, implementation of a new M-VIA driver for one type of Gigabit Ethernet card will be discussed.
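
    A hedged Python sketch of the kind of latency/throughput microbenchmark such comparisons rely on is shown below. It measures round-trip time and two-way throughput over a loopback TCP echo connection; the message size, port and round count are arbitrary, loopback numbers are only illustrative, and nothing here is specific to M-VIA or Gigabit Ethernet hardware.

      import socket, threading, time

      HOST, PORT, MSG_SIZE, ROUNDS = "127.0.0.1", 50007, 64 * 1024, 200

      def echo_server():
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
              srv.bind((HOST, PORT))
              srv.listen(1)
              conn, _ = srv.accept()
              with conn:
                  while True:
                      data = conn.recv(MSG_SIZE)
                      if not data:
                          break
                      conn.sendall(data)            # echo everything back

      threading.Thread(target=echo_server, daemon=True).start()
      time.sleep(0.2)                               # give the server time to bind

      payload = b"x" * MSG_SIZE
      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
          cli.connect((HOST, PORT))
          start = time.perf_counter()
          for _ in range(ROUNDS):
              cli.sendall(payload)
              received = 0
              while received < MSG_SIZE:
                  received += len(cli.recv(MSG_SIZE - received))
          elapsed = time.perf_counter() - start

      print(f"round-trip: {1e3 * elapsed / ROUNDS:.3f} ms per {MSG_SIZE // 1024} KiB message")
      print(f"throughput: {2 * ROUNDS * MSG_SIZE / elapsed / 1e6:.1f} MB/s (both directions)")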

  7. [Simultaneous determination of six fluorescent whitening agents in plastic and paper packaging materials by high performance liquid chromatography].

    Science.gov (United States)

    Zhang, Juzhou; Ji, Shuilin; Cai, Huimei; Li, Jing; Wang, Yongxin; Wang, Jingqiu

    2017-11-08

    A novel analytical method was developed for the simultaneous determination of six fluorescent whitening agents (FWAs: FWA 135, FWA 184, FWA 185, FWA 199, FWA 378 and FWA 393) in paper and plastic food packaging materials by high performance liquid chromatography with fluorescence detection (HPLC-FLD). The sample was extracted with a mixed solution of chloroform and acetonitrile (3:7, v/v), then cleaned up by an HLB solid phase extraction column. Qualitative and quantitative analyses were carried out by HPLC. The sample was separated on a Phenomenex C18 column using acetonitrile and 5 mmol/L ammonium acetate aqueous solution as mobile phases. The results indicated that the linear range of FWA 393 was 15-1500 μg/L and the linear ranges of the other five FWAs were 5-500 μg/L with correlation coefficients greater than 0.999. The recoveries in spiked samples were between 80.4% and 125.0% with RSDs (n = 6) of 1%-13%. Furthermore, this method was applied to analyze 12 samples from the market to verify the practicality of the method. The method showed the advantages of simplicity, high recovery and good precision, and is suitable for the detection of the six fluorescent whitening agents in food packaging materials.
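
    The linear range, correlation coefficient and spike recovery quoted above come from standard calibration arithmetic. The Python sketch below shows that arithmetic on made-up example numbers for one hypothetical FWA; the concentrations and peak areas are not the paper's data.

      import numpy as np

      # Calibration standards for one hypothetical FWA (µg/L) and measured peak areas.
      conc = np.array([5, 25, 50, 100, 250, 500], dtype=float)
      area = np.array([1.02e3, 5.10e3, 1.01e4, 2.05e4, 5.08e4, 1.01e5])

      slope, intercept = np.polyfit(conc, area, 1)        # linear calibration curve
      r = np.corrcoef(conc, area)[0, 1]                   # correlation coefficient
      print(f"calibration: area = {slope:.1f} * conc + {intercept:.1f}, r = {r:.4f}")

      # Spike recovery: known amount added, concentration back-calculated from the curve.
      spiked_conc = 100.0                                 # µg/L added to a blank sample
      measured_area = 2.01e4
      found_conc = (measured_area - intercept) / slope
      print(f"recovery: {100 * found_conc / spiked_conc:.1f} %")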

  8. 33rd International School of Mathematics "G. Stampacchia": High Performance Algorithms and Software for Nonlinear Optimization "Ettore Majorana"

    CERN Document Server

    Murli, Almerico; High Performance Algorithms and Software for Nonlinear Optimization

    2003-01-01

    This volume contains the edited texts of the lectures presented at the Workshop on High Performance Algorithms and Software for Nonlinear Optimization held in Erice, Sicily, at the "G. Stampacchia" School of Mathematics of the "E. Majorana" Centre for Scientific Culture, June 30 - July 8, 2001. In the first year of the new century, the aim of the Workshop was to assess the past and to discuss the future of Nonlinear Optimization, and to highlight recent achievements and promising research trends in this field. An emphasis was requested on algorithmic and high performance software developments and on new computational experiences, as well as on theoretical advances. We believe that such a goal was basically achieved. The Workshop was attended by 71 people from 22 countries. Although not all topics were covered, the presentations gave indeed a wide overview of the field, from different and complementary standpoints. Besides the lectures, several formal and informal discussions took place. We wish ...

  9. GENII Version 2 Software Design Document

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

    This document describes the architectural design for the GENII-V2 software package. This document defines details of the overall structure of the software, the major software components, their data file interfaces, and specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclide in these media is provided. This report represents a detailed description of the capabilities of the software product with exact specifications of mathematical models that form the basis for the software implementation and testing efforts. This report also presents a detailed description of the overall structure of the software package, details of main components (implemented in the current phase of work), details of data communication files, and content of basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allowing use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying

  10. “DETECTION ARTIFACTS” SOFTWARE PACKAGE: FUNCTIONAL CAPABILITIES AND PROSPECTS OF USING (ON THE EXAMPLE OF GEOARCHEOLOGICAL RESEARCH

    Directory of Open Access Journals (Sweden)

    Ye. P. Krupochkin

    2017-01-01

    Mathematical and scientific methods are highly significant in modern geoarcheological study. They contribute to the development of new computer technologies and their implementation in geoarcheological research, in particular the decoding and photogrammetric processing of space images. The article focuses on the "Detection Artifacts" software package designed for thematic aerospace image decoding, which is aimed at making the search for various archeological sites, both natural and artificially created, automatic. The main attention is drawn to the decoding of archeological sites using methods of morphological analysis and indicative decoding. Its work is based on two groups of methods of computer image processing: (1) an image enhancement method carried out with the help of spatial frequency filtration, and (2) a method of morphometric analysis. The methods of spatial frequency filtration can be used to solve two problems: information noise minimization and edge enhancement. To achieve the best results using the methods of spatial frequency filtration it is necessary to have all the information of relevance to the objects being searched for. Searching for various archeological sites is not only a photogrammetric task; in fact, this problem can be solved within photogrammetry with the application of aerospace and computer methods. The authors stress this point in order to avoid terminology ambiguity and confusion when describing the essence of the methods and processes. It should be noted that work with the images must be executed in a strict sequence. First and foremost, photogrammetric processing (atmospheric correction, geometric adjustment, conversion and geo-targeting) should be implemented, and only after that can one proceed to decoding the information. When creating the software package a modular structure was applied, which benefited the tasks being solved and corresponded to the conception of search for archaeological objects.
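
    As a hedged illustration of the first method group, spatial frequency filtration for edge enhancement, the Python sketch below applies a Gaussian high-pass filter in the Fourier domain to a synthetic image containing a faint rectangular "enclosure". It is not the "Detection Artifacts" implementation; the filter parameters and test image are hypothetical.

      import numpy as np

      def highpass_filter(image, sigma=10.0):
          """Suppress low spatial frequencies to emphasise edges (e.g. earthwork outlines)."""
          f = np.fft.fftshift(np.fft.fft2(image))
          ny, nx = image.shape
          y, x = np.indices((ny, nx))
          d2 = (x - nx / 2) ** 2 + (y - ny / 2) ** 2
          f *= 1.0 - np.exp(-d2 / (2.0 * sigma ** 2))   # 1 minus a Gaussian low-pass
          return np.real(np.fft.ifft2(np.fft.ifftshift(f)))

      # Synthetic "aerial image": smooth noisy background plus a faint rectangular feature.
      img = np.random.default_rng(5).normal(100.0, 2.0, (256, 256))
      for sl in [(slice(100, 160), slice(100, 104)), (slice(100, 160), slice(156, 160)),
                 (slice(100, 104), slice(100, 160)), (slice(156, 160), slice(100, 160))]:
          img[sl] += 5.0

      edges = highpass_filter(img)
      print("edge response on feature vs background:",
            edges[100:160, 100:104].std(), edges[:50, :50].std())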

  11. XMRF: an R package to fit Markov Networks to high-throughput genetics data.

    Science.gov (United States)

    Wan, Ying-Wooi; Allen, Genevera I; Baker, Yulia; Yang, Eunho; Ravikumar, Pradeep; Anderson, Matthew; Liu, Zhandong

    2016-08-26

    Technological advances in medicine have led to a rapid proliferation of high-throughput "omics" data. Tools to mine this data and discover disrupted disease networks are needed as they hold the key to understanding complicated interactions between genes, mutations and aberrations, and epi-genetic markers. We developed an R software package, XMRF, that can be used to fit Markov Networks to various types of high-throughput genomics data. Encoding the models and estimation techniques of the recently proposed exponential family Markov Random Fields (Yang et al., 2012), our software can be used to learn genetic networks from RNA-sequencing data (counts via Poisson graphical models), mutation and copy number variation data (categorical via Ising models), and methylation data (continuous via Gaussian graphical models). XMRF is the only tool that allows network structure learning using the native distribution of the data instead of the standard Gaussian. Moreover, the parallelization feature of the implemented algorithms computes large-scale biological networks efficiently. XMRF is available from CRAN and GitHub (https://github.com/zhandong/XMRF).
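
    XMRF itself is an R package; a rough Python analogue exists only for the Gaussian case, where the network (sparse precision matrix) can be estimated with the graphical lasso. The sketch below does this on synthetic continuous data; the count (Poisson) and binary (Ising) models that distinguish XMRF have no direct scikit-learn equivalent, so nothing here reproduces XMRF's API or its non-Gaussian models.

      import numpy as np
      from sklearn.covariance import GraphicalLasso

      rng = np.random.default_rng(6)
      n_samples, n_genes = 200, 10

      # Ground-truth sparse precision matrix: a simple chain graph over 10 "genes".
      prec = np.eye(n_genes) * 2.0
      for i in range(n_genes - 1):
          prec[i, i + 1] = prec[i + 1, i] = 0.6
      cov = np.linalg.inv(prec)
      X = rng.multivariate_normal(np.zeros(n_genes), cov, size=n_samples)

      model = GraphicalLasso(alpha=0.1).fit(X)
      edges = np.abs(model.precision_) > 0.05       # threshold the estimated precision
      np.fill_diagonal(edges, False)
      print("recovered edges:", np.argwhere(np.triu(edges)))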

  12. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    Science.gov (United States)

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  13. An analytical one-dimensional model for predicting waste package performance

    International Nuclear Information System (INIS)

    Relyea, J.F.; Wood, M.I.

    1984-01-01

    A method for allocating waste package performance requirements among waste package components with regard to radionuclide isolation has been developed. Modification or change in this approach can be expected as the understanding of radionuclide behavior in the waste package improves. Thus, the performance requirements derived in this document are preliminary and subject to change. However, this kind of analysis is a useful starting point. It has also proved useful for identifying a small group of radionuclides which should be emphasized in a laboratory experimental program designed to characterize the behavior of specific radionuclides in the waste package environment. A simple one-dimensional, two media transport model has been derived and used to calculate radionuclide transport from the waste form-packing material interface of the waste package into the host rock. Cumulative release over 10,000 years, maximum yearly releases and release rates at the packing material-host rock interface were evaluated on a radionuclide-by radionuclide basis. The major parameters controlling radionuclide release were found to be: radionuclide solubility, porosity of the rock, isotopic ratio of the radionuclide and surface area of the waste form-packing material interface. 15 refs., 2 figs., 16 tabs

  14. P-SPARSLIB: A parallel sparse iterative solution package

    Energy Technology Data Exchange (ETDEWEB)

    Saad, Y. [Univ. of Minnesota, Minneapolis, MN (United States)

    1994-12-31

    Iterative methods are gaining popularity in engineering and sciences at a time where the computational environment is changing rapidly. P-SPARSLIB is a project to build a software library for sparse matrix computations on parallel computers. The emphasis is on iterative methods and the use of distributed sparse matrices, an extension of the domain decomposition approach to general sparse matrices. One of the goals of this project is to develop a software package geared towards specific applications. For example, the author will test the performance and usefulness of P-SPARSLIB modules on linear systems arising from CFD applications. Equally important is the goal of portability. In the long run, the author wishes to ensure that this package is portable on a variety of platforms, including SIMD environments and shared memory environments.
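
    For readers unfamiliar with the class of solvers P-SPARSLIB targets, the Python sketch below shows a single-node preconditioned iterative solve with SciPy: GMRES with an incomplete-LU preconditioner on a sparse 2-D Poisson matrix. It only illustrates the numerical technique; the distributed-memory, domain-decomposition aspects that define P-SPARSLIB are not represented, and none of this is P-SPARSLIB code.

      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      n = 50                                        # grid is n x n
      I = sp.identity(n)
      T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
      A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()   # 2-D Poisson operator
      b = np.ones(A.shape[0])

      ilu = spla.spilu(A, drop_tol=1e-4)            # incomplete-LU factorization
      M = spla.LinearOperator(A.shape, matvec=ilu.solve)   # preconditioner

      x, info = spla.gmres(A, b, M=M, maxiter=200)
      print("converged" if info == 0 else f"GMRES stopped with info={info}",
            "| residual norm:", np.linalg.norm(b - A @ x))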

  15. The Next Generation in Subsidence and Aquifer-System Compaction Modeling within the MODFLOW Software Family: A New Package for MODFLOW-2005 and MODFLOW-OWHM

    Science.gov (United States)

    Boyce, S. E.; Leake, S. A.; Hanson, R. T.; Galloway, D. L.

    2015-12-01

    The Subsidence and Aquifer-System Compaction Packages, SUB and SUB-WT, for MODFLOW are two currently supported subsidence packages within the MODFLOW family of software. The SUB package allows the calculation of instantaneous and delayed releases of water from distributed interbeds (relatively more compressible fine-grained sediments) within a saturated aquifer system or discrete confining beds. The SUB-WT package does not include delayed releases, but does perform a more rigorous calculation of vertical stresses that can vary the effective stress that causes compaction. This calculation of instantaneous compaction can include the effect of water-table fluctuations for unconfined aquifers on effective stress, and can optionally adjust the elastic and inelastic storage properties based on the changes in effective stress. The next generation of subsidence modeling in MODFLOW is under development, and will merge and enhance the capabilities of the SUB and SUB-WT Packages for MODFLOW-2005 and MODFLOW-OWHM. This new version will also provide some additional features such as stress dependent vertical hydraulic conductivity of interbeds, time-varying geostatic loads, and additional attributes related to aquifer-system compaction and subsidence that will broaden the class of problems that can be simulated. The new version will include a redesigned source code, a new user friendly input file structure, more output options, and new subsidence solution options. This presentation will discuss progress in developing the new package and the new features being implemented and their potential applications.

  16. Some design constraints required for the use of generic software in embedded systems: Packages which manage abstract dynamic structures without the need for garbage collection

    Science.gov (United States)

    Johnson, Charles S.

    1986-01-01

    The embedded systems running real-time applications, for which Ada was designed, require their own mechanisms for the management of dynamically allocated storage. There is a need for packages which manage their own internal structures to control their deallocation as well, due to the performance implications of garbage collection by the KAPSE. This places a requirement upon the design of generic packages which manage generically structured private types built up from application-defined input types. These kinds of generic packages should figure greatly in the development of lower-level software such as operating systems, schedulers, controllers, and device drivers, and will manage structures such as queues, stacks, linked lists, files, and binary and multary (hierarchical) trees. These structures must be controlled to prevent inadvertent de-designation of dynamic elements, which is implicit in the assignment operation. A study was made of the use of limited private types in solving the problems of controlling the accumulation of anonymous, detached objects in running systems. The use of deallocator procedures for run-down of application-defined input types during deallocation operations is also discussed.

  17. When to Renew Software Licences at HPC Centres? A Mathematical Analysis

    Science.gov (United States)

    Baolai, Ge; MacIsaac, Allan B.

    2010-11-01

    In this paper we study a common problem faced by many high performance computing (HPC) centres: When and how to renew commercial software licences. Software vendors often sell perpetual licences along with forward update and support contracts at an additional, annual cost. Every year or so, software support personnel and the budget units of HPC centres are required to make the decision of whether or not to renew such support, and usually such decisions are made intuitively. The total cost of a continuing support contract can, however, be considerable. One might therefore want a rational answer to the question of whether the option for a renewal should be exercised and when. In an attempt to study this problem within a market framework, we present the mathematical problem derived for the day-to-day operation of a hypothetical HPC centre that charges for the use of software packages. In the mathematical model, we assume that the uncertainty comes from the demand (the number of users using the packages) as well as the price. Further, we assume that the availability of up-to-date software versions may also affect the demand. We develop a renewal strategy that aims to maximize the expected profit from the use of the software under consideration. The derived problem involves a decision tree, which constitutes a numerical procedure that can be processed in parallel.

  18. When to Renew Software Licences at HPC Centres? A Mathematical Analysis

    International Nuclear Information System (INIS)

    Baolai, Ge; MacIsaac, Allan B

    2010-01-01

    In this paper we study a common problem faced by many high performance computing (HPC) centres: When and how to renew commercial software licences. Software vendors often sell perpetual licences along with forward update and support contracts at an additional, annual cost. Every year or so, software support personnel and the budget units of HPC centres are required to make the decision of whether or not to renew such support, and usually such decisions are made intuitively. The total cost of a continuing support contract can, however, be considerable. One might therefore want a rational answer to the question of whether the option for a renewal should be exercised and when. In an attempt to study this problem within a market framework, we present the mathematical problem derived for the day-to-day operation of a hypothetical HPC centre that charges for the use of software packages. In the mathematical model, we assume that the uncertainty comes from the demand (the number of users using the packages) as well as the price. Further, we assume that the availability of up-to-date software versions may also affect the demand. We develop a renewal strategy that aims to maximize the expected profit from the use of the software under consideration. The derived problem involves a decision tree, which constitutes a numerical procedure that can be processed in parallel.
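
    A drastically simplified toy version of the renew-or-not comparison can be written as a Monte Carlo expected-profit calculation. The Python sketch below compares expected profit with and without paying the support fee under uncertain demand; every number, the Poisson demand model, and the assumed usage drop for stale software are hypothetical and do not reproduce the paper's decision-tree model.

      import numpy as np

      rng = np.random.default_rng(7)
      n_scenarios = 100_000

      renewal_cost = 20_000.0                  # annual support/update contract (hypothetical)
      price_per_user = 250.0                   # what the centre charges per user-year (hypothetical)
      mean_users = 120.0

      users_if_renewed = rng.poisson(mean_users, n_scenarios)
      # Hypothetical assumption: software without updates loses 30% of its users.
      users_if_not = rng.poisson(0.7 * mean_users, n_scenarios)

      profit_renew = price_per_user * users_if_renewed - renewal_cost
      profit_skip = price_per_user * users_if_not

      print(f"E[profit | renew] = {profit_renew.mean():,.0f}")
      print(f"E[profit | skip ] = {profit_skip.mean():,.0f}")
      print("decision:", "renew" if profit_renew.mean() > profit_skip.mean() else "skip")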

  19. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: (1) inadequate shape parameterization algorithms, and (2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  20. Strategies employed for LHC software performance studies

    CERN Document Server

    Nowak, A

    2010-01-01

    The objective of this work is to collect and assess the software performance related strategies employed by the major players in the LHC software arena: the four main experiments (ALICE, ATLAS, CMS and LHCb) and the two main software frameworks (Geant4 and ROOT). As the software used differs between the parties, so do the directions and methods in optimization, and their intensity. The common feeling shared by nearly all interviewed parties is that performance is not one of their top priorities and that maintaining it at a constant level is a satisfactory solution, given the resources at hand. In principle, despite some organized efforts, a less structured approach seems to be the dominant one, and opportunistic optimization prevails. Four out of six surveyed groups are investigating memory management related effects, deemed to be the primary cause of their performance issues. The most commonly used tools include Valgrind and homegrown software. All questioned groups expressed the desire for advanced tools, s...

  1. CAMAC Software for TJ-I and TJ-IU

    Energy Technology Data Exchange (ETDEWEB)

    Milligen, B Ph. van

    1993-07-01

    A user-friendly software package for control of CAMAC data acquisition modules for the TJ-I and TJ-IU experiments at the Asociación CIEMAT para Fusión has been developed. The CAMAC control software operates in synchronisation with the pre-existing VME-based data acquisition system. The control software controls the setup of the CAMAC modules and manages the data flow from the taking to the storage of data. Data file management is performed largely automatically. Further, user software is provided for viewing and analysing the data. (Author) 9 refs.

  2. CAMAC Software for TJ-I and TJ-IU

    International Nuclear Information System (INIS)

    Milligen, B. Ph. van

    1994-01-01

    A user-friendly software package for control of CAMAC data acquisition modules for the TJ-I and TJ-IU experiments at the Asociación CIEMAT para Fusión has been developed. The CAMAC control software operates in synchronisation with the pre-existing VME-based data acquisition system. The control software controls the setup of the CAMAC modules and manages the data flow from the taking to the storage of data. Data file management is performed largely automatically. Further, user software is provided for viewing and analysing the data. (Author) 9 refs.

  3. Development of a software system for spatial resolved trace analysis of high performance materials with SIMS

    International Nuclear Information System (INIS)

    Brunner, Ch. H.

    1997-09-01

    The following work is separated into two distinctly different parts. The first deals with the SIMSScan software project, an application system for secondary ion mass spectrometry. This application system primarily lays down the foundation for the research activity introduced in the second part of this work. SIMSScan is an application system designed to provide data acquisition routines for different requirements in the field of secondary ion mass spectrometry. The whole application package is divided into three major sections, each one dealing with specific measurement tasks. Various supporting clients and wizards, providing extended functionality to the main application, build the core of the software. The MassScan as well as the DepthScan module incorporate the SIMS in the direct imaging or stigmatic mode and feature the capabilities for mass spectra recording or depth profile analysis. In combination with an image recording facility the DepthScan module features the capability of spatially resolved material analysis (3D SIMS). The RasterScan module incorporates the SIMS in scanning mode and supports a fiber-optic link for optimized data transfer. The primary goal of this work is to introduce the basic ideas behind the implementation of the main application modules and the supporting clients. Furthermore, it is the intention to lay down the foundation for further developments. At the beginning a short introduction into the paradigm of object-oriented programming as well as Windows (TM) programming is given. Besides explaining the basic ideas behind the Doc/View application architecture, the focus is mainly shifted to the routines controlling the SIMS hardware and the basic concepts of multithreaded programming. The elementary structure of the view and document objects is discussed in detail only for the MassScan module, because the ideas behind data abstraction and encapsulation are quite similar. The second part introduces the research activities

  4. Software Application Profile: PHESANT: a tool for performing automated phenome scans in UK Biobank.

    Science.gov (United States)

    Millard, Louise A C; Davies, Neil M; Gaunt, Tom R; Davey Smith, George; Tilling, Kate

    2017-10-05

    Epidemiological cohorts typically contain a diverse set of phenotypes such that automation of phenome scans is non-trivial, because they require highly heterogeneous models. For this reason, phenome scans have to date tended to use a smaller homogeneous set of phenotypes that can be analysed in a consistent fashion. We present PHESANT (PHEnome Scan ANalysis Tool), a software package for performing comprehensive phenome scans in UK Biobank. PHESANT tests the association of a specified trait with all continuous, integer and categorical variables in UK Biobank, or a specified subset. PHESANT uses a novel rule-based algorithm to determine how to appropriately test each trait, then performs the analyses and produces plots and summary tables. The PHESANT phenome scan is implemented in R. PHESANT includes a novel Javascript D3.js visualization and accompanying Java code that converts the phenome scan results to the required JavaScript Object Notation (JSON) format. PHESANT is available on GitHub at [https://github.com/MRCIEU/PHESANT]. Git tag v0.5 corresponds to the version presented here. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
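
    As a purely illustrative analogue of the rule-based dispatch described above (and not PHESANT's actual R code), the following Python sketch chooses a statistical test from a variable's type; the rules, function name and thresholds are assumptions made for illustration only.

      import numpy as np
      from scipy import stats

      def test_association(trait, variable):
          """Pick a test from the variable's type; illustrative rules, not PHESANT's."""
          values = np.asarray(variable, dtype=float)
          levels = np.unique(values[~np.isnan(values)])
          if len(levels) < 2:
              return "skipped", np.nan
          if len(levels) == 2:  # binary variable: compare the trait between the two groups
              a, b = trait[values == levels[0]], trait[values == levels[1]]
              return "t-test", stats.ttest_ind(a, b).pvalue
          if np.all(levels == levels.astype(int)) and len(levels) < 20:  # few integer levels
              groups = [trait[values == lev] for lev in levels]
              return "one-way ANOVA", stats.f_oneway(*groups).pvalue
          return "linear regression", stats.linregress(trait, values).pvalue  # continuous

      rng = np.random.default_rng(0)
      trait = rng.normal(size=500)
      print(test_association(trait, rng.integers(0, 2, size=500)))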

  5. iCosmo: an interactive cosmology package

    Science.gov (United States)

    Refregier, A.; Amara, A.; Kitching, T. D.; Rassat, A.

    2011-04-01

    Aims: The interactive software package iCosmo, designed to perform cosmological calculations, is described. Methods: iCosmo is a software package to perform interactive cosmological calculations for the low-redshift universe. Computing distance measures, the matter power spectrum, and the growth factor is supported for any values of the cosmological parameters. It also computes derived observed quantities for several cosmological probes such as cosmic shear, baryon acoustic oscillations, and type Ia supernovae. The associated errors for these observable quantities can be derived for customised surveys, or for pre-set values corresponding to current or planned instruments. The code also allows for calculation of cosmological forecasts with Fisher matrices, which can be manipulated to combine different surveys and cosmological probes. The code is written in the IDL language and thus benefits from the convenient interactive features and scientific libraries available in this language. iCosmo can also be used as an engine to perform cosmological calculations in batch mode, and forms a convenient adaptive platform for the development of further cosmological modules. With its extensive documentation, it may also serve as a useful resource for teaching and for newcomers to the field of cosmology. Results: The iCosmo package is described with a number of examples and command sequences. The code is freely available with documentation at http://www.icosmo.org, along with an interactive web interface and is part of the Initiative for Cosmology, a common archive for cosmological resources.
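
    As an illustration of the kind of low-redshift calculation such a package performs, the following Python sketch computes the line-of-sight comoving distance for a flat Lambda-CDM cosmology; the parameter values are arbitrary examples, not iCosmo defaults, and the sketch is not taken from the iCosmo code.

      import numpy as np
      from scipy.integrate import quad

      C_KM_S = 299792.458  # speed of light [km/s]

      def comoving_distance(z, h0=70.0, omega_m=0.3, omega_l=0.7):
          """Line-of-sight comoving distance [Mpc] for a flat Lambda-CDM cosmology."""
          def inv_e(zp):
              return 1.0 / np.sqrt(omega_m * (1.0 + zp) ** 3 + omega_l)
          integral, _ = quad(inv_e, 0.0, z)
          return (C_KM_S / h0) * integral

      print(comoving_distance(1.0))  # ~3300 Mpc for these example parameters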

  6. The portability of the "Electronics Workbench" simulation software to China

    NARCIS (Netherlands)

    Collis, Betty; Zhi-Cheng, Dong

    1993-01-01

    This article discusses the portability of the Canadian-made simulation software package "Electronics Workbench" (EWB) to China. As part of a larger project investigating the portability of various educational software packages, the EWB package was used in electronics instruction in China and

  7. Optical Thermal Characterization Enables High-Performance Electronics Applications

    Energy Technology Data Exchange (ETDEWEB)

    2016-02-01

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.

  8. Data processing software for purex plant process control laboratory

    International Nuclear Information System (INIS)

    Kansara, V.P.; Achuthan, P.V.; Sridhar, S.; Ramanujam, A.; Dhumwad, R.K.

    1990-01-01

    A software package has been developed at the Fuel Reprocessing Division, Trombay to meet the data processing needs of the Control Laboratory of a reprocessing plant. During normal plant operations the contents of over one hundred process tanks have to be sampled and analysed for regular monitoring. The software was developed in order to speed up the computation and the reporting of results, as well as to obtain process performance data over a period of time. The package has been successfully demonstrated and implemented at the Plutonium Plant, Trombay, and has been in continuous use since May 1987 with highly satisfactory performance. The software is a totally menu-driven package which can be used by the laboratory analysts with a few hours of training. The features include data validation involving source tank identification, the nature of the sample, the range of expected results, any duplication in sample numbering etc. Audio indication of deviations from the expected input or output values is given, with an option to override in case of abnormal samples. The progress of analysis can be obtained for a given sample at any given time. Incorporated in the software is a help menu for quick reference to the analytical protocol to be followed for a given tank/method. The computations for the determinations are carried out after obtaining input values on a screen form. The results can be displayed on the monitor or obtained in the form of a hard copy in any desired format. (author). 17 figs., 2 refs

  9. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual

  10. Gamma-Ray Spectrum Analysis Software GDA

    International Nuclear Information System (INIS)

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as a software package, version 1.02, named GDA, which is an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch setup diskettes and a user's manual. GDA can be installed for use on a personal computer with the Windows 95 or Windows NT 4.0 operating system. An 80486-class computer with 8 megabytes of memory is sufficient
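
    The internals of GDA are not described in the abstract; purely as a generic illustration of the kind of peak analysis such gamma-ray spectrum software performs, the following Python sketch fits a Gaussian peak on a linear background to a synthetic region of interest (the model and all values are illustrative assumptions, not GDA's algorithm).

      import numpy as np
      from scipy.optimize import curve_fit

      def peak_model(ch, area, centroid, sigma, bg0, bg1):
          """Gaussian peak on a linear background (counts per channel)."""
          gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
          return gauss + bg0 + bg1 * ch

      # Synthetic region of interest around a single peak
      channels = np.arange(100, 160, dtype=float)
      truth = peak_model(channels, area=5000, centroid=130, sigma=3.0, bg0=40, bg1=0.1)
      counts = np.random.poisson(truth).astype(float)

      p0 = [counts.sum(), channels[np.argmax(counts)], 2.0, counts.min(), 0.0]
      popt, pcov = curve_fit(peak_model, channels, counts, p0=p0)
      print("fitted centroid:", popt[1], "fitted area:", popt[0])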

  11. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    Science.gov (United States)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the

  12. Advanced organics for electronic substrates and packages

    CERN Document Server

    Fletcher, Andrew E

    1992-01-01

    Advanced Organics for Electronic Substrates and Packages provides information on packaging, which is one of the most technologically intensive activities in the electronics industry. The electronics packaging community has realized that while semiconductor devices continue to be improved upon for performance, cost, and reliability, it is the interconnection or packaging of these devices that will limit the performance of the systems. Technology must develop packaging for transistor chips, with high levels of performance and integration providing cooling, power, and interconnection, and yet pre

  13. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden ''backdoors'' is crucial to a project's success. In this context, ''authentication'' is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects
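
    ROSE itself is a C/C++ compiler infrastructure; purely as a toy analogue of an automated scan that highlights suspicious code constructs for visual inspection, the following Python sketch walks a source file's abstract syntax tree and flags calls from a small rule set (the rules and file handling are illustrative and have nothing to do with ROSE's implementation).

      import ast
      import sys

      SUSPICIOUS_CALLS = {"eval", "exec", "system", "popen"}  # illustrative rule set

      def scan(path):
          """Report call sites whose callee name appears in the rule set."""
          tree = ast.parse(open(path).read(), filename=path)
          for node in ast.walk(tree):
              if isinstance(node, ast.Call):
                  name = getattr(node.func, "id", getattr(node.func, "attr", None))
                  if name in SUSPICIOUS_CALLS:
                      print(f"{path}:{node.lineno}: suspicious call to {name}()")

      if __name__ == "__main__":
          for source_file in sys.argv[1:]:
              scan(source_file)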

  14. Thermal-hydraulic software development for nuclear waste transportation cask design and analysis

    International Nuclear Information System (INIS)

    Brown, N.N.; Burns, S.P.; Gianoulakis, S.E.; Klein, D.E.

    1991-01-01

    This paper describes the development of a state-of-the-art thermal-hydraulic software package intended for spent fuel and high-level nuclear waste transportation cask design and analysis. The objectives of this software development effort are threefold: (1) to take advantage of advancements in computer hardware and software to provide a more efficient user interface, (2) to provide a tool for reducing inefficient conservatism in spent fuel and high-level waste shipping cask design by including convection as well as conduction and radiation heat transfer modeling capabilities, and (3) to provide a thermal-hydraulic analysis package which is developed under a rigorous quality assurance program established at Sandia National Laboratories. 20 refs., 5 figs., 2 tabs

  15. The evolution of CMS software performance studies

    CERN Document Server

    Kortelainen, Matti J

    2010-01-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  16. The evolution of CMS software performance studies

    Science.gov (United States)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  17. Daylighting simulation : comparison of softwares for architect's utilization

    Energy Technology Data Exchange (ETDEWEB)

    Christakou, D.E.; Amorim, C.N.D. [Brazil Univ., Brasilia (Brazil). Faculty of Architecture and Urbanism

    2005-07-01

    This study analyzed and compared 4 daylighting software packages to determine the primary benefits and limits of each one, while considering the priorities for the use of the software by architects. The complex task of daylight simulation is an important step in designing buildings, particularly when the main objective is comfort and energy conservation. Simulation is not yet commonly practiced by professional architects because of the complexities of various software packages, the lack of user-friendly interfaces and difficulty in interpreting results. The 4 software packages that were evaluated in this study were: (1) Desktop Radiance, (2) Rayfront, (3) Relux 2004 Vision, and (4) Lightscape. Criteria such as interfaces, flexibility, and help manuals were also analyzed in an effort to establish a framework of the main points to be considered when choosing daylighting software for architectural use, both in educational and office environments. Simulations of a test room were performed in which some parameters were modified to verify the performance of the following main criteria: flexibility in adapting to the architect's workflow; the use of state-of-the-art algorithms; numerical precision; and accessibility to Brazilian architects. The results demonstrate the potential for improvement of the software, particularly in terms of user interfaces and help manuals. The study showed that Relux 2004 Vision is the most adequate for the architect's use. Rayfront and Desktop Radiance presented more difficulties in the design process, but Desktop Radiance had the advantage of being embedded in AutoCAD, a well-known interface. Lightscape had a user-friendly interface but was not as intuitive as Relux. It was concluded that the ideal daylighting simulation software does not yet exist. The ideal software should integrate diverse factors and combine editing and modeling tools beyond luminous evaluation and the thermal consequences of daylight use. 5 refs., 3 tabs., 4 figs.

  18. Hanford high-level waste melter system evaluation data packages

    International Nuclear Information System (INIS)

    Elliott, M.L.; Shafer, P.J.; Lamar, D.A.; Merrill, R.A.; Grunewald, W.; Roth, G.; Tobie, W.

    1996-03-01

    The Tank Waste Remediation System is selecting a reference melter system for the Hanford High-Level Waste vitrification plant. A melter evaluation was conducted in FY 1994 to narrow down the long list of potential melter technologies to a few for testing. A formal evaluation was performed by a Melter Selection Working Group (MSWG), which met in June and August 1994. At the June meeting, MSWG evaluated 15 technologies and selected six for more thorough evaluation at the August meeting. All six were variations of joule-heated or induction-heated melters. Between the June and August meetings, Hanford site staff and consultants compiled data packages for each of the six melter technologies as well as variants of the baseline technologies. Information was solicited from melter candidate vendors to supplement existing information. This document contains the data packages compiled to provide background information to MSWG in support of the evaluation of the six technologies. (A separate evaluation was performed by Fluor Daniel, Inc. to identify balance-of-plant impacts if a given melter system was selected.)

  19. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.
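
    The hardware-control side of the system is specific to the equipment listed above; the core numerical step, the 2-dimensional spatial autocorrelation of a film subregion, can be sketched in Python as follows (a minimal FFT-based version with synthetic data, not the original implementation).

      import numpy as np

      def spatial_autocorrelation(subregion):
          """2-D autocorrelation of an image subregion via the Wiener-Khinchin theorem."""
          tile = subregion - subregion.mean()          # remove the DC component
          spectrum = np.fft.fft2(tile)
          corr = np.fft.ifft2(spectrum * np.conj(spectrum)).real
          return np.fft.fftshift(corr)                 # put zero lag at the centre

      # Synthetic double-exposure image with a known displacement of (3, 5) pixels
      rng = np.random.default_rng(0)
      frame = rng.random((64, 64))
      double_exposure = frame + np.roll(frame, (3, 5), axis=(0, 1))

      corr = spatial_autocorrelation(double_exposure)
      centre = np.array(corr.shape) // 2
      corr[centre[0] - 1:centre[0] + 2, centre[1] - 1:centre[1] + 2] = -np.inf  # suppress zero lag
      peak = np.unravel_index(np.argmax(corr), corr.shape)
      # Sign ambiguity (+d or -d) is inherent to autocorrelation-based PIV
      print("displacement estimate:", np.array(peak) - centre)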

  20. Dose - a software package for the calculation of integrated exposure resulting from an accident in a nuclear power plant

    International Nuclear Information System (INIS)

    Doron, E.; Ohaion, H.; Asculai, E.

    1985-05-01

    A software package intended for the assessment of risks resulting from accidental release of radioactive materials from a nuclear power plant is presented. The models, and the various programs based on them, are described. The work includes detailed operating instructions for the various programs, as well as instructions for the preparation of the necessary input data. Various options are described for additions and changes to the programs with the aim of extending their usefulness to more general cases with respect to meteorology and pollution sources. Finally, a sample calculation that enables the user to test the proper functioning of the whole package, as well as his own proficiency in its use, is given. (author)

  1. Testing on a Large Scale Running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes

    CERN Document Server

    Burckhart-Chromek, Doris; Adragna, P; Alexandrov, L; Amorim, A; Armstrong, S; Badescu, E; Baines, J T M; Barros, N; Beck, H P; Bee, C; Blair, R; Bogaerts, J A C; Bold, T; Bosman, M; Caprini, M; Caramarcu, C; Ciobotaru, M; Comune, G; Corso-Radu, A; Cranfield, R; Crone, G; Dawson, J; Della Pietra, M; Di Mattia, A; Dobinson, Robert W; Dobson, M; Dos Anjos, A; Dotti, A; Drake, G; Ellis, Nick; Ermoline, Y; Ertorer, E; Falciano, S; Ferrari, R; Ferrer, M L; Francis, D; Gadomski, S; Gameiro, S; Garitaonandia, H; Gaudio, G; George, S; Gesualdi-Mello, A; Gorini, B; Green, B; Haas, S; Haberichter, W N; Hadavand, H; Haeberli, C; Haller, J; Hansen, J; Hauser, R; Hillier, S J; Höcker, A; Hughes-Jones, R E; Joos, M; Kazarov, A; Kieft, G; Klous, S; Kohno, T; Kolos, S; Korcyl, K; Kordas, K; Kotov, V; Kugel, A; Landon, M; Lankford, A; Leahu, L; Leahu, M; Lehmann-Miotto, G; Le Vine, M J; Liu, W; Maeno, T; Männer, R; Mapelli, L; Martin, B; Masik, J; McLaren, R; Meessen, C; Meirosu, C; Mineev, M; Misiejuk, A; Morettini, P; Mornacchi, G; Müller, M; Garcia-Murillo, R; Nagasaka, Y; Negri, A; Padilla, C; Pasqualucci, E; Pauly, T; Perera, V; Petersen, J; Pope, B; Albuquerque-Portes, M; Pretzl, K; Prigent, D; Roda, C; Ryabov, Yu; Salvatore, D; Schiavi, C; Schlereth, J L; Scholtes, I; Sole-Segura, E; Seixas, M; Sloper, J; Soloviev, I; Spiwoks, R; Stamen, R; Stancu, S; Strong, S; Sushkov, S; Szymocha, T; Tapprogge, S; Teixeira-Dias, P; Torres, R; Touchard, F; Tremblet, L; Ünel, G; Van Wasen, J; Vandelli, W; Vaz-Gil-Lopes, L; Vermeulen, J C; von der Schmitt, H; Wengler, T; Werner, P; Wheeler, S; Wickens, F; Wiedenmann, W; Wiesmann, M; Wu, X; Yasu, Y; Yu, M; Zema, F; Zobernig, H; Computing In High Energy and Nuclear Physics

    2006-01-01

    The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second level trigger and event filter operations. This high number of PCs will only be purchased before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run online functionality tests in July 2005 over a period of 5 weeks on a stepwise increasing farm size from 100 up to 700 dual PC nodes. The interplay between the control and monitoring software and the event readout, event building and trigger software has been exercised for the first time as an integrated system on this large scale. It was also new to run algorithms in the online environment for the trigger selection and in the event filter processing tasks on a larger scale. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently via peer-to-peer software to this large PC cluster. T...

  2. Testing on a Large Scale running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes

    CERN Document Server

    Burckhart-Chromek, Doris; Adragna, P; Albuquerque-Portes, M; Alexandrov, L; Amorim, A; Armstrong, S; Badescu, E; Baines, J T M; Barros, N; Beck, H P; Bee, C; Blair, R; Bogaerts, J A C; Bold, T; Bosman, M; Caprini, M; Caramarcu, C; Ciobotaru, M; Comune, G; Corso-Radu, A; Cranfield, R; Crone, G; Dawson, J; Della Pietra, M; Di Mattia, A; Dobinson, Robert W; Dobson, M; Dos Anjos, A; Dotti, A; Drake, G; Ellis, Nick; Ermoline, Y; Ertorer, E; Falciano, S; Ferrari, R; Ferrer, M L; Francis, D; Gadomski, S; Gameiro, S; Garcia-Murillo, R; Garitaonandia, H; Gaudio, G; George, S; Gesualdi-Mello, A; Gorini, B; Green, B; Haas, S; Haberichter, W N; Hadavand, H; Haeberli, C; Haller, J; Hansen, J; Hauser, R; Hillier, S J; Hughes-Jones, R E; Höcker, A; Joos, M; Kazarov, A; Kieft, G; Klous, S; Kohno, T; Kolos, S; Korcyl, K; Kordas, K; Kotov, V; Kugel, A; Landon, M; Lankford, A; Le Vine, M J; Leahu, L; Leahu, M; Lehmann-Miotto, G; Liu, W; Maeno, T; Mapelli, L; Martin, B; Masik, J; McLaren, R; Meessen, C; Meirosu, C; Mineev, M; Misiejuk, A; Morettini, P; Mornacchi, G; Männer, R; Müller, M; Nagasaka, Y; Negri, A; Padilla, C; Pasqualucci, E; Pauly, T; Perera, V; Petersen, J; Pope, B; Pretzl, K; Prigent, D; Roda, C; Ryabov, Yu; Salvatore, D; Schiavi, C; Schlereth, J L; Scholtes, I; Seixas, M; Sloper, J; Sole-Segura, E; Soloviev, I; Spiwoks, R; Stamen, R; Stancu, S; Strong, S; Sushkov, S; Szymocha, T; Tapprogge, S; Teixeira-Dias, P; Torres, R; Touchard, F; Tremblet, L; Van Wasen, J; Vandelli, W; Vaz-Gil-Lopes, L; Vermeulen, J C; Wengler, T; Werner, P; Wheeler, S; Wickens, F; Wiedenmann, W; Wiesmann, M; Wu, X; Yasu, Y; Yu, M; Zema, F; Zobernig, H; von der Schmitt, H; Ünel, G; Computing In High Energy and Nuclear Physics

    2006-01-01

    The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second level trigger and event filter operations. This high number of PCs will only be purchased before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run online functionality tests in July 2005 over a period of 5 weeks on a stepwise increasing farm size from 100 up to 700 dual PC nodes. The interplay between the control and monitoring software and the event readout, event building and trigger software has been exercised for the first time as an integrated system on this large scale. It was also new to run algorithms in the online environment for the trigger selection and in the event filter processing tasks on a larger scale. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently via peer-to-peer software to this large PC cluster. T...

  3. SIMSAS - a window based software package for simulation and analysis of multiple small-angle scattering data

    International Nuclear Information System (INIS)

    Jayaswal, B.; Mazumder, S.

    1998-09-01

    Small-angle scattering data from strong scattering systems, e.g. porous materials, cannot be analysed by invoking the single scattering approximation, as specimens needed to replicate the bulk matrix in its essential properties are too thick for the approximation to be valid. The presence of multiple scattering is indicated by the breakdown of the functional invariance of the observed scattering profile under variation of sample thickness and/or wavelength of the probing radiation. This article delineates how failing to account for multiple scattering affects the results of analysis, and then how to correct the data for its effect. It deals with an algorithm to extract the single scattering profile from small-angle scattering data affected by multiple scattering. The algorithm can process the scattering data and deduce the single scattering profile in absolute scale. A software package, SIMSAS, is introduced for executing this inversion step. This package is useful both to simulate and to analyse multiple small-angle scattering data. (author)

  4. A software program for exchanging MR data

    DEFF Research Database (Denmark)

    Ring, P B; Jensen, J A; Henriksen, O

    1993-01-01

    of digital MR images of the human brain. Because there was no common data format, a software package was developed for data exchange. This article describes the basic features of the developed software. The software package was written in the C language and was successfully tested on an IBM-6150 UNIX...... workstation. The software is currently being tested on the following series of UNIX workstations: SUN SPARC, IBM RS6000, and HP 9000/700....

  5. Nuclear-waste-package program for high-level isolation in Nevada tuff

    International Nuclear Information System (INIS)

    Rothman, A.J.

    1982-01-01

    The objective of the waste package program is to insure that a package is designed suitable for a repository in tuff that meets performance requirements of the NRC. In brief, the current (draft) regulation requires that the radionuclides be contained in the engineered system for 1000 years, and that, thereafter, no more than one part in 10⁵ of the nuclides per year leave the boundary of the system. Studies completed as of this writing are thermal modeling of waste packages in a tuff repository and analysis of sodium bentonite as a potential backfill material. Both studies will be presented. Thermal calculations coupled with analysis of the geochemical literature on bentonite indicate that extensive chemical and physical alteration of bentonite would result at the high power densities proposed (ca. 2 kW/package and an area density of 25 W/m²), in part due to compacted bentonite's relatively low thermal conductivity when dehydrated (approx. 0.6 ± 0.2 W/m·°C). Because our groundwater contains K⁺, an upper hydrothermal temperature limit appears to be 120 to 150°C. At much lower power densities (less than 1 kW per package and an areal density of 12 W/m²), bentonite may be suitable

  6. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  7. METEOR v1.0 - Design and structure of the software package; METEOR v1.0 - Estructura y modulos informaticos

    Energy Technology Data Exchange (ETDEWEB)

    Palomo, E.

    1994-07-01

    This document describes the structure and the separate modules of the software package METEOR for the statistical analysis of meteorological data series. It contains a systematic description of the subroutines of METEOR and also of the required format for input and output files. The original version of METEOR was developed by Dr. Elena Palomo, CIEMAT-IER, GIMASE. It is built by linking programs and routines written in FORTRAN 77 and adds the graphical capabilities of GNUPLOT. The toolbox was designed following the criteria of modularity, flexibility and agility. All the input, output and analysis options are structured in three main menus: i) the first evaluates the quality of the data set; ii) the second pre-processes the data; and iii) the third performs the statistical analyses and creates the graphical outputs. The documentation of METEOR consists of three documents written in Spanish: 1) METEOR v1.0: User's guide; 2) METEOR v1.0: A usage example; 3) METEOR v1.0: Design and structure of the software package. (Author)

  8. Effect of Functional diversity on Software Performance

    OpenAIRE

    Viswanatha Rao, Balajee

    2011-01-01

    For the past few decades, a substantial body of literature has been produced on functional diversity and performance. However, the relationship between functional diversity and performance in the software industry has not been clearly explained, and results have been found to be inconsistent. The main focus of this research is to explore the effects of functional diversity on software project performance by conducting a qualitative study. Four metrics were chosen from the literature, namely decision making, creativity an...

  9. Data simulation in machine olfaction with the R package chemosensors.

    Directory of Open Access Journals (Sweden)

    Andrey Ziyatdinov

    Full Text Available In machine olfaction, the design of applications based on gas sensor arrays is highly dependent on the robustness of the signal and data processing algorithms. While the practice of testing the algorithms on public benchmarks is not common in the field, we propose software for performing data simulations in the machine olfaction field by generating parameterized sensor array data. The software is implemented as an R language package chemosensors which is open-access, platform-independent and self-contained. We introduce the concept of a virtual sensor array which can be used as a data generation tool. In this work, we describe the data simulation workflow which basically consists of scenario definition, virtual array parameterization and the generation of sensor array data. We also give examples of the processing of the simulated data as proof of concept for the parameterized sensor array data: the benchmarking of classification algorithms, the evaluation of linear- and non-linear regression algorithms, and the biologically inspired processing of sensor array data. All the results presented were obtained under version 0.7.6 of the chemosensors package whose home page is chemosensors.r-forge.r-project.org.
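
    The chemosensors package itself is written in R; as a hedged Python analogue of the workflow described above (scenario definition, virtual array parameterization, data generation), one might write something like the following, where the linear response model and noise level are illustrative assumptions rather than the package's actual sensor model.

      import numpy as np

      rng = np.random.default_rng(42)

      # Scenario: concentration profiles for two analytes over a sequence of samples
      scenario = np.array([[0.0, 0.1], [0.5, 0.0], [0.5, 0.1], [1.0, 0.0]])  # [n_samples, n_analytes]

      # Virtual array parameterization: per-sensor sensitivities and a noise level
      n_sensors = 8
      sensitivity = rng.uniform(0.5, 2.0, size=(2, n_sensors))  # [n_analytes, n_sensors]
      noise_sd = 0.02

      def generate(scenario, sensitivity, noise_sd):
          """Generate synthetic sensor-array responses (linear model plus Gaussian noise)."""
          clean = scenario @ sensitivity
          return clean + rng.normal(scale=noise_sd, size=clean.shape)

      data = generate(scenario, sensitivity, noise_sd)
      print(data.shape)  # (4 samples, 8 sensors)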

  10. A NEW EXHAUST VENTILATION SYSTEM DESIGN SOFTWARE

    Directory of Open Access Journals (Sweden)

    H. Asilian Mahabady

    2007-09-01

    Full Text Available A Microsoft Windows-based ventilation software package is developed to reduce the time-consuming and tedious procedure of exhaust ventilation system design. This program assures accurate and reliable air pollution control related calculations. Herein, the package is tentatively named Exhaust Ventilation Design Software, which was developed in the VB6 programming environment. The most important features of Exhaust Ventilation Design Software that are ignored in formerly developed packages are collector design and fan dimension data calculations. Automatic system balancing is another feature of this package. The Exhaust Ventilation Design Software algorithm for design is based on two methods: balance by design (static pressure balance) and design by blast gate. The most important section of the software is a spreadsheet that is designed based on American Conference of Governmental Industrial Hygienists calculation sheets. Exhaust Ventilation Design Software is developed so that engineers familiar with American Conference of Governmental Industrial Hygienists datasheets can easily employ it for ventilation system design. Other sections include a collector design section (settling chamber, cyclone, and packed tower), a fan geometry and dimension data section, a unit converter section (that helps engineers to deal with units), a hood design section and a Persian HTML help. Psychrometric correction is also considered in Exhaust Ventilation Design Software. In the Exhaust Ventilation Design Software design process, efforts are focused on improving the GUI (graphical user interface) and the use of programming standards in software design. The reliability of the software has been evaluated and the results show acceptable accuracy.
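
    As a brief illustration of the duct calculations that underlie such design spreadsheets, the following Python sketch computes duct velocity, velocity pressure (using the conventional US-unit relation VP = (V/4005)^2 for standard air) and a simple hood static pressure; the flow rate, duct diameter and hood entry-loss factor are illustrative values, and the sketch is not part of the package described above.

      import math

      def duct_velocity(flow_cfm, diameter_in):
          """Average duct velocity [fpm] from volumetric flow [cfm] and duct diameter [in]."""
          area_ft2 = math.pi * (diameter_in / 12.0) ** 2 / 4.0
          return flow_cfm / area_ft2

      def velocity_pressure(velocity_fpm):
          """Velocity pressure [in. w.g.] for standard air: VP = (V / 4005)**2."""
          return (velocity_fpm / 4005.0) ** 2

      flow, diameter = 800.0, 6.0          # example branch: 800 cfm through a 6-inch duct
      v = duct_velocity(flow, diameter)
      vp = velocity_pressure(v)
      hood_entry_loss_factor = 0.25        # illustrative value; depends on hood geometry
      hood_static_pressure = (1.0 + hood_entry_loss_factor) * vp
      print(round(v), round(vp, 2), round(hood_static_pressure, 2))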

  11. High-activity liquid packaging design criteria

    International Nuclear Information System (INIS)

    1994-05-01

    In recent studies, it has been acknowledged that there is an emerging need for packaging to transport high-activity liquid off the Hanford Site to support characterization and process development activities of liquid waste stored in underground tanks. These studies have dealt with specimen testing needs primarily at the Hanford Site; however, similar needs appear to be developing at other US Department of Energy (DOE) sites. The need to ship single and multiple specimens to offsite laboratories is anticipated because it is predicted that onsite laboratories will be overwhelmed by an increasing number and size (volume) of samples. Potentially, the specimen size could range from 250 mL to greater than 50 L. Presently, no certified Type-B packagings are available for transport of high-activity liquid radioactive specimens in sizes to support Site missions

  12. Software Design Document for the AMP Nuclear Fuel Performance Code

    International Nuclear Information System (INIS)

    Philip, Bobby; Clarno, Kevin T.; Cochran, Bill

    2010-01-01

    The purpose of this document is to describe the design of the AMP nuclear fuel performance code. It provides an overview of the decomposition into separable components, an overview of what those components will do, and the strategic basis for the design. The primary components of a computational physics code include a user interface, physics packages, material properties, mathematics solvers, and computational infrastructure. Some capability from established off-the-shelf (OTS) packages will be leveraged in the development of AMP, but the primary physics components will be entirely new. The material properties required by these physics operators include many highly non-linear properties, which will be replicated from FRAPCON and LIFE where applicable, as well as some computationally-intensive operations, such as gap conductance, which depends upon the plenum pressure. Because there is extensive capability in off-the-shelf leadership class computational solvers, AMP will leverage the Trilinos, PETSc, and SUNDIALS packages. The computational infrastructure includes a build system, mesh database, and other building blocks of a computational physics package. The user interface will be developed through a collaborative effort with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Capability Transfer program element as much as possible and will be discussed in detail in a future document.

  13. MIDAS: Software for the detection and analysis of lunar impact flashes

    Science.gov (United States)

    Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús

    2015-06-01

    Since 2009 we have been running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon. For this purpose we employ small telescopes and high-sensitivity CCD video cameras. To automatically identify these events a software package called MIDAS was developed and tested. This package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. In addition, we have implemented in MIDAS a new method to establish the likely source of the meteoroids (known meteoroid stream or sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
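
    MIDAS's actual detection algorithm is not reproduced here; a minimal Python sketch of the general idea, flagging a transient brightening of a pixel relative to the previous video frame, could look like the following (the threshold rule and synthetic frames are illustrative assumptions).

      import numpy as np

      def detect_flashes(frames, n_sigma=5.0):
          """Yield (frame_index, row, col) for pixels that brighten sharply versus the previous frame."""
          prev = frames[0].astype(float)
          for idx, frame in enumerate(frames[1:], start=1):
              diff = frame.astype(float) - prev
              threshold = diff.mean() + n_sigma * diff.std()
              for r, c in zip(*np.nonzero(diff > threshold)):
                  yield idx, r, c
              prev = frame.astype(float)

      # Synthetic demonstration: a faint flash injected into frame 3
      rng = np.random.default_rng(1)
      frames = rng.normal(100, 2, size=(5, 32, 32))
      frames[3, 10, 20] += 50.0
      print(list(detect_flashes(frames)))  # expected to report the injected flash in frame 3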

  14. CheMentor Software System by H. A. Peoples

    Science.gov (United States)

    Reid, Brian P.

    1997-09-01

    CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.

  15. Constraint Network Analysis (CNA): a Python software package for efficiently linking biomacromolecular structure, flexibility, (thermo-)stability, and function.

    Science.gov (United States)

    Pfleger, Christopher; Rathi, Prakash Chandra; Klein, Doris L; Radestock, Sebastian; Gohlke, Holger

    2013-04-22

    For deriving maximal advantage from information on biomacromolecular flexibility and rigidity, results from rigidity analyses must be linked to biologically relevant characteristics of a structure. Here, we describe the Python-based software package Constraint Network Analysis (CNA) developed for this task. CNA functions as a front- and backend to the graph-based rigidity analysis software FIRST. CNA goes beyond the mere identification of flexible and rigid regions in a biomacromolecule in that it (I) provides a refined modeling of thermal unfolding simulations that also considers the temperature-dependence of hydrophobic tethers, (II) allows performing rigidity analyses on ensembles of network topologies, either generated from structural ensembles or by using the concept of fuzzy noncovalent constraints, and (III) computes a set of global and local indices for quantifying biomacromolecular stability. This leads to more robust results from rigidity analyses and extends the application domain of rigidity analyses in that phase transition points ("melting points") and unfolding nuclei ("structural weak spots") are determined automatically. Furthermore, CNA robustly handles small-molecule ligands in general. Such advancements are important for applying rigidity analysis to data-driven protein engineering and for estimating the influence of ligand molecules on biomacromolecular stability. CNA maintains the efficiency of FIRST such that the analysis of a single protein structure takes a few seconds for systems of several hundred residues on a single core. These features make CNA an interesting tool for linking biomacromolecular structure, flexibility, (thermo-)stability, and function. CNA is available from http://cpclab.uni-duesseldorf.de/software for nonprofit organizations.

  16. Data analysis for the LISA Technology Package

    Energy Technology Data Exchange (ETDEWEB)

    Hewitson, M; Danzmann, K; Diepholz, I; GarcIa, A [Albert-Einstein-Institut, Max-Planck-Institut fuer Gravitationsphysik und Universitaet Hannover, 30167 Hannover (Germany); Armano, M; Fauste, J [European Space Agency, ESAC, Villanueva de la Canada, 28692 Madrid (Spain); Benedetti, M [Dipartimento di Ingegneria dei Materiali e Tecnologie Industriali, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Bogenstahl, J [Department of Physics and Astronomy, University of Glasgow, Glasgow (United Kingdom); Bortoluzzi, D; Bosetti, P; Cristofolini, I [Dipartimento di Ingegneria Meccanica e Strutturale, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Brandt, N [Astrium GmbH, 88039 Friedrichshafen (Germany); Cavalleri, A; Ciani, G; Dolesi, R; Ferraioli, L [Dipartimento di Fisica, Universita di Trento and INFN, Gruppo Collegato di Trento, 38050 Povo, Trento (Italy); Cruise, M [Department of Physics and Astronomy, University of Birmingham, Birmingham (United Kingdom); Fertin, D; GarcIa, C [European Space Agency, ESTEC, 2200 AG Noordwijk (Netherlands); Fichter, W, E-mail: martin.hewitson@aei.mpg.d [Institut fuer Flugmechanik und Flugregelung, 70569 Stuttgart (Germany)

    2009-05-07

    The LISA Technology Package (LTP) on board the LISA Pathfinder mission aims to demonstrate some key concepts for LISA which cannot be tested on ground. The mission consists of a series of preplanned experimental runs. The data analysis for each experiment must be designed in advance of the mission. During the mission, the analysis must be carried out promptly so that the results can be fed forward into subsequent experiments. As such, a robust and flexible data analysis environment needs to be put in place. Since this software is used during mission operations and affects the mission timeline, it must be very robust and tested to a high degree. This paper presents the requirements, design and implementation of the data analysis environment (LTPDA) that will be used for analysing the data from LTP. The use of the analysis software to perform mock data challenges (MDC) is also discussed, and some highlights from the first MDC are presented.

  17. Data analysis for the LISA Technology Package

    International Nuclear Information System (INIS)

    Hewitson, M; Danzmann, K; Diepholz, I; GarcIa, A; Armano, M; Fauste, J; Benedetti, M; Bogenstahl, J; Bortoluzzi, D; Bosetti, P; Cristofolini, I; Brandt, N; Cavalleri, A; Ciani, G; Dolesi, R; Ferraioli, L; Cruise, M; Fertin, D; GarcIa, C; Fichter, W

    2009-01-01

    The LISA Technology Package (LTP) on board the LISA Pathfinder mission aims to demonstrate some key concepts for LISA which cannot be tested on ground. The mission consists of a series of preplanned experimental runs. The data analysis for each experiment must be designed in advance of the mission. During the mission, the analysis must be carried out promptly so that the results can be fed forward into subsequent experiments. As such, a robust and flexible data analysis environment needs to be put in place. Since this software is used during mission operations and affects the mission timeline, it must be very robust and tested to a high degree. This paper presents the requirements, design and implementation of the data analysis environment (LTPDA) that will be used for analysing the data from LTP. The use of the analysis software to perform mock data challenges (MDC) is also discussed, and some highlights from the first MDC are presented.

  18. CH Packaging Operations for High Wattage Waste at LANL

    International Nuclear Information System (INIS)

    Washington TRU Solutions LLC

    2002-01-01

    This procedure provides instructions for assembling the following contact-handled (CH) packaging payloads: - Drum payload assembly - Standard Waste Box (SWB) assembly - Ten-Drum Overpack (TDOP) In addition, this procedure provides operating instructions for the TRUPACT-II CH waste packaging. This document also provides instructions for performing ICV and OCV preshipment leakage rate tests on the following packaging seals, using a nondestructive helium (He) leak test: - ICV upper main O-ring seal - ICV outer vent port plug O-ring seal - OCV upper main O-ring seal - OCV vent port plug O-ring seal

  19. CH Packaging Operations for High Wattage Waste at LANL

    International Nuclear Information System (INIS)

    Washington TRU Solutions LLC

    2002-01-01

    This procedure provides instructions for assembling the following contact-handled (CH) packaging payloads: - Drum payload assembly - Standard Waste Box (SWB) assembly - Ten-Drum Overpack (TDOP) In addition, this procedure also provides operating instructions for the TRUPACT-II CH waste packaging. This document also provides instructions for performing ICV and OCV preshipment leakage rate tests on the following packaging seals, using a nondestructive helium (He) leak test: - ICV upper main O-ring seal - ICV outer vent port plug O-ring seal - OCV upper main O-ring seal - OCV vent port plug O-ring seal

  20. CH Packaging Operations for High Wattage Waste at LANL

    International Nuclear Information System (INIS)

    Washington TRU Solutions LLC

    2003-01-01

    This procedure provides instructions for assembling the following contact-handled (CH) packaging payloads: - Drum payload assembly - Standard Waste Box (SWB) assembly - Ten-Drum Overpack (TDOP) In addition, this procedure also provides operating instructions for the TRUPACT-II CH waste packaging. This document also provides instructions for performing ICV and OCV preshipment leakage rate tests on the following packaging seals, using a nondestructive helium (He) leak test: - ICV upper main O-ring seal - ICV outer vent port plug O-ring seal - OCV upper main O-ring seal - OCV vent port plug O-ring seal

  1. 500 C Electronic Packaging and Dielectric Materials for High Temperature Applications

    Science.gov (United States)

    Chen, Liang-yu; Neudeck, Philip G.; Spry, David J.; Beheim, Glenn M.; Hunter, Gary W.

    2016-01-01

    High-temperature environment operable sensors and electronics are required for exploring the inner solar planets and distributed control of next generation aeronautical engines. Various silicon carbide (SiC) high temperature sensors, actuators, and electronics have been demonstrated at and above 500C. A compatible packaging system is essential for long-term testing and application of high temperature electronics and sensors. High temperature passive components are also necessary for high temperature electronic systems. This talk will discuss ceramic packaging systems developed for high temperature electronics, and related testing results of SiC circuits at 500C and silicon-on-insulator (SOI) integrated circuits at temperatures beyond commercial limit facilitated by these high temperature packaging technologies. Dielectric materials for high temperature multilayers capacitors will also be discussed.

  2. Safety Analysis Report - Packages, 9965, 9968, 9972-9975 Packages

    International Nuclear Information System (INIS)

    Blanton, P.

    2000-01-01

    This Safety Analysis Report for Packaging (SARP) documents the analysis and testing performed on four type B Packages: the 9972, 9973, 9974, and 9975 packages. Because all four packages have similar designs with very similar performance characteristics, all of them are presented in a single SARP. The performance evaluation presented in this SARP documents the compliance of the 9975 package with the regulatory safety requirements. Evaluations of the 9972, 9973, and 9974 packages support that of the 9975. To avoid confusion arising from the inclusion of four packages in a single document, the text segregates the data for each package in such a way that the reader interested in only one package can progress from Chapter 1 through Chapter 9. The directory at the beginning of each chapter identifies each section that should be read for a given package. Sections marked ''all'' are generic to all packages

  3. TENSOLVE: A software package for solving systems of nonlinear equations and nonlinear least squares problems using tensor methods

    Energy Technology Data Exchange (ETDEWEB)

    Bouaricha, A. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.; Schnabel, R.B. [Colorado Univ., Boulder, CO (United States). Dept. of Computer Science

    1996-12-31

    This paper describes a modular software package for solving systems of nonlinear equations and nonlinear least squares problems, using a new class of methods called tensor methods. It is intended for small to medium-sized problems, say with up to 100 equations and unknowns, in cases where it is reasonable to calculate the Jacobian matrix or approximate it by finite differences at each iteration. The software allows the user to select between a tensor method and a standard method based upon a linear model. The tensor method models F(x) by a quadratic model, where the second-order term is chosen so that the model is hardly more expensive to form, store, or solve than the standard linear model. Moreover, the software provides two different global strategies, a line search and a two-dimensional trust region approach. Test results indicate that, in general, tensor methods are significantly more efficient and robust than standard methods on small and medium-sized problems in iterations and function evaluations.
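
    TENSOLVE itself is a Fortran package; for orientation, a standard Newton-type solve of a small nonlinear system, the kind of linear-model baseline the tensor methods are compared against, can be written with SciPy as follows (the test system is an arbitrary example, not one of the package's test problems).

      import numpy as np
      from scipy.optimize import root

      def F(x):
          """A small example system F(x) = 0 with a solution at (1, 1)."""
          return np.array([
              x[0] ** 2 + x[1] ** 2 - 2.0,
              np.exp(x[0] - 1.0) + x[1] ** 3 - 2.0,
          ])

      def jacobian(x):
          """Analytic Jacobian, evaluated at each iterate as the abstract describes."""
          return np.array([
              [2.0 * x[0], 2.0 * x[1]],
              [np.exp(x[0] - 1.0), 3.0 * x[1] ** 2],
          ])

      sol = root(F, x0=[2.0, 0.5], jac=jacobian, method="hybr")
      print(sol.x, sol.success)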

  4. Decal electronics for printed high performance cmos electronic systems

    KAUST Repository

    Hussain, Muhammad Mustafa

    2017-11-23

    High performance complementary metal oxide semiconductor (CMOS) electronics are critical for any full-fledged electronic system. However, state-of-the-art CMOS electronics are rigid and bulky making them unusable for flexible electronic applications. While there exist bulk material reduction methods to flex them, such thinned CMOS electronics are fragile and vulnerable to handling for high throughput manufacturing. Here, we show a fusion of a CMOS technology compatible fabrication process for flexible CMOS electronics, with inkjet and conductive cellulose based interconnects, followed by additive manufacturing (i.e. 3D printing based packaging) and finally roll-to-roll printing of packaged decal electronics (thin film transistors based circuit components and sensors) focusing on printed high performance flexible electronic systems. This work provides the most pragmatic route for packaged flexible electronic systems for wide ranging applications.

  5. Study on the BES Ⅲ offline software performance

    International Nuclear Information System (INIS)

    Zhang Xiaomei; Sun Gongxing

    2011-01-01

    Performance monitoring and analysis of the BESⅢ offline software system is very useful for software optimization and for improving CPU and memory usage. This paper presents a feasible performance monitoring service based on GAUDI, and reports performance tests and analysis of the BESⅢ simulation and reconstruction carried out with the service. (authors)

  6. Large Scale Software Building with CMake in ATLAS

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  7. Mold Heating and Cooling Pump Package Operator Interface Controls Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Josh A. Salmond

    2009-08-07

    The modernization of the Mold Heating and Cooling Pump Package Operator Interface (MHC PP OI) consisted of upgrading the antiquated single board computer with a proprietary operating system to off-the-shelf hardware and off-the-shelf software with customizable software options. The pump package is the machine interface between a central heating and cooling system that pumps heat transfer fluid through an injection or compression mold base on a local plastic molding machine. The operator interface provides the intelligent means of controlling this pumping process. Strict temperature control of a mold allows the production of high quality parts with tight tolerances and low residual stresses. The products fabricated are used on multiple programs.

  8. OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Greiner, Annette; Cholia, Shreyas; Louie, Katherine; Bethel, E. Wes; Northen, Trent R.; Bowen, Benjamin P.

    2013-10-02

    Mass spectrometry imaging (MSI) enables researchers to directly probe endogenous molecules within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements, and is a critical enabler of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely, supporting high-performance data analysis and enabling Web-based data sharing, visualization, and analysis.
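
    A minimal sketch of the chunked-HDF5 storage idea described above, using h5py; the dataset name, chunk shape, and sizes are assumptions, not the actual OpenMSI layout or API:

        import numpy as np
        import h5py

        # Store an MSI cube (x, y, m/z) in HDF5 with chunking and compression so
        # that reading one spectrum or one m/z image touches only a few chunks.
        nx, ny, n_mz = 100, 100, 10000
        with h5py.File("msi_example.h5", "w") as f:
            dset = f.create_dataset(
                "msidata",
                shape=(nx, ny, n_mz),
                dtype="uint16",
                chunks=(4, 4, 2048),      # small in x/y, larger along m/z
                compression="gzip",
            )
            dset[0, 0, :] = np.random.randint(0, 1000, n_mz, dtype=np.uint16)

        with h5py.File("msi_example.h5", "r") as f:
            spectrum = f["msidata"][0, 0, :]     # one full spectrum
            image = f["msidata"][:, :, 5000]     # one m/z slice across the image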

  9. SpcAudace: Spectroscopic processing and analysis package of Audela software

    Science.gov (United States)

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines carry out all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed, for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: PDF and PNG plots or annotated time series plots. Astrophysical quantities can be derived from individual spectra or from large numbers of spectra with advanced functions: from line profile characteristics to equivalent width and periodogram. More than 300 documented functions are available and can be used in Tcl scripts for automation. SpcAudace is based on the Audela open source software.
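
    As a hedged sketch of the instrumental response correction step mentioned above, in plain NumPy rather than SpcAudace's Tcl functions; the synthetic spectra and the moving-average smoothing stand in for real extracted spectra and a spline fit:

        import numpy as np

        # Divide the observed standard-star spectrum by its reference spectrum to
        # estimate the instrumental response, smooth it, then divide the target
        # spectrum by the smoothed response. All arrays here are synthetic.
        wavelength = np.linspace(4000.0, 7000.0, 1500)            # Angstroms
        reference_std = 1.0 + 0.3 * np.sin(wavelength / 500.0)    # "catalogue" flux
        response_true = np.exp(-((wavelength - 5500.0) / 1200.0) ** 2)
        rng = np.random.default_rng(0)
        observed_std = reference_std * response_true + rng.normal(0, 0.01, wavelength.size)
        target = 2.0 * response_true                              # target as observed

        response = observed_std / reference_std
        kernel = np.ones(51) / 51.0                               # crude smoothing;
        response_smooth = np.convolve(response, kernel, mode="same")  # spline in practice
        target_corrected = target / response_smooth
        print(target_corrected[700:703])                          # roughly flat, ~2.0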

  10. Visualization of scientific data for high energy physics: PAW, a general-purpose portable software tool for data analysis and presentation

    International Nuclear Information System (INIS)

    Brun, R.; Couet, O.; Vandoni, C.E.; Zanarini, P.

    1990-01-01

    Visualization of scientific data, although a fashionable term in the world of computer graphics, is not a new invention; it is hundreds of years old. With the advent of computer graphics, the visualization of scientific data has become a well understood and widely used technology, with hundreds of applications in the most diverse fields, ranging from media applications to truly scientific ones. In the present paper, we discuss the design concepts of visualization of scientific data systems, in particular in the specific field of High Energy Physics. During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to High Energy Physics (HEP). The results of the integration of resources from many different laboratories can be expressed in several million lines of code written at CERN during this period of time, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, in which man-machine interaction and graphics play a key role (PAW, the Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years. 6 figs

  11. A Lightweight, High-performance I/O Management Package for Data-intensive Computing

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jun

    2011-06-22

    Our group has been working with ANL collaborators on the topic of bridging the gap between parallel file systems and local file systems during the course of this project period. We visited Argonne National Lab (Dr. Robert Ross's group) for one week in the summer of 2007. We reviewed our project progress and planned the activities for the incoming years 2008-09. The PI met Dr. Robert Ross several times, for example at the HEC FSIO workshop 08, SC08 and SC10. We explored opportunities to develop a production system by leveraging our current prototype (SOGP+PVFS) into a new PVFS version. We delivered the SOGP+PVFS codes to the ANL PVFS2 group in 2008. We also discussed exploring a potential project on developing new parallel programming models and runtime systems for data-intensive scalable computing (DISC). The methodology is to evolve MPI towards DISC by incorporating some functions of the Google MapReduce parallel programming model. More recently, we have been jointly exploring how to leverage existing work to perform (1) coordination/aggregation of local I/O operations prior to movement over the WAN, (2) efficient bulk data movement over the WAN, and (3) latency-hiding techniques for latency-intensive operations. Since 2009, we have been applying Hadoop/MapReduce to some HEC applications with LANL scientists John Bent and Salman Habib. Another ongoing effort is to improve checkpoint performance at the I/O forwarding layer for the Roadrunner supercomputer with James Nunez and Gary Grider at LANL. Two senior undergraduates from our research group did summer internships on high-performance file and storage system projects at LANL for three consecutive years starting in 2008. Both are now pursuing Ph.D. degrees in our group, will be in the fourth year of the PhD program in Fall 2011, and will go to LANL to advance the two above-mentioned works during this winter break. Since 2009, we have been collaborating with several computer scientists (Gary Grider, John Bent, Parks Fields, James Nunez, Hsing

  12. Performance Engineering Technology for Scientific Component Software

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years, we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress

  13. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  14. Performance Evaluation of Software Routers with VPN Features

    Directory of Open Access Journals (Sweden)

    H. Redžović

    2017-11-01

    This paper presents the implementation and analysis of a VPN software router based on the Quagga and strongSwan open-source software tools. We validated the functionalities of strongSwan and Quagga in a realistic environment that includes scenarios with link failures. We also measured and analyzed the performance of the encryption and hash algorithms supported by the strongSwan software, in order to recommend an optimal VPN configuration that provides the best performance.

  15. Efficient Calculation of Exact Exchange Within the Quantum Espresso Software Package

    Science.gov (United States)

    Barnes, Taylor; Kurth, Thorsten; Carrier, Pierre; Wichmann, Nathan; Prendergast, David; Kent, Paul; Deslippe, Jack

    Accurate simulation of condensed matter at the nanoscale requires careful treatment of the exchange interaction between electrons. In the context of plane-wave DFT, these interactions are typically represented through the use of approximate functionals. Greater accuracy can often be obtained through the use of functionals that incorporate some fraction of exact exchange; however, evaluation of the exact exchange potential is often prohibitively expensive. We present an improved algorithm for the parallel computation of exact exchange in Quantum Espresso, an open-source software package for plane-wave DFT simulation. Through the use of aggressive load balancing and on-the-fly transformation of internal data structures, our code exhibits speedups of approximately an order of magnitude for practical calculations. Additional optimizations are presented targeting the many-core Intel Xeon-Phi ``Knights Landing'' architecture, which largely powers NERSC's new Cori system. We demonstrate the successful application of the code to difficult problems, including simulation of water at a platinum interface and computation of the X-ray absorption spectra of transition metal oxides.

  16. Performance testing of LiDAR exploitation software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-04-01

    Mobile LiDAR systems have been used widely in recent years for many applications in the field of geoscience. One of the most important limitations of this technology is the large computational requirement involved in data processing. Several software solutions for data processing are available on the market, but users are often unaware of methodologies to verify their performance accurately. In this work a methodology for LiDAR software performance testing is presented and six different suites are studied: QT Modeler, AutoCAD Civil 3D, Mars 7, Fledermaus, Carlson and TopoDOT (all of them in x64). Results show that QT Modeler, TopoDOT and AutoCAD Civil 3D allow the loading of large datasets, while Fledermaus, Mars 7 and Carlson do not achieve this level of performance. AutoCAD Civil 3D needs a long loading time in comparison with the most powerful suites, such as QT Modeler and TopoDOT. The Carlson suite shows the poorest results among all the software under study: point clouds larger than 5 million points cannot be loaded, and loading time is very long in comparison with the other suites even for the smaller datasets. AutoCAD Civil 3D, Carlson and TopoDOT use more threads than the other suites, such as QT Modeler, Mars 7 and Fledermaus.

  17. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the

  18. Towards a Theory of Affect and Software Developers' Performance

    OpenAIRE

    Graziotin, Daniel

    2016-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people. The underlying assumption seems to be that "happy and satisfied software developers perform better". More specifically, affects (emotions and moods) have an impact on cognitive activities and the working performance of individuals. Development tasks are undertaken heavily through cognitive processes, yet software engineering (SE) research lacks theo...

  19. Software package to automate the design and production of translucent building structures made of PVC

    Directory of Open Access Journals (Sweden)

    Petrova Irina Yur’evna

    2016-08-01

    The article describes the features of the design and production of translucent building structures made of PVC. An analysis of the automation systems for this process currently existing on the market is carried out, and their advantages and disadvantages are identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent building structures made of PVC is formulated, and the basic entities involved in those business processes are identified. The necessary functions for the main application and for the dealers' application are specified. The main application is based on the technological platform 1C: Enterprise 8.2. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these software products have client versions that are free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The implementation features of the developed software complex are described and the relevant charts are given. The scheme of system deployment and the protocols of data exchange between the 1C server, the 1C client and the dealer are presented, along with the functions supported by the 1C module and the .NET module. The article describes the content of the class library developed for the .NET module and specifies how the two applications are integrated into a single software package. The features of the GUI organization are described with the corresponding screenshots. Possible ways of further development of the described software complex are presented, and a conclusion is drawn about its competitiveness and the expediency of further research.

  20. Acoustic performance design and optimal allocation of sound package in ship cabin noise reduction

    Directory of Open Access Journals (Sweden)

    YANG Deqing

    2017-08-01

    Sound packages have become the main approach to noise reduction design of ship cabins. A sound package is a specially designed acoustic component consisting of damping materials, absorption materials, sound isolation materials and base structural materials which can achieve a prescribed noise reduction performance. Based on the Statistical Energy Analysis (SEA) method, quick evaluation and design methods and the optimal allocation theory of sound packages are investigated. The standard numerical acoustic performance evaluation model, the sound package optimization design model and the sound package optimal allocation model are presented. A genetic algorithm is applied to solve the presented optimization problems. Design examples demonstrate the validity and efficiency of the proposed models and solutions. The presented theory and methods benefit the standardization and programming of sound package design, and decrease noise reduction costs.
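
    Below is a minimal sketch of the allocation idea using a toy genetic algorithm over a 0/1 treatment vector; all noise, reduction and mass figures are illustrative, and a real design would use SEA-predicted noise reductions rather than fixed per-cabin values:

        import random

        # Choose which cabins receive a sound package so as to minimize total
        # package mass while keeping every cabin at or below a noise target.
        random.seed(1)
        N_CABINS = 8
        base_noise = [72, 68, 75, 70, 66, 74, 71, 69]     # dB(A) without treatment
        reduction  = [6, 5, 8, 6, 4, 7, 6, 5]             # dB(A) gained if treated
        mass       = [40, 35, 55, 45, 30, 50, 42, 38]     # kg of each package
        TARGET = 70.0

        def fitness(genome):                              # genome: 0/1 per cabin
            penalty = sum(100 for i, g in enumerate(genome)
                          if base_noise[i] - g * reduction[i] > TARGET)
            return sum(g * m for g, m in zip(genome, mass)) + penalty

        pop = [[random.randint(0, 1) for _ in range(N_CABINS)] for _ in range(30)]
        for _ in range(100):
            pop.sort(key=fitness)
            parents = pop[:10]
            children = []
            for _ in range(20):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N_CABINS)
                child = a[:cut] + b[cut:]
                if random.random() < 0.2:                 # mutation
                    j = random.randrange(N_CABINS)
                    child[j] ^= 1
                children.append(child)
            pop = parents + children
        best = min(pop, key=fitness)
        print(best, fitness(best))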

  1. PmagPy: Software Package for Paleomagnetic Data Analysis and Gateway to the Magnetics Information Consortium (MagIC) Database

    Science.gov (United States)

    Jonestrask, L.; Tauxe, L.; Shaar, R.; Jarboe, N.; Minnett, R.; Koppers, A. A. P.

    2014-12-01

    There are many data types and methods of analysis in rock and paleomagnetic investigations. The MagIC database (http://earthref.org/MAGIC) was designed to accommodate the vast majority of data used in such investigations. Yet getting data from the laboratory into the database, and visualizing and re-analyzing data downloaded from the database, makes special demands on data formatting. There are several recently published programming packages that deal with single types of data: demagnetization experiments (e.g., Lurcock et al., 2012), paleointensity experiments (e.g., Leonhardt et al., 2004), and FORC diagrams (e.g., Harrison et al., 2008). However, there is a need for a unified set of open source, cross-platform software that deals with the great variety of data types in a consistent way and facilitates importing data into the MagIC format, analyzing them and uploading them into the MagIC database. The PmagPy software package (http://earthref.org/PmagPy/cookbook/) comprises such a comprehensive set of tools. It facilitates conversion of many laboratory formats into the common MagIC format and allows interpretation of demagnetization and Thellier-type experimental data. With some 175 programs and over 250 functions, it can be used to create a wide variety of plots and allows manipulation of downloaded data sets as well as preparation of new contributions for uploading to the MagIC database.

  2. Software tools for quantification of X-ray microtomography at the UGCT

    Energy Technology Data Exchange (ETDEWEB)

    Vlassenbroeck, J. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium)], E-mail: jelle.vlassenbroeck@ugent.be; Dierick, M.; Masschaele, B. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Cnudde, V. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium); Van Hoorebeke, L. [Department of Subatomic and Radiation Physics, Ghent University, Proeftuinstraat 86, B-9000 Gent (Belgium); Jacobs, P. [Department of Geology and Soil Science, Ghent University, Krijgslaan 281/S8, B-9000 Gent (Belgium)

    2007-09-21

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.

  3. Exploring massive, genome scale datasets with the GenometriCorr package.

    Directory of Open Access Journals (Sweden)

    Alexander Favorov

    2012-05-01

    We have created a statistically grounded tool for determining the correlation of genomewide data with other datasets or known biological features, intended to guide biological exploration of high-dimensional datasets rather than providing immediate answers. The software enables several biologically motivated approaches to these data, and here we describe the rationale and implementation for each approach. Our models and statistics are implemented in an R package that efficiently calculates the spatial correlation between two sets of genomic intervals (data and/or annotated features), for use as a metric of functional interaction. The software handles any type of pointwise or interval data and, instead of running analyses with predefined metrics, it computes the significance and direction of several types of spatial association; this is intended to suggest potentially relevant relationships between the datasets. The package, GenometriCorr, can be freely downloaded at http://genometricorr.sourceforge.net/. Installation guidelines and examples are available from the sourceforge repository. The package is pending submission to Bioconductor.
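
    The following is a hedged sketch of the underlying idea in plain Python (it is not the GenometriCorr R interface): compare the observed proximity of query intervals to reference features against a null distribution obtained by shuffling the query positions:

        import random

        # Permutation test on midpoint-to-nearest-feature distances; all names
        # and the uniform shuffling model are illustrative simplifications.
        def nearest_distance(point, ref_starts):
            return min(abs(point - s) for s in ref_starts)

        def mean_nearest(query_mids, ref_starts):
            return sum(nearest_distance(q, ref_starts) for q in query_mids) / len(query_mids)

        def permutation_p(query_mids, ref_starts, chrom_len, n_perm=1000, seed=0):
            rng = random.Random(seed)
            observed = mean_nearest(query_mids, ref_starts)
            hits = 0
            for _ in range(n_perm):
                shuffled = [rng.uniform(0, chrom_len) for _ in query_mids]
                if mean_nearest(shuffled, ref_starts) <= observed:
                    hits += 1
            return (hits + 1) / (n_perm + 1)

        # Toy usage: query intervals clustered near reference starts
        print(permutation_p([1000, 5100, 9050], [900, 5000, 9000], chrom_len=100000))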

  4. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    Science.gov (United States)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of the evaluation and the findings at each stage of the evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.

  5. EXPERIMENTAL STUDIES FOR DEVELOPMENT HIGH-POWER AUDIO SPEAKER DEVICES PERFORMANCE USING PERMANENT NdFeB MAGNETS SPECIAL TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Constantin D. STĂNESCU

    2013-05-01

    In this paper the authors present research on improving the performance of high-power audio speaker devices using a special technology based on permanent NdFeB magnets. Magnetic losses inside these audio devices are due to friction in the mechanical system and to the thermal (Joule) effect of eddy currents. In this regard, conical surfaces were produced on the top plate and center pin by the special technology. By analysing the results of finite element modelling of the magnetic circuit with an electronic software package, an efficiency increase of over 10 % was measured, from 1.136 T to 1.3 T.

  6. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers affect significantly their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, the network interconnect, such as Infiniband, may be exploited to maximize energy savings, while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy saving strategies on a per-call basis. Next, it targets point-to-point communications, grouping them into phases and applying frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them in addition to DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
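
    A minimal sketch of the per-phase frequency-scaling idea described above, assuming a Linux cpufreq sysfs interface with the userspace governor and sufficient privileges; the paths, frequencies, and the commented MPI call are illustrative, and this is not the runtime system described in the report:

        from contextlib import contextmanager

        # Lower the CPU frequency around a communication-dominated phase and
        # restore it afterwards; communication stalls hide the slowdown.
        CPUFREQ = "/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_setspeed"

        def set_freq_khz(cpu, khz):
            with open(CPUFREQ.format(cpu=cpu), "w") as f:
                f.write(str(khz))

        @contextmanager
        def low_frequency(cpu, low_khz, high_khz):
            set_freq_khz(cpu, low_khz)        # reduced speed for the comm phase
            try:
                yield
            finally:
                set_freq_khz(cpu, high_khz)   # full speed for compute phases

        # Hypothetical usage around an MPI collective (mpi4py):
        # with low_frequency(0, 1_200_000, 2_400_000):
        #     comm.Alltoall(sendbuf, recvbuf)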

  7. Performance Test of Openflow Agent on Openflow Software-Based Mikrotik RB750 Switch

    Directory of Open Access Journals (Sweden)

    Rikie Kartadie

    2016-11-01

    A network is usually built from several devices such as routers and switches. Every device forwards data packets, manipulating them with complicated protocols embedded in its hardware. An operator is responsible for the configuration, whether to manage rules or the applications deployed in the network. Human error may occur when device configuration is performed manually by the operator. Some well-known vendors, MikroTik among them, have also been implementing OpenFlow in their products. It provides an implementation of the SDN/OpenFlow architecture at affordable cost. The second-phase research results showed that the software-based MikroTik OF switch produced higher latency values than both Mininet and the software-based OpenWRT OF switch. On average, the software-based MikroTik OF switch falls 2012 kbps below the software-based OpenWRT OF switch. The average UDP throughput of the software-based MikroTik OF switch is 3.6176 kBps lower than that of the software-based OpenWRT OF switch and 8.68 kBps lower than that of Mininet. The average UDP jitter of the software-based MikroTik OF switch is 0.0103 ms lower than that of the software-based OpenWRT OF switch and 0.0093 ms lower than that of Mininet.

  8. Identifying High Performance ERP Projects

    OpenAIRE

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...

  9. Certification of packagings: compliance with DOT specification 7A packaging requirements

    International Nuclear Information System (INIS)

    Edling, D.A.

    1976-01-01

    A study was conducted to determine which of the packagings currently listed in CFR 49 Section 173.395 a.1-5, meet the Specification 7A requirements (CFR 49 Section 173.350). According to DOT HM-111 the present listing of various authorized DOT specifications in Section 173.394 and Section 173.395 (Type A containers) of ICC Tariff No. 27 would be deleted with complete reliance being placed on the use of DOT 7A, Type A general packaging specification. Each user of a Specification 7A package would be required to document and maintain on file for one year a written record of his determination of compliance with the DOT Specification 7A performance requirements. All the specification packagings listed in CFR 49 Section 173.395a.1-5 were tested and shown to meet the Specification 7A criteria; however, in many cases qualifications were placed on their use. Forty-nine specification packagings were tested and shown to meet the DOT Specification 7A performance requirements and since there were several styles of some specific packagings, this amounts to greater than 80 packagings. The extensive testing generally indicated a high degree of containment integrity in the packagings tested and the documentation discussed is a valuable tool for shippers of Type A quantities of radioactive materials

  10. US Army Radiological Bioassay and Dosimetry: The RBD software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.; Ward, R.C.; Maddox, L.B.

    1993-01-01

    The RBD (Radiological Bioassay and Dosimetry) software package was developed for the U. S. Army Material Command, Arlington, Virginia, to demonstrate compliance with the radiation protection guidance 10 CFR Part 20 (ref. 1). Designed to be run interactively on an IBM-compatible personal computer, RBD consists of a data base module to manage bioassay data and a computational module that incorporates algorithms for estimating radionuclide intake from either acute or chronic exposures based on measurement of the worker's rate of excretion of the radionuclide or the retained activity in the body. In estimating the intake, RBD uses a separate file for each radionuclide containing parametric representations of the retention and excretion functions. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent. For a given nuclide, if measurements exist for more than one type of assay, an auxiliary module, REPORT, estimates the intake by applying weights assigned in the nuclide file for each assay. Bioassay data and computed results (estimates of intake and committed dose equivalent) are stored in separate data bases, and the bioassay measurements used to compute a given result can be identified. The REPORT module creates a file containing the committed effective dose equivalent for each individual that can be combined with the individual's external exposure
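
    A hedged sketch of the intake and committed-dose calculation described above, in plain Python rather than the RBD code itself; the excretion fraction and dose coefficient below are placeholder values:

        # Estimate intake from a single bioassay measurement using an excretion
        # function m(t) (fraction of a unit intake expected in the assay at time t),
        # then convert to committed dose with a dose-per-unit-intake coefficient.
        def estimate_intake(measured_activity_bq, m_of_t):
            return measured_activity_bq / m_of_t             # Bq

        def committed_dose_sv(intake_bq, dose_per_unit_intake_sv_per_bq):
            return intake_bq * dose_per_unit_intake_sv_per_bq

        # Example with illustrative numbers: urine sample 30 days after an acute intake
        m_30d = 1.0e-4                                       # assumed excretion fraction at day 30
        intake = estimate_intake(5.0, m_30d)                 # 5 Bq measured in the sample
        dose = committed_dose_sv(intake, 5.0e-8)             # assumed coefficient, Sv/Bq
        print(f"intake = {intake:.1f} Bq, committed dose = {dose:.2e} Sv")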

  11. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  12. CONRAD—A software framework for cone-beam imaging in radiology

    International Nuclear Information System (INIS)

    Maier, Andreas; Choi, Jang-Hwan; Riess, Christian; Keil, Andreas; Fahrig, Rebecca; Hofmann, Hannes G.; Berger, Martin; Fischer, Peter; Schwemmer, Chris; Wu, Haibo; Müller, Kerstin; Hornegger, Joachim

    2013-01-01

    Purpose: In the community of x-ray imaging, there is a multitude of tools and applications that are used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools that are required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL. There is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and respective vector motion fields. Well known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps have already been integrated into the platform at the time of writing. The platform comprises 74,000 nonblank lines of code, out of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art runtimes with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. In particular this offers new opportunities for scientific collaboration and
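
    As a rough, self-contained illustration of the filtered backprojection (FBP) algorithm named above, here is a scikit-image sketch; it does not use CONRAD's Java API:

        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, rescale

        # Forward project a phantom into a sinogram, then reconstruct with FBP
        # (iradon's default ramp filter) and report the RMS error.
        image = rescale(shepp_logan_phantom(), scale=0.5)       # 200x200 phantom
        theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
        sinogram = radon(image, theta=theta)                    # forward projection
        reconstruction = iradon(sinogram, theta=theta)          # filtered backprojection
        error = np.sqrt(np.mean((reconstruction - image) ** 2))
        print(f"RMS reconstruction error: {error:.4f}")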

  13. Analytical tools for thermal infrared engineering: a thermal sensor simulation package

    Science.gov (United States)

    Jaggi, Sandeep

    1992-09-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration. To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as SNR, NER, NETD etc. This paper describes the uses of the package and the physics used to derive the performance parameters. In addition, ATTIRE can be used as a tutorial for understanding the distribution of thermal flux or solar irradiance over selected bandwidths of the spectrum. This spectrally distributed incident flux can then be analyzed as it propagates through the subsystems that constitute the entire sensor. ATTIRE provides a variety of functions, ranging from plotting black-body curves for varying bandwidths and computing the integral flux, to performing transfer function analysis of the sensor system. The package runs from a menu-driven interface in a PC-DOS environment. Each sub-system of the sensor is represented by windows and icons. A user-friendly mouse-controlled point-and-click interface allows the user to simulate various aspects of a sensor. The package can simulate a theoretical sensor system. Trade-off studies can be easily done by changing the appropriate parameters and monitoring the effect on system performance. The package can provide plots of system performance versus any system parameter. A parameter (such as the entrance aperture of the optics) could be varied and its effect on another parameter (e.g., NETD) can be plotted. A third parameter (e.g., the
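
    As a hedged illustration of the band-integrated blackbody computation mentioned above (plain NumPy, not ATTIRE itself; the band limits and temperature are example values):

        import numpy as np

        # Planck spectral radiance integrated over the 8-12 micron window at 300 K.
        H = 6.626e-34   # Planck constant, J s
        C = 2.998e8     # speed of light, m/s
        K = 1.381e-23   # Boltzmann constant, J/K

        def planck_radiance(wavelength_m, temp_k):
            """Spectral radiance B_lambda in W / (m^2 sr m)."""
            a = 2.0 * H * C**2 / wavelength_m**5
            b = np.expm1(H * C / (wavelength_m * K * temp_k))
            return a / b

        wl = np.linspace(8e-6, 12e-6, 2001)                    # wavelength grid, m
        b = planck_radiance(wl, 300.0)
        band_radiance = np.sum(0.5 * (b[1:] + b[:-1]) * np.diff(wl))   # trapezoid rule
        print(f"8-12 um band radiance at 300 K: {band_radiance:.1f} W m^-2 sr^-1")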

  14. Effect of Painting Series Package on the Performances of Junior ...

    African Journals Online (AJOL)

    The study investigated the effect of Painting Series Package on the performance of Junior Secondary School Cultural and Creative Arts in Ogbomoso, Nigeria. Gender influence on the students' performances was also examined. Sample comprised 60 students drawn purposively from two secondary schools.

  15. Installing and Setting Up Git Software Tool on Windows | High-Performance

    Science.gov (United States)

    Computing | NREL. Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get Git installed on Windows 7, and how to get things set up on NREL's

  16. Exploring massive, genome scale datasets with the genometricorr package

    KAUST Repository

    Favorov, Alexander; Mularoni, Loris; Cope, Leslie M.; Medvedeva, Yulia; Mironov, Andrey A.; Makeev, Vsevolod J.; Wheelan, Sarah J.

    2012-01-01

    We have created a statistically grounded tool for determining the correlation of genomewide data with other datasets or known biological features, intended to guide biological exploration of high-dimensional datasets, rather than providing immediate answers. The software enables several biologically motivated approaches to these data and here we describe the rationale and implementation for each approach. Our models and statistics are implemented in an R package that efficiently calculates the spatial correlation between two sets of genomic intervals (data and/or annotated features), for use as a metric of functional interaction. The software handles any type of pointwise or interval data and instead of running analyses with predefined metrics, it computes the significance and direction of several types of spatial association; this is intended to suggest potentially relevant relationships between the datasets. Availability and implementation: The package, GenometriCorr, can be freely downloaded at http://genometricorr.sourceforge.net/. Installation guidelines and examples are available from the sourceforge repository. The package is pending submission to Bioconductor. © 2012 Favorov et al.

  17. Exploring massive, genome scale datasets with the genometricorr package

    KAUST Repository

    Favorov, Alexander

    2012-05-31

    We have created a statistically grounded tool for determining the correlation of genomewide data with other datasets or known biological features, intended to guide biological exploration of high-dimensional datasets, rather than providing immediate answers. The software enables several biologically motivated approaches to these data and here we describe the rationale and implementation for each approach. Our models and statistics are implemented in an R package that efficiently calculates the spatial correlation between two sets of genomic intervals (data and/or annotated features), for use as a metric of functional interaction. The software handles any type of pointwise or interval data and instead of running analyses with predefined metrics, it computes the significance and direction of several types of spatial association; this is intended to suggest potentially relevant relationships between the datasets. Availability and implementation: The package, GenometriCorr, can be freely downloaded at http://genometricorr.sourceforge.net/. Installation guidelines and examples are available from the sourceforge repository. The package is pending submission to Bioconductor. © 2012 Favorov et al.

  18. Cardboard Based Packaging Materials as Renewable Thermal Insulation of Buildings: Thermal and Life Cycle Performance

    OpenAIRE

    Čekon, Miroslav; Struhala, Karel; Slávik, Richard

    2017-01-01

    Cardboard based packaging components represent a material with a significant potential of renewable exploitation in buildings. This study presents the results of thermal and environmental analysis of existing packaging materials compared with standard conventional thermal insulations. Experimental measurements were performed to identify the thermal performance of studied cardboard packaging materials. Real-size samples were experimentally tested in laboratory measurements. The thermal resi...

  19. Free, cross-platform gRaphical software

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    2006-01-01

    -recursive graphical models, and models defined using the BUGS language. Today, there exists a wide range of packages to support the analysis of data using graphical models. Here, we focus on Open Source software, making it possible to extend the functionality by integrating these packages into more general tools. We...... will attempt to give an overview of the available Open Source software, with focus on the gR project. This project was launched in 2002 to make facilities in R for graphical modelling. Several R packages have been developed within the gR project both for display and analysis of graphical models...

  20. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    Science.gov (United States)

    Klumpp, A. R.

    1994-01-01

    This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames). Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
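
    For illustration, the quaternion product that the package exposes as an infix operator can be sketched as follows in plain Python; the scalar-first component ordering is an assumption, not necessarily the HAL/S convention:

        import numpy as np

        # Hamilton product of two quaternions q = (w, x, y, z), w being the scalar
        # part; composing two rotations yields the quaternion of the combined motion.
        def quat_mul(q1, q2):
            w1, x1, y1, z1 = q1
            w2, x2, y2, z2 = q2
            return np.array([
                w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2,
            ])

        # 90-degree rotation about z followed by 90 degrees about the new x axis
        qz = np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)])
        qx = np.array([np.cos(np.pi/4), np.sin(np.pi/4), 0.0, 0.0])
        print(quat_mul(qx, qz))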

  1. The Use of Utility Accounting Software at Miami University.

    Science.gov (United States)

    Wenner, Paul

    1999-01-01

    Describes how Miami University successfully developed an accounting software package that tracked and recorded their utility usage, including examples of its graphics and reporting components. Background information examining the decision to pursue an energy management software package is included. (GR)

  2. Flexible event reconstruction software chains with the ALICE High-Level Trigger

    International Nuclear Information System (INIS)

    Ram, D; Breitner, T; Szostak, A

    2012-01-01

    The ALICE High-Level Trigger (HLT) has a large high-performance computing cluster at CERN whose main objective is to perform real-time analysis on the data generated by the ALICE experiment and scale it down to at most 4 GB/s, which is the current maximum mass-storage bandwidth available. Data flow in this cluster is controlled by a custom-designed software framework. It consists of a set of components which can communicate with each other via a common control interface. The software framework also supports the creation of different configurations based on the detectors participating in the HLT. These configurations define a logical data processing “chain” of detector data-analysis components. Data flows through this software chain in a pipelined fashion, so that several events can be processed at the same time. An instance of such a chain can run and manage a few thousand physics analysis and data-flow components. The HLT software and the configuration scheme used in the 2011 heavy-ion runs of ALICE are discussed in this contribution.
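
    A minimal sketch of the pipelined chain idea described above, with two illustrative components connected by queues so that several events are in flight at once; the component names are assumptions, not the HLT framework's interface:

        import queue
        import threading

        # Each stage pulls events from its inbox, processes them, and pushes the
        # result downstream; a None sentinel shuts the pipeline down.
        def stage(func, inbox, outbox):
            while True:
                event = inbox.get()
                if event is None:
                    outbox.put(None)
                    break
                outbox.put(func(event))

        def cluster_finder(event):
            return {**event, "clusters": len(event["raw"]) // 4}

        def tracker(event):
            return {**event, "tracks": event["clusters"] // 2}

        q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
        threads = [
            threading.Thread(target=stage, args=(cluster_finder, q_in, q_mid)),
            threading.Thread(target=stage, args=(tracker, q_mid, q_out)),
        ]
        for t in threads:
            t.start()
        for i in range(5):
            q_in.put({"id": i, "raw": list(range(40))})
        q_in.put(None)
        results = []
        while (item := q_out.get()) is not None:
            results.append(item)
        print(results)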

  3. Thermal performance of a depleted uranium shielded storage, transportation, and disposal package

    International Nuclear Information System (INIS)

    Wix, S.D.; Yoshimura, H.R.

    1994-01-01

    The US Department of Energy (DOE) is responsible for management and disposal of large quantities of depleted uranium (DU) in the DOE complex. Viable economic options for the use and eventual disposal of the material are needed. One possible option is the use of DU as shielding material for vitrified Defense High-Level Waste (DHLW) storage, transportation, and disposal packages. Use of DU as a shielding material provides the potential benefit of disposing of significant quantities of DU during the DHLW storage and disposal process. Two DU package concepts have been developed by Sandia National Laboratories. The first concept is the Storage/Disposal plus Transportation (S/D+T) package. The S/D+T package consists of two major components: a storage/disposal (S/D) container and a transportation overpack. The second concept is the S/D/T package which is an integral storage, transportation, and disposal package. The package concept considered in this analysis is the S/D+T package with seven DHLW waste canisters

  4. Effect of alpha and gamma radiation on the near-field chemistry and geochemistry of high-level waste packages

    International Nuclear Information System (INIS)

    Reed, D.T.

    1985-12-01

    Ionizing radiation can potentially alter geochemical and chemical processes in a geologic system. These effects can either enhance or reduce the performance of the waste package in a deep geologic repository. Current indications are that, in a repository located in basalt, ionizing radiation significantly affects geochemical/chemical processes but does not appear to significantly affect factors important to the long-term performance of the repository. The experimental results presented in this paper were obtained as part of an ongoing effort by the Basalt Waste Isolation Project to determine the effect of ionizing radiation on chemical and geochemical processes in the environment of the waste package. Gamma radiolysis experiments were done by subjecting samples of synthetic basalt groundwater in the presence of various waste package components (basalt/packing/low-carbon steel) to high levels of gamma radiation from a 60 Co source. Post-irradiation analysis was done on the gas, liquid, and solid components of the basalt system. The results obtained are important in evaluating waste package performance during the containment period. The effect of alpha radiation on the basalt groundwater system in the presence of waste package components is important in evaluating waste package performance during the isolation period. The experimental work in this area is in a very preliminary stage. Results from two experiments are reported. 9 refs., 4 figs., 7 tabs

  5. Performance evaluation of cassava starch-zinc nanocomposite film for tomatoes packaging

    Directory of Open Access Journals (Sweden)

    Adeshina Fadeyibi

    2017-05-01

    Biodegradable nanocomposite films are novel materials for food packaging because of their potential to extend the shelf life of food. In this research, the performance of a cassava starch-zinc nanocomposite film was evaluated for tomato packaging. The films were developed by casting solutions of 24 g cassava starch, 0-2% (w/w) zinc nanoparticles and 55% (w/w) glycerol in a plastic mould of 12 mm depth. The permeability of the films to water and oxygen was investigated at 27°C and 65% relative humidity, while the mechanical properties were determined by the nanoindentation technique. The average thickness of the dried nanocomposite films was found to be 17±0.13 μm. The performance of the films for tomato packaging was evaluated in comparison with low density polyethylene (LDPE; 10 μm) over temperature and storage ranges of 10-27°C and 0-9 days, respectively. The quality and microbial attributes of the packaged tomatoes, including ascorbic acid, β-carotene and total coliform, were analysed at intervals of 3 days. The results revealed that the water vapour permeability increased while the oxygen permeability decreased with the nanoparticles (P<0.05). The hardness, creep, elastic and plastic works, which determine the plasticity index of the film, generally decreased with the nanoparticles. The films containing 1 and 2% of the nanoparticles suppressed the growth of microorganisms and retained the quality of the tomatoes better than the LDPE at 27°C and day 9 of packaging (P<0.05). The results imply that the films could effectively be used for tomato packaging due to their lower oxygen permeability, hardness, and elastic and plastic works.

  6. Software Reviews. Programs Worth a Second Look.

    Science.gov (United States)

    Schneider, Roxanne; Eiser, Leslie

    1989-01-01

    Reviewed are three computer software packages for use in middle/high school classrooms. Included are "MacWrite II," a word-processing program for MacIntosh computers; "Super Story Tree," a word-processing program for Apple and IBM computers; and "Math Blaster Mystery," for IBM, Apple, and Tandy computers. (CW)

  7. Development of a Nevada Statewide Database for Safety Analyst Software

    Science.gov (United States)

    2017-02-02

    Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...

  8. Eprints Institutional Repository Software: A Review

    Directory of Open Access Journals (Sweden)

    Mike R. Beazley

    2011-01-01

    Setting up an institutional repository (IR) can be a daunting task. There are many software packages out there, some commercial, some open source, all of which offer different features and functionality. This article will provide some thoughts about one of these software packages: Eprints. Eprints was one of the first IR software packages to appear and has been available for 10 years. It is under continual development by its creators at the University of Southampton and the current version is v3.2.3. Eprints is open source, meaning that anyone can download and make use of the software for free and the software can be modified however the user likes. This presents clear advantages for institutions with smaller budgets and also for institutions that have programmers on staff. Eprints requires some additional software to run: Linux, Apache, MySQL, and Perl. This software is all open source and already present on the servers of many institutions. There is now a version of Eprints that will run on Windows servers as well, which will make the adoption of Eprints even easier for some. In brief, Eprints is an excellent choice for any institution looking to get an IR up and running quickly and easily. Installation is straightforward, as is the initial configuration. Once the IR is up and running, users may upload documents and provide the necessary metadata for the records by filling out a simple web form. Embargoes on published documents are handled elegantly by the software, and the software links to the SHERPA/RoMEO database so authors can easily verify their rights regarding IR submissions. Eprints has some drawbacks, which will be discussed later in the review, but on the whole it is easy to recommend to anyone looking to start an IR. However, it is less clear that an institution with an existing IR based on another software package should migrate to Eprints.

  9. Challenges in the Packaging of MEMS

    Energy Technology Data Exchange (ETDEWEB)

    Malshe, A.P.; Singh, S.B.; Eaton, W.P.; O' Neal, C.; Brown, W.D.; Miller, W.M.

    1999-03-26

    The packaging of Micro-Electro-Mechanical Systems (MEMS) is a field of great importance to anyone using or manufacturing sensors, consumer products, or military applications. Currently much work has been done in the design and fabrication of MEMS devices, but insufficient research and few publications have been completed on the packaging of these devices. This is despite the fact that packaging is a very large percentage of the total cost of MEMS devices. The main difference between IC packaging and MEMS packaging is that MEMS packaging is almost always application specific and greatly affected by its environment and packaging techniques such as die handling, die attach processes, and lid sealing. Many of these aspects are directly related to the materials used in the packaging processes. MEMS devices that are functional in wafer form can be rendered inoperable after packaging. MEMS dies must be handled only from the chip sides so features on the top surface are not damaged. This eliminates most current die pick-and-place fixtures. Die attach materials are key to MEMS packaging. Using hard die attach solders can create high stresses in the MEMS devices, which can affect their operation greatly. Low-stress epoxies can be high-outgassing, which can also affect device performance. Also, a low modulus die attach can allow the die to move during ultrasonic wirebonding, resulting in low wirebond strength. Another source of residual stress is the lid sealing process. Most MEMS based sensors and devices require a hermetically sealed package. This can be done by parallel seam welding the package lid, but at the cost of further induced stress on the die. Another issue of MEMS packaging is the media compatibility of the packaged device. MEMS, unlike ICs, often interface with their environment, which could be high pressure or corrosive. The main conclusion we can draw about MEMS packaging is that the package affects the performance and reliability of the MEMS devices. There is a

  10. Scientific investigation plan for NNWSI WBS element 1.2.2.5.L: NNWSI waste package performance assessment: Revision 1

    International Nuclear Information System (INIS)

    Eggert, K.G.; O'Connell, W.J.; Lappa, D.A.

    1986-01-01

    Waste package performance assessment contains three broad categories of activities. These activities are: (1) development of a hydrothermal flow and transport model to test concepts to be used in establishing boundary conditions for performance calculations, and to interface EBS release calculations with total system performance calculations; (2) development of a waste package systems model to provide integrated deterministic assessments of performance and analyses of waste package designs; and (3) development of an uncertainty methodology for combination with the system model to perform probabilistic reliability and performance analyses of waste package designs. The first category contains activities that aid in determining the scope of a separate, simplified set of hydrologic calculations needed to characterize the waste package environment for performance assessment calculations. The last two activity categories are directly concerned with waste package performance calculations. A rationale for each activity under these groups is presented. All of the activities of performance assessment are either code development or analyses of waste package problems

  11. Fostering successful scientific software communities

    Science.gov (United States)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also by the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that the technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and where there is frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough in a software package only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  12. Waste package performance assessment for the Yucca Mountain project

    International Nuclear Information System (INIS)

    O'Connell, W.J.; Lappa, D.A.; Thatcher, R.M.

    1989-01-01

    The authors completed a first cycle of model development from a specification to a computer program, PANDORA-1, for long-term performance assessment of waste packages. The model for one waste package at a time incorporates processes specific to the unsaturated environment at the proposed Yucca Mountain, NV, site. PANDORA-1 models the most likely processes and several modes of waste alteration and release. The development identified information needs for future models; many processes, local details, and combinations will have to be examined. Integration of ensemble performance and quantification of uncertainties are modeling steps at higher aggregation. Methodologies for these steps include sampling, which is well studied; we have focused on several open questions. The authors can now calculate the amount of variance reduction available from Latin hypercube sampling; it is a limited reduction. A new method, controlled sampling, provides substantial variance reduction for a broad range of model functions. An uncertainty analysis test-bed program compares the new with the old sampling methods

  13. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.

  14. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's ability to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed, with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.
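
    The on-demand reslicing described above can be sketched with a short, hypothetical NumPy/SciPy example (AMIDE itself is implemented in C; the function below only illustrates the idea of resampling a shifted or rotated data set onto a viewing grid with interpolation):

```python
# Hypothetical sketch of on-demand reslicing: resample a rotated/shifted
# volume onto an output grid with trilinear interpolation. Not AMIDE's code.
import numpy as np
from scipy.ndimage import affine_transform

def reslice(volume, rotation, shift, output_shape):
    """Map output voxel indices through `rotation` and `shift` into `volume`
    and interpolate (order=1 gives trilinear interpolation)."""
    return affine_transform(volume, rotation, offset=shift,
                            output_shape=output_shape, order=1)

# Example: view a synthetic volume rotated by 10 degrees about the z axis.
volume = np.random.rand(64, 64, 64)
theta = np.deg2rad(10.0)
rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
resliced = reslice(volume, rot_z, shift=(0.0, 0.0, 0.0), output_shape=(64, 64, 64))
print(resliced.shape)  # (64, 64, 64)
```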

  15. Regulatory and extra-regulatory testing to demonstrate radioactive material packaging safety

    International Nuclear Information System (INIS)

    Ammerman, D.J.

    1997-01-01

    Packages for the transportation of radioactive material must meet performance criteria to assure safety and environmental protection. The stringency of the performance criteria is based on the degree of hazard of the material being transported. Type B packages are used for transporting large quantities of radioisotopes (in terms of A2 quantities). These packages have the most stringent performance criteria. Materials with less than an A2 quantity are transported in Type A packages. These packages have less stringent performance criteria. Transportation of LSA and SCO materials must be in "strong-tight" packages. The performance requirements for the latter packages are even less stringent. All of these package types provide a high level of safety for the material being transported. In this paper, the regulatory tests that are used to demonstrate this safety will be described. The responses of various packages to these tests will be shown. In addition, the response of packages to extra-regulatory tests will be discussed. The results of these tests will be used to demonstrate the high level of safety provided to workers, the public, and the environment by packages used for the transportation of radioactive material

  16. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  17. Impact of Recent Hardware and Software Trends on High Performance Transaction Processing and Analytics

    Science.gov (United States)

    Mohan, C.

    In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.
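
    As a concrete illustration of the MapReduce paradigm mentioned above, the following framework-free Python sketch (a toy example, not tied to Hadoop or any specific engine) counts words by mapping each record to key-value pairs, grouping by key, and reducing each group independently:

```python
# Minimal, framework-free sketch of the MapReduce idea: map each record to
# (key, value) pairs, group by key, then reduce each group independently.
from collections import defaultdict
from itertools import chain

def map_fn(line):
    # Emit (word, 1) for every word in one input record.
    return [(word, 1) for word in line.split()]

def reduce_fn(key, values):
    # Combine all values emitted for one key.
    return key, sum(values)

def map_reduce(records, map_fn, reduce_fn):
    grouped = defaultdict(list)
    for key, value in chain.from_iterable(map(map_fn, records)):
        grouped[key].append(value)
    return dict(reduce_fn(k, vs) for k, vs in grouped.items())

lines = ["large scale data analysis", "data analysis at scale"]
print(map_reduce(lines, map_fn, reduce_fn))
# {'large': 1, 'scale': 2, 'data': 2, 'analysis': 2, 'at': 1}
```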

  18. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  19. Packaging of active fiber composites for improved sensor performance

    International Nuclear Information System (INIS)

    Melnykowycz, M; Barbezat, M; Koller, R; Brunner, A J

    2010-01-01

    Active fiber composites (AFC) composed of lead zirconate titanate (PZT) fibers embedded in an epoxy matrix and sandwiched between two interdigitated electrodes provide a thin and flexible smart material device which can act as a sensor or actuator. The thin profiles of AFC make them ideal for integration in glass or carbon fiber composite laminates. However, due to the low tensile limit of the PZT fibers, AFC can fail at strains below the tensile limit of many composites. This makes their use as a component in an active laminate design somewhat undesirable. In the current work, tensile testing of smart laminates composed of AFC integrated in glass fiber laminates was conducted to assess the effectiveness of different packaging strategies for improving AFC sensor performance at high strains relative to the tensile limit of the AFC. AFC were encased in carbon fiber, silicon, and pre-stressed carbon fiber to improve the tensile limit of the AFC when integrated in glass fiber laminates. By laminating AFC with pre-stressed carbon fiber, the tensile limit and strain sensor ability of the AFC were significantly improved. Acoustic emission monitoring was used and the results show that PZT fiber breakage was reduced due to the pre-stressed packaging process

  20. ggseqlogo: a versatile R package for drawing sequence logos.

    Science.gov (United States)

    Wagih, Omar

    2017-11-15

    Sequence logos have become a crucial visualization method for studying underlying sequence patterns in the genome. Despite this, there remains a scarcity of software packages that provide the versatility often required for such visualizations. ggseqlogo is an R package built on the ggplot2 package that aims to address this issue. ggseqlogo offers native illustration of publication-ready DNA, RNA and protein sequence logos in a highly customizable fashion with features including multi-logo plots, qualitative and quantitative colour schemes, annotation of logos and integration with other plots. The package is intuitive to use and seamlessly integrates into R analysis pipelines. ggseqlogo is released under the GNU licence and is freely available via CRAN-The Comprehensive R Archive Network https://cran.r-project.org/web/packages/ggseqlogo. A detailed tutorial can be found at https://omarwagih.github.io/ggseqlogo. wagih@ebi.ac.uk. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  1. RpeakChrom: Novel R package for the automated characterization and optimization of column efficiency in high-performance liquid chromatography analysis.

    Science.gov (United States)

    Peris-Díaz, Manuel David; Alcoriza-Balaguer, Maria Isabel; García-Cañaveras, Juan Carlos; Santonja, Francisco; Sentandreu, Enrique; Lahoz, Agustín

    2017-11-01

    Characterization of chromatographic columns using the traditional van Deemter method is limited by the necessity of calculating extra-column variance, an issue particularly relevant when modeling asymmetrical peaks eluted from monolithic columns. A novel R package has been developed that implements the Parabolic Variance Modified Gaussian approach for accurate peak modeling, the van Deemter equation, and two alternative approaches based on van Deemter, in order to calculate the height equivalent to a theoretical plate (HETP). To assess the package's capabilities, conventional packed reverse-phase and monolithic HPLC columns were characterized. Peaks eluted from the monolithic column showed a high asymmetry factor due, in part, to the contribution of extra-column factors. Such deviation can be circumvented by the two alternative approaches implemented in the R package. Furthermore, increased values of the eddy diffusion and mass transfer kinetics terms in HETP were observed for the packed column, while accuracy was below 9% in all cases. These results show the usefulness of the R package for both modeling chromatographic peaks and assessing column efficiency. The RpeakChrom package could become a helpful tool for testing new stationary phases during column development and for evaluating a column during its lifetime. This R tool is freely available from CRAN (https://CRAN.R-project.org/package=RpeakChrom). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
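
    For context, the classical van Deemter relation underlying the package's column-efficiency analysis is, in its standard textbook form (the paper's modified variants are not reproduced here):

\[
\mathrm{HETP} = A + \frac{B}{u} + C\,u
\]

    where u is the mobile-phase linear velocity, A is the eddy-diffusion term, B the longitudinal-diffusion term, and C the resistance-to-mass-transfer term; the "eddy diffusion and mass transfer kinetics terms" mentioned above correspond to A and C.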

  2. A highly miniaturized vacuum package for a trapped ion atomic clock

    Energy Technology Data Exchange (ETDEWEB)

    Schwindt, Peter D. D., E-mail: pschwin@sandia.gov; Jau, Yuan-Yu; Partner, Heather; Casias, Adrian; Wagner, Adrian R.; Moorman, Matthew; Manginell, Ronald P. [Sandia National Laboratories, Albuquerque, New Mexico 87185 (United States); Kellogg, James R.; Prestage, John D. [Jet Propulsion Laboratory, Pasadena, California 91109 (United States)

    2016-05-15

    We report on the development of a highly miniaturized vacuum package for use in an atomic clock utilizing trapped ytterbium-171 ions. The vacuum package is approximately 1 cm³ in size and contains a linear quadrupole RF Paul ion trap, miniature neutral Yb sources, and a non-evaporable getter pump. We describe the fabrication process for making the Yb sources and assembling the vacuum package. To prepare the vacuum package for ion trapping, it was evacuated, baked at a high temperature, and then back filled with a helium buffer gas. Once appropriate vacuum conditions were achieved in the package, it was sealed with a copper pinch-off and was subsequently pumped only by the non-evaporable getter. We demonstrated ion trapping in this vacuum package and the operation of an atomic clock, stabilizing a local oscillator to the 12.6 GHz hyperfine transition of ¹⁷¹Yb⁺. The fractional frequency stability of the clock was measured to be 2 × 10⁻¹¹/τ^(1/2).
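
    Written out as an equation (the standard white-frequency-noise scaling, added here only for clarity), the quoted stability corresponds to an Allan deviation of

\[
\sigma_y(\tau) \approx \frac{2 \times 10^{-11}}{\sqrt{\tau}}
\]

    so that averaging for τ = 10⁴ s, for example, would bring the fractional instability down to roughly 2 × 10⁻¹³.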

  3. Performance-oriented packaging: A guide to identifying and designing. Identifying and designing hazardous materials packaging for compliance with post HM-181 DOT Regulations

    International Nuclear Information System (INIS)

    1994-08-01

    With the initial publication of Docket HM-181 (hereafter referred to as HM-181), the U.S. Department of Energy (DOE), Headquarters, Transportation Management Division decided to produce guidance to help the DOE community transition to performance-oriented packagings (POP). As only a few individuals were familiar with the new requirements, elementary guidance was desirable. The decision was to prepare the guidance at a level easily understood by a novice to regulatory requirements. This document identifies design development strategies for use in obtaining performance-oriented packagings that are not readily available commercially. These design development strategies will be part of the methodologies for compliance with post HM-181 U.S. Department of Transportation (DOT) packaging regulations. This information was prepared for use by the DOE and its contractors. The document provides guidance for making decisions associated with designing performance-oriented packaging, and not for identifying specific material or fabrication design details. It does provide some specific design considerations. Having a copy of the regulations handy when reading this document is recommended to permit a fuller understanding of the requirements impacting the design effort. While this document is not written for the packaging specialist, it does contain guidance important to those not familiar with the new POP requirements

  4. Controlatron Neutron Tube Test Suite Software Manual - Operation Manual (V2.2)

    CERN Document Server

    Noel, W P; Hertrich, R J; Martinez, M L; Wallace, D L

    2002-01-01

    The Controlatron Software Suite is a custom-built application to perform automated testing of Controlatron neutron tubes. The software package was designed to allow users to design tests and to run a series of test suites on a tube. The data are output to ASCII files of a pre-defined format for data analysis and viewing with the Controlatron Data Viewer Application. This manual discusses the operation of the Controlatron Test Suite Software and includes a brief discussion of state machine theory, as a state machine is the functional basis of the software.
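
    To illustrate the state-machine idea the manual refers to, here is a minimal, hypothetical Python sketch of a test-sequencing machine; the states, events, and transitions are invented for illustration and are not the Controlatron implementation.

```python
# Hypothetical sketch of the kind of finite state machine that can sequence
# automated tube tests; it is not the actual Controlatron implementation.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    RUNNING = auto()
    LOGGING = auto()
    DONE = auto()

# (state, event) -> next state
TRANSITIONS = {
    (State.IDLE, "start_test"): State.RUNNING,
    (State.RUNNING, "acquire_done"): State.LOGGING,
    (State.LOGGING, "data_written"): State.RUNNING,   # next test in the suite
    (State.RUNNING, "suite_finished"): State.DONE,
}

def step(state, event):
    """Advance the machine by one event, raising on undefined transitions."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"no transition from {state} on {event!r}")

state = State.IDLE
for event in ["start_test", "acquire_done", "data_written", "suite_finished"]:
    state = step(state, event)
print(state)  # State.DONE
```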

  5. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  6. Deriving stellar parameters with the SME software package

    Science.gov (United States)

    Piskunov, N.

    2017-09-01

    Photometry and spectroscopy are complementary tools for deriving accurate stellar parameters. Here I present one of the popular packages for stellar spectroscopy called SME with the emphasis on the latest developments and error assessment for the derived parameters.

  7. Building quality into performance and safety assessment software

    International Nuclear Information System (INIS)

    Wojciechowski, L.C.

    2011-01-01

    Quality assurance is integrated throughout the development lifecycle for performance and safety assessment software. The software used in the performance and safety assessment of a Canadian deep geological repository (DGR) follows the CSA quality assurance standard CSA-N286.7 [1], Quality Assurance of Analytical, Scientific and Design Computer Programs for Nuclear Power Plants. Quality assurance activities in this standard include tasks such as verification and inspection; however, much more is involved in producing a quality software computer program. The types of errors found with different verification methods are described. The integrated quality process ensures that defects are found and corrected as early as possible. (author)

  8. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software package to predict the fatigue lives of mechanical components and structures was developed. This software has several characteristic features, including functions for searching out weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system that can assist users in obtaining more appropriate results. The software can be used in an environment consisting of commercial finite element packages. Using the developed software, fatigue analyses for an SAE keyhole specimen and an automobile knuckle were carried out. It was observed that the results agreed well with those from commercial packages
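
    The "search weak regions first" strategy mentioned above can be illustrated with a small, hypothetical Python sketch: free-surface locations are screened by stress amplitude, and only candidates above a threshold receive a fatigue-life evaluation (here a simple Basquin S-N relation with made-up material constants, not the parameters or method of the software described).

```python
# Hypothetical illustration of the "screen weak surface regions first" idea:
# only nodes whose stress amplitude exceeds a threshold get a full fatigue-life
# evaluation (here a simple Basquin S-N relation); parameters are made up.
import numpy as np

def basquin_life(stress_amp, sigma_f=900.0, b=-0.09):
    """Cycles to failure from Basquin's relation sigma_a = sigma_f * (2N)^b."""
    return 0.5 * (stress_amp / sigma_f) ** (1.0 / b)

def screen_and_evaluate(surface_stress_amp, threshold=200.0):
    """Return {node_index: predicted life} only for candidate weak regions."""
    candidates = np.flatnonzero(surface_stress_amp > threshold)
    return {int(i): float(basquin_life(surface_stress_amp[i])) for i in candidates}

# Example: stress amplitudes (MPa) at free-surface nodes from an FE result.
stress = np.array([120.0, 260.0, 310.0, 180.0, 450.0])
print(screen_and_evaluate(stress))  # lives only for the three highest-stress nodes
```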

  9. RavenDB high performance

    CERN Document Server

    Ritchie, Brian

    2013-01-01

    RavenDB High Performance is a comprehensive yet concise tutorial that developers can use to… This book is for developers and software architects who are designing systems in order to achieve high performance right from the start. A basic understanding of RavenDB is recommended, but not required. While the book focuses on advanced topics, it does not assume that the reader has a great deal of prior knowledge of working with RavenDB.

  10. Design of the Jet Performance Software for the ATLAS Experiment at LHC

    CERN Document Server

    Doglioni, C; The ATLAS collaboration; Loch, P; Perez, K; Vitillo, RA

    2011-01-01

    This paper describes the design and implementation of the JetFramework, a software tool developed for the data analysis of the ATLAS experiment at CERN. JetFramework is based on Athena, an object-oriented framework for data processing. The JetFramework Athena package implements a configurable data-flow graph (DFG) to represent an analysis. Each node of the graph can perform some computation on one or more particle collections given as input. A standard set of nodes to retrieve, filter, sort and plot collections is provided. Users can also implement their own computation units inheriting from a generic interface. The analysis graph can be declared and configured in an Athena options file. To provide the requested flexibility to configure nodes from a configuration file, a simple expression language allows selection and plotting criteria to be specified. Viewing an analysis as an explicit DFG permits end-users to avoid writing code for repetitive tasks and to reuse user-defined computation units in other analyses...
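
    A conceptual sketch of such a data-flow-graph analysis is shown below in Python; the node classes and their interfaces are invented for illustration and are not the actual Athena/JetFramework API.

```python
# Conceptual sketch of a data-flow-graph analysis in the spirit described
# above; the node names and interfaces are hypothetical.
from dataclasses import dataclass

@dataclass
class Jet:
    pt: float   # transverse momentum, GeV
    eta: float  # pseudorapidity

class Node:
    """A computation unit: takes a collection, returns a collection."""
    def run(self, collection):
        raise NotImplementedError

class FilterNode(Node):
    def __init__(self, predicate):
        self.predicate = predicate
    def run(self, collection):
        return [obj for obj in collection if self.predicate(obj)]

class SortNode(Node):
    def __init__(self, key, reverse=True):
        self.key, self.reverse = key, reverse
    def run(self, collection):
        return sorted(collection, key=self.key, reverse=self.reverse)

def run_graph(collection, nodes):
    # A linear chain is the simplest data-flow graph: each node feeds the next.
    for node in nodes:
        collection = node.run(collection)
    return collection

jets = [Jet(35.0, 0.4), Jet(12.0, 2.1), Jet(80.0, -1.0)]
analysis = [FilterNode(lambda j: j.pt > 20.0 and abs(j.eta) < 2.5),
            SortNode(key=lambda j: j.pt)]
print(run_graph(jets, analysis))  # two selected jets, ordered by decreasing pt
```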

  11. EQ3/6 software test and verification report 9/94

    International Nuclear Information System (INIS)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a "V and V" report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARCstation. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three codes depended upon solving an n x n matrix system by the Newton-Raphson method. Thus, a great deal of emphasis in the test and verification of this procedure was placed on the first code in the software package, EQPT
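
    The Newton-Raphson procedure referred to above can be sketched generically (this NumPy example is not the EQ3/6 FORTRAN implementation): the nonlinear residual equations are repeatedly linearized and the resulting n x n linear system is solved for the update.

```python
# Generic multivariate Newton-Raphson sketch (not the EQ3/6 implementation):
# solve f(x) = 0 by repeatedly solving the n x n linear system J(x) dx = -f(x).
import numpy as np

def newton_raphson(f, jacobian, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        residual = f(x)
        if np.linalg.norm(residual) < tol:
            return x
        delta = np.linalg.solve(jacobian(x), -residual)
        x = x + delta
    raise RuntimeError("Newton-Raphson did not converge")

# Toy 2x2 nonlinear system: x^2 + y^2 = 4, x*y = 1.
f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0]*v[1] - 1.0])
J = lambda v: np.array([[2.0*v[0], 2.0*v[1]], [v[1], v[0]]])
print(newton_raphson(f, J, x0=[2.0, 0.5]))  # converges to approx (1.932, 0.518)
```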

  12. EQ3/6 software test and verification report 9/94

    Energy Technology Data Exchange (ETDEWEB)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a "V and V" report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARCstation. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three codes depended upon solving an n x n matrix system by the Newton-Raphson method. Thus, a great deal of emphasis in the test and verification of this procedure was placed on the first code in the software package, EQPT.

  13. International performance-oriented packaging standards adopted in the united states

    International Nuclear Information System (INIS)

    McCall, D.L.

    1993-01-01

    On January 1, 1991, the U.S. Department of Transportation (DOT) initiated a transition to adopting a modified version of current international standards for packaging and transporting hazardous materials and hazardous wastes. This transition permits a 5-year phase-in period that will impact all phases of hazardous material transportation including material classification and description, packaging for shipment, and hazard communication standards. These changes are being enacted through the DOT Federal Docket HM-181, 'Performance-Oriented Packaging Standards.' These regulatory standards will have dramatic impact on nearly 5 billion tons of hazardous materials transported within the United States each year. This paper summarizes the principal elements of the new DOT regulations, the latest implementation schedule and impacts on U.S. shipping activities, and discusses outstanding issues that remain to be solved through the next 5 years. (author)

  14. The IPNS rietveld analysis software package for TOF [time-of-flight] powder diffraction data: Recent developments

    International Nuclear Information System (INIS)

    Rotella, F.J.; Richardson, J.W. Jr.

    1987-01-01

    A system of FORTRAN programs for the analysis of time-of-flight (TOF) neutron powder diffraction data via the Rietveld method at IPNS has been modified recently, making it possible to analyze data that exhibit diffraction maxima broadened due to anisotropic strain and that can be modeled by individual atomic anharmonic thermal vibrations. The observation of noncrystalline scattering in data from some powder samples has led to the development of software to fit such scattering by a function related to a radial distribution function through Fourier-filtering techniques. The "user friendliness" of the IPNS Rietveld package has been enhanced by the development of "RIETVELD," a menu-based VAX/VMS command language routine for interactive file manipulation and program execution

  15. Linear algebra applications using Matlab software

    Directory of Open Access Journals (Sweden)

    Cornelia Victoria Anghel

    2005-10-01

    Full Text Available The paper presents two ways of generating special matrices using functions included in the MatLab software package. The MatLab software package contains a set of functions that generate special matrices used in linear algebra applications and in signal processing in various fields of activity. The paper presents two types of special matrices that can be generated by entering the appropriate syntax in the MatLab command window and pressing Enter to validate the command. The applications presented in the paper are examples of numerical calculus using the MatLab software and belong to the scientific field of "Computer Assisted Mathematics", thus creating a symbiosis between mathematics and informatics.
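
    The paper's examples are written in MatLab; as an analogous (not equivalent) illustration of generating special matrices with single library calls, a Python/SciPy sketch might look like this:

```python
# Analogous Python/SciPy sketch of generating special matrices with one-line
# library calls; this mirrors the idea of the MatLab examples, not their code.
import numpy as np
from scipy.linalg import hilbert, toeplitz, pascal

H = hilbert(4)                 # 4x4 Hilbert matrix, H[i, j] = 1/(i + j + 1)
T = toeplitz([1, 2, 3, 4])     # symmetric Toeplitz matrix from its first column
P = pascal(4)                  # 4x4 Pascal matrix of binomial coefficients
V = np.vander([1, 2, 3], 3)    # Vandermonde matrix from a sample vector

for name, m in [("Hilbert", H), ("Toeplitz", T), ("Pascal", P), ("Vandermonde", V)]:
    print(name, m, sep="\n")
```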

  16. Software Library for Bruker TopSpin NMR Data Files

    Energy Technology Data Exchange (ETDEWEB)

    2016-10-14

    A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency domain and (b) extracted the list of NMR peaks.
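
    The "processing" step described above (transforming the time-domain FID into a frequency-domain spectrum) can be illustrated with a small synthetic Python example; this sketch is independent of the Bruker file format and of the library itself.

```python
# Illustration of the FID -> spectrum processing step using a synthetic signal;
# this is not the Bruker TopSpin library or its file format.
import numpy as np

def fid_to_spectrum(fid, dwell_time):
    """Fourier transform a time-domain FID into a frequency-domain spectrum."""
    spectrum = np.fft.fftshift(np.fft.fft(fid))
    freqs = np.fft.fftshift(np.fft.fftfreq(fid.size, d=dwell_time))
    return freqs, spectrum

# Synthetic FID: two decaying oscillations at 100 Hz and 250 Hz.
dwell = 1.0e-3                           # 1 ms sampling interval
t = np.arange(4096) * dwell
fid = (np.exp(2j * np.pi * 100.0 * t)
       + 0.5 * np.exp(2j * np.pi * 250.0 * t)) * np.exp(-t / 0.5)

freqs, spec = fid_to_spectrum(fid, dwell)
print(freqs[np.argmax(np.abs(spec))])    # ~100 Hz, the stronger of the two peaks
```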

  17. Waste package performance assessment for the Yucca Mountain Project

    International Nuclear Information System (INIS)

    O'Connell, W.J.; Lappa, D.A.; Thatcher, R.M.

    1989-02-01

    We completed a first cycle of model development from a specification to a computer program, PANDORA-1, for long-term performance assessment of waste packages. The model for one waste package at a time incorporates processes specific to the unsaturated environment at the proposed Yucca Mountain, NV, site. PANDORA-1 models the most likely processes and several modes of waste alteration and release. The development identified information needs for future models; many processes, local details, and combinations will have to be examined. Integration of ensemble performance and quantification of uncertainties are modeling steps at higher aggregation. Methodologies for these steps include sampling, which is well studied; we have focused on several open questions. We can now calculate the amount of variance reduction available from Latin hypercube sampling; it is a limited reduction. A new method, controlled sampling, provides substantial variance reduction for a broad range of model functions. An uncertainty analysis test-bed program compares the new with old sampling methods. 7 refs., 1 tab

  18. Software for Library Management: Selection and Evaluation.

    Science.gov (United States)

    Notowitz, Carol I.

    1987-01-01

    This discussion of library software packages includes guidelines for library automation with microcomputers; criteria to aid in software selection; comparison of some features of available acquisitions, circulation and overdues software; references for software reviews; additional information on microsoftware; and a directory of producers and…

  19. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  20. Software and the Scientist: Coding and Citation Practices in Geodynamics

    Science.gov (United States)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed-method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code to be trusted, reusable, readable, and not overly complex, and considered a good coder to be one who participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, as reflected by the high rate at which software packages were named in the literature and the high rate of citations in the references. However, clear instructions from developers on how to cite and education of users on what to cite are lacking. In addition, citations did not always lead to discoverability of the resource. A unique identifier for the software package itself, community education, and citation tools would contribute to better attribution practices.