WorldWideScience

Sample records for high-performance software package

  1. Evaluation of high-performance computing software

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States)]; Rowan, T. [Oak Ridge National Lab., TN (United States)]

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  2. High performance in software development

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever attempted. Technological change creates huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or under other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from distributed storage and large-scale organization of computation and data down to the lowest level of processor and data-bus behavior. Integrating performance behavior across these levels is especially important when the computation is resource-bound, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures, due to the deep memory hierarchies of modern general-purpose computers. As a r...
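
    The point about memory locality and vector processing can be made concrete with a small sketch (generic Python/NumPy, not from the talk): summing a large contiguous array with a vectorized kernel versus an element-by-element interpreter loop.

        import numpy as np
        import time

        a = np.random.rand(10_000_000)       # one large contiguous array

        t0 = time.perf_counter()
        s_loop = 0.0
        for x in a:                          # scalar loop: per-element interpreter overhead
            s_loop += x
        t1 = time.perf_counter()

        s_vec = a.sum()                      # vectorized: contiguous, SIMD-friendly traversal
        t2 = time.perf_counter()

        print(f"loop: {t1 - t0:.2f} s  vectorized: {t2 - t1:.3f} s  "
              f"agree: {np.isclose(s_loop, s_vec)}")

    On typical hardware the vectorized sum is orders of magnitude faster, even though both loops perform the same arithmetic.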

  3. The Ettention software package

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem of “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily, which makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions using iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object-oriented, modular design.
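
    For orientation, the row-action Kaczmarz iteration underlying the block-iterative method can be sketched in a few lines (a textbook Python/NumPy illustration, not Ettention code): each step projects the current estimate onto the hyperplane defined by one row of the system.

        import numpy as np

        def kaczmarz(A, b, sweeps=500, relax=1.0):
            """Solve a consistent system A x = b by cyclic row projections."""
            m, n = A.shape
            x = np.zeros(n)
            row_norms = np.einsum("ij,ij->i", A, A)   # ||a_i||^2 for every row
            for _ in range(sweeps):
                for i in range(m):
                    if row_norms[i] == 0.0:
                        continue
                    resid = b[i] - A[i] @ x           # signed distance to hyperplane i
                    x += relax * (resid / row_norms[i]) * A[i]
            return x

        rng = np.random.default_rng(0)
        A = rng.normal(size=(20, 5))
        x_true = rng.normal(size=5)
        print(np.allclose(kaczmarz(A, A @ x_true), x_true, atol=1e-5))

    In tomography, each "row" corresponds to one projection ray, and block variants process groups of rays together for better parallelism.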

  4. The Ettention software package

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  5. The Ettention software package.

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. MARS software package status

    Azhgirej, I.L.; Talanov, V.V.

    2000-01-01

    The MARS software package is intended for simulating nuclear-electromagnetic cascades and the transport of secondary neutrons and muons in heterogeneous media of arbitrary complexity in the presence of magnetic fields. The package implements an inclusive approach to describing particle production in nuclear and electromagnetic interactions and in the decay of unstable particles. The MARS software package has been actively applied to various radiation physics problems. (Original in Russian)

  5. ORNL's DCAL software package

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode, for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has a unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare the U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)

  6. Implementation of a high performance parallel finite element micromagnetics package

    Scholz, W.; Suess, D.; Dittrich, R.; Schrefl, T.; Tsiantos, V.; Forster, H.; Fidler, J.

    2004-01-01

    A new high performance scalable parallel finite element micromagnetics package has been implemented. It includes solvers for static energy minimization, time integration of the Landau-Lifshitz-Gilbert equation, and the nudged elastic band method
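
    To give a flavor of the second solver, here is a toy time integration of the Landau-Lifshitz-Gilbert (LLG) equation for a single unit moment (generic Python/NumPy, unrelated to the package's parallel finite element implementation; all parameter values are illustrative):

        import numpy as np

        GAMMA = 2.211e5      # gyromagnetic ratio (m A^-1 s^-1)
        ALPHA = 0.1          # Gilbert damping constant (dimensionless)

        def llg_rhs(m, h):
            """LLG right-hand side for a unit moment m in effective field h."""
            prec = np.cross(m, h)          # precession term m x H
            damp = np.cross(m, prec)       # damping term m x (m x H)
            return -GAMMA / (1.0 + ALPHA**2) * (prec + ALPHA * damp)

        m = np.array([1.0, 0.0, 0.0])      # unit moment, initially along x
        h = np.array([0.0, 0.0, 8.0e4])    # static applied field along z (A/m)
        dt = 5.0e-13                       # time step (s)
        for _ in range(50_000):
            m += dt * llg_rhs(m, h)        # naive forward Euler step
            m /= np.linalg.norm(m)         # renormalize to keep |m| = 1
        print(m)                           # damping relaxes the moment toward +z

    A production micromagnetics code replaces the single moment with a finite element discretization and uses far more robust time integrators.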

  7. Nested Cohort - R software package

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.

  8. Packaging of control system software

    Zagar, K.; Kobal, M.; Saje, N.; Zagar, A.; Sabjan, R.; Di Maio, F.; Stepanov, D.

    2012-01-01

    Control system software consists of several parts - the core of the control system, drivers for integration of devices, configuration for user interfaces, the alarm system, etc. Once the software is developed and configured, it must be installed on the computers where it runs. Usually, it is installed on an operating system whose services it needs, and in some cases it dynamically links with the libraries the operating system provides. An operating system can be quite complex itself - for example, a typical Linux distribution consists of several thousand packages. To manage this complexity, we have decided to rely on the Red Hat Package Manager (RPM) to package control system software and to ensure it is properly installed (i.e., that dependencies are also installed, and that scripts are run after installation if any additional actions need to be performed). As dozens of RPM packages need to be prepared, we are reducing the amount of effort and improving consistency between packages through a Maven-based infrastructure that assists in packaging (e.g., automated generation of RPM SPEC files, including automated identification of dependencies). So far, we have used it to package EPICS, Control System Studio (CSS) and several device drivers. We perform extensive testing on Red Hat Enterprise Linux 5.5, but we have also verified that packaging works on CentOS and Scientific Linux. In this article, we describe in greater detail the packaging system we are using, and its particular application for the ITER CODAC Core System. (authors)
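
    To make the RPM approach concrete, the following sketch (hypothetical, not the ITER CODAC tooling; package names and install paths are invented examples) shows the kind of minimal SPEC skeleton such a Maven-based infrastructure might generate automatically:

        # Hypothetical generator for a minimal RPM SPEC skeleton; all values are examples.
        SPEC_TEMPLATE = """\
        Name:           {name}
        Version:        {version}
        Release:        1%{{?dist}}
        Summary:        {summary}
        License:        GPLv2
        Requires:       {requires}

        %description
        {summary}

        %files
        /opt/codac/{name}
        """

        def make_spec(name, version, summary, requires):
            # A real generator would also emit Source, %build, %install, scriptlets, etc.
            return SPEC_TEMPLATE.format(name=name, version=version,
                                        summary=summary, requires=", ".join(requires))

        print(make_spec("css-core", "3.1.0", "Control System Studio core",
                        ["epics-base", "java-1.8.0-openjdk"]))

    Generating SPEC files from a single build description keeps dependency lists and install paths consistent across dozens of packages.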

  9. The CASA Software Package

    Petry, Dirk

    2018-03-01

    CASA is the standard science data analysis package for ALMA and VLA but it can also be used for the analysis of data from other observatories. In this talk, I will give an overview of the structure and features of CASA, who develops it, and the present status and plans, and then show typical analysis workflows for ALMA data with special emphasis on the handling of single dish data and its combination with interferometric data.

  10. High Performance Computing Software Applications for Space Situational Awareness

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative deconvolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  11. PIV Data Validation Software Package

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities: (1) removal of spurious vector data; (2) filtering, smoothing, and interpolation of PIV data; and (3) calculation of out-of-plane vorticity, ensemble statistics, and turbulence statistics. The software runs on an IBM PC/AT host computer under either the Microsoft Windows 3.1 or Windows 95 operating system.
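
    As an illustration of capability (3), out-of-plane vorticity can be obtained from a planar velocity field by central differences; the following generic Python/NumPy sketch (not the package's code) checks the result on a solid-body rotation field:

        import numpy as np

        def vorticity_z(u, v, dx, dy):
            """Out-of-plane vorticity w_z = dv/dx - du/dy on a regular grid."""
            dvdx = np.gradient(v, dx, axis=1)   # second-order central differences
            dudy = np.gradient(u, dy, axis=0)
            return dvdx - dudy

        # Solid-body rotation u = -y, v = x has constant vorticity 2.
        y, x = np.mgrid[-1:1:64j, -1:1:64j]
        dx = dy = 2.0 / 63
        print(np.allclose(vorticity_z(-y, x, dx, dy), 2.0))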

  12. Development of high performance scientific components for interoperability of computing packages

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each of these packages. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component-Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  13. Conference on High Performance Software for Nonlinear Optimization

    Murli, Almerico; Pardalos, Panos; Toraldo, Gerardo

    1998-01-01

    This book contains a selection of papers presented at the conference on High Performance Software for Nonlinear Optimization (HPSNO97), which was held in Ischia, Italy, in June 1997. The rapid progress of computer technologies, including new parallel architectures, has stimulated a large amount of research devoted to building software environments and defining algorithms able to fully exploit this new computational power. In some sense, numerical analysis has to conform itself to the new tools. The impact of parallel computing in nonlinear optimization, which had a slow start at the beginning, seems now to increase at a fast rate, and it is reasonable to expect an even greater acceleration in the future. As with the first HPSNO conference, the goal of the HPSNO97 conference was to supply a broad overview of the more recent developments and trends in nonlinear optimization, emphasizing the algorithmic and high performance software aspects. Bringing together new computational methodologies with theoretical...

  14. Component-based software for high-performance scientific computing

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grassroots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  15. Software Systems for High-performance Quantum Computing

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems, as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  16. Decal Electronics: Printable Packaged with 3D Printing High-Performance Flexible CMOS Electronic Systems

    Sevilla, Galo T.; Cordero, Marlon D.; Nassar, Joanna M.; Hanna, Amir; Kutbee, Arwa T.; Carreno, Armando Arpys Arevalo; Hussain, Muhammad Mustafa

    2016-01-01

    High-performance complementary metal oxide semiconductor electronics are flexed, packaged using 3D printing as decal electronics, and then printed in roll-to-roll fashion for highly manufacturable printed flexible high-performance electronic systems.

  1. Software design practice using two SCADA software packages

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adaptation of standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. It is planned to investigate whether experience gained from process control carries over to discrete parts manufacturing.

  2. The Future of Software Engineering for High Performance Computing

    Pope, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2015-07-16

    DOE ASCR requested that from May through mid-July 2015 a study group identify issues and recommend solutions, from a software engineering perspective, for transitioning into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write-up done as if the author were a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest scoring topic areas were software engineering and testing resources; the lowest scoring area was compliance with DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one-liner has also been added to each topic to allow future risk tracking and mitigation.

  3. Package-based software development

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  4. Intercomparison of gamma ray analysis software packages

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectrum analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed, and presents the results obtained. Only direct results are given, without any recommendation for a particular software package or method for gamma ray spectrum analysis.

  5. Intercomparison of alpha particle spectrometry software packages

    1999-08-01

    Software has reached an important level as the 'logical controller' at different levels, from a single instrument to an entire computer-controlled experiment. This is also the case for software packages in nuclear instruments and experiments. In particular, because of the range of applications of alpha-particle spectrometry, software packages in this field are widely used. The aim of this intercomparison is to test and describe the abilities of four such software packages. The main objectives of the intercomparison were the ability of the programs to determine the peak areas and the peak area uncertainties, and the statistical control and stability of reported results. In this report, the task, methods and results of the intercomparison are presented in order to assist the potential users of such software and to stimulate the development of even better alpha-particle spectrum analysis software.
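
    To make the compared quantities concrete, here is a minimal sketch of a net peak area and its counting uncertainty under a flat-background assumption (generic Python, not one of the four packages under test):

        import numpy as np

        def net_peak_area(counts, lo, hi, bg_per_channel):
            """Net area of channels [lo, hi) above a flat background, with uncertainty."""
            gross = counts[lo:hi].sum()
            bg = (hi - lo) * bg_per_channel
            net = gross - bg
            sigma = np.sqrt(gross + bg)   # Poisson variances of gross and background add
            return net, sigma

        rng = np.random.default_rng(1)
        spectrum = rng.poisson(5.0, size=1024)   # flat toy background spectrum
        spectrum[500:510] += 200                 # synthetic peak, ~2000 net counts
        print(net_peak_area(spectrum, 495, 515, bg_per_channel=5.0))

    Real alpha-spectrometry codes must additionally model tailed peak shapes and overlapping multiplets, which is where the packages under test differ most.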

  6. Intercomparison of PIXE spectrometry software packages

    2003-02-01

    During the year 2000, an exercise was organized to make an intercomparison of widely available software packages for the analysis of particle induced X ray emission (PIXE) spectra. This TECDOC describes the method used in this intercomparison exercise and presents the results obtained. It also gives a general overview of the participating software packages, including basic information on their user interfaces, graphical presentation capabilities, physical phenomena taken into account, ways of presenting results, etc. No recommendation for a particular software package or method for spectrum analysis is given. It is intended that the readers reach their own conclusions and make their own choices, according to their specific needs. This TECDOC will be useful to anyone involved in PIXE spectrum analysis. It includes a companion CD with the complete set of test spectra used for the intercomparison; these test spectra can be used to test any PIXE spectrum analysis software package.

  7. Validation of SCALE code package on high performance neutron shields

    Bace, M.; Jecmenica, R.; Smuc, T.

    1999-01-01

    The shielding ability and other properties of new high performance neutron shielding materials from the KRAFTON series have recently been published. The published experimental and MCNP results for two materials of the KRAFTON series have been compared with our own calculations. Two control modules of the SCALE-4.4 code system have been used, one based on one-dimensional radiation transport analysis (SAS1) and the other based on the three-dimensional Monte Carlo method (SAS3). The comparison of the calculated neutron dose equivalent rates shows good agreement between experimental and calculated results for the KRAFTON-N2 material. Our results indicate that the N2-M-N2 sandwich type is approximately 10% inferior as a neutron shield to the KRAFTON-N2 material. All neutron dose equivalent values obtained with SAS1 are approximately 25% lower than the SAS3 results, which indicates the size of the discrepancies introduced by the one-dimensional geometry approximation. (author)

  8. Introduction to Software Packages. [Final Report.

    Frankel, Sheila, Ed.; And Others

    This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…

  9. Software packages for food engineering needs

    Abakarov, Alik

    2011-01-01

    The graphical user interface (GUI) software packages “ANNEKs” and “OPT-PROx” are developed to meet food engineering needs. “OPT-PROx” (OPTimal PROfile) is software developed to carry out thermal food processing optimization based on variable retort temperature processing and a global optimization technique. “ANNEKs” (Artificial Neural Network Enzyme Kinetics) is software designed for determining the kinetics of enzyme hydrolysis of protein at different initial reaction parameters based on the...

  10. Software on the Peregrine System | High-Performance Computing | NREL

    NREL maintains a variety of application environment modules for use on Peregrine. Lists are provided of software applications, by name and research area/discipline, and of software libraries available for linking and loading.

  11. Software package as an information center product

    Butler, M.K.

    1977-01-01

    The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, and to document the Center's efforts to meet these needs and its ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems are reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables

  12. Software Package STATISTICA and Educational Process

    Demidova Liliya

    2016-01-01

    The paper describes the main aspects of applying the software package STATISTICA in the educational process. Data mining technologies that can be useful for students' research are considered, and the main tools of these technologies are discussed.

  13. Consys Linear Control System Design Software Package

    Diamantidis, Z.

    1987-01-01

    This package was created to help engineers, researchers, students and all who work on linear control systems. The software includes all time and frequency domain analyses, spectral analyses and networks, as well as active filter and regulator design aids. The programs are written on a Hewlett-Packard computer in BASIC 4.0.

  14. SPADE - software package to aid diffraction experiments

    Farren, J.; Giltrap, J.W.

    1978-10-01

    A software package is described which enables the DEC PDP-11/03 microcomputer to execute several different X-ray diffraction experiments and other similar experiments where stepper motors are driven and data is gathered and processed in real time. (author)

  15. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Software tools to develop and manage software at the source code level are available on Peregrine. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  16. Automated packaging platform for low-cost high-performance optical components manufacturing

    Ku, Robert T.

    2004-05-01

    Delivering high performance integrated optical components at low cost is critical to the continuing recovery and growth of the optical communications industry. In today's market, network equipment vendors need to provide their customers with new solutions that reduce operating expenses and enable new revenue-generating IP services. They must depend on the availability of highly integrated optical modules exhibiting high performance, small package size, low power consumption, and most importantly, low cost. The cost of typical optical system hardware is dominated by linecards that are in turn cost-dominated by transmitters and receivers or transceivers and transponders. Cost-effective packaging of optical components in these small modules is becoming the biggest challenge to be addressed. For many traditional component suppliers in our industry, the combination of small size, high performance, and low cost appears to be in conflict and not feasible with conventional product design concepts and labor-intensive manual assembly and test. With the advent of photonic integration, there are a variety of materials, optics, substrates, active/passive devices, and mechanical/RF piece parts to manage in manufacturing to achieve high performance at low cost. The use of automation has been demonstrated to surpass manual operation in cost (even with very low labor cost) as well as in product uniformity and quality. In this paper, we discuss the value of using an automated packaging platform for the assembly and test of high performance active components, such as 2.5 Gb/s and 10 Gb/s sources and receivers. Low cost, high performance manufacturing can best be achieved by leveraging a flexible packaging platform to address a multitude of laser and detector devices, integrate electronics, and handle various package bodies and fiber configurations. This paper describes the operation and results of working robotic assemblers in the manufacture of a Laser Optical Subassembly

  17. Human-machine interface software package

    Liu, D.K.; Zhang, C.Z.

    1992-01-01

    The Man-Machine Interface Software Package (MMISP) is designed to configure the console software of the PLS 60 MeV LINAC control system. The control system of the PLS 60 MeV LINAC is a distributed control system which includes the main computer (Intel 310), four local stations, and two sets of industrial-level console computers. The MMISP provides the operator with a display page editor, various I/O configurations such as digital signal input/output, analog signal input/output, and waveform TV graphic display, and it interacts with the operator through graphic picture displays, voice explanation, and a touch panel. This paper describes its function and application. (author)

  18. THE SOFTWARE PACKAGE FOR DATA STREAM SCRAMBLING

    P. A. Kadiev

    2016-01-01

    A software package is proposed for multivariate stepwise transformation of a text stream, in order to increase its resistance to unauthorized access, together with a package to restore the converted text. The proposal is based on forming an n×n array from the elements of the data stream, a preliminary transposition of the array elements such that each row and each column of the resulting array contains one and only one element from each row and each column of the source array, and a subsequent read-out according to options selected by the user. The package for direct conversion includes: a module for forming an array from the input stream; a transposition module that permutes array elements according to the scheme of Latin squares; and a read-out module for rows or columns of the array using one of the following algorithms: sequential reading; reading rows or columns with even indices and then odd ones; reading rows or columns with odd indices and then even ones; reading along a random route generated by the program; or reading along a route determined by the user. The package for restoring the original message by the inverse transform comprises: a module forming the channel array from the data stream; a module recovering the Latin-square-type array from the channel array; a module recovering the original array; and a module restoring the original message.
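
    A minimal sketch of the direct and inverse transforms (generic Python; it assumes a simple cyclic-shift Latin-square transposition and the sequential column-wise read-out, whereas the package offers several read-out routes):

        import numpy as np

        def scramble(text, n):
            """Pack text into an n x n array, rotate rows, read out column-wise."""
            data = list(text.ljust(n * n))           # pad to fill the array
            grid = np.array(data).reshape(n, n)
            # Rotating row i left by i positions gives each column one element
            # from every row and every column of the source (a Latin-square pattern).
            mixed = np.array([np.roll(grid[i], -i) for i in range(n)])
            return "".join(mixed.T.flatten())        # sequential column-wise read-out

        def unscramble(blob, n):
            """Invert the column-wise read-out, then undo the row rotations."""
            mixed = np.array(list(blob)).reshape(n, n).T
            grid = np.array([np.roll(mixed[i], i) for i in range(n)])
            return "".join(grid.flatten()).rstrip()

        msg = "ATTACK AT DAWN"
        assert unscramble(scramble(msg, 4), 4) == msg

    Note that a pure transposition scheme hides structure but not symbol frequencies; the package's user-selected read-out routes act as the key material.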

  19. Browndye: A software package for Brownian dynamics

    Huber, Gary A.; McCammon, J. Andrew

    2010-11-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. Program summary: Program title: Browndye; Catalogue identifier: AEGT_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGT_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: MIT license, included in distribution; No. of lines in distributed program, including test data, etc.: 143 618; No. of bytes in distributed program, including test data, etc.: 1 067 861; Distribution format: tar.gz; Programming language: C++, OCaml (http://caml.inria.fr/); Computer: PC, workstation, cluster; Operating system: Linux; Has the code been vectorised or parallelized?: Yes, runs on multiple processors with shared memory using pthreads; RAM: depends linearly on size of physical system; Classification: 3; External routines: uses the output of APBS [1] (http://www.poissonboltzmann.org/apbs/) as input (APBS must be obtained and installed separately); Expat 2.0.1, CLAPACK, ocaml-expat, and Mersenne Twister are included in the Browndye distribution. Nature of problem: exploration and determination of rate constants of bimolecular interactions involving large biological molecules. Solution method: Brownian dynamics with electrostatic, excluded volume, van der Waals, and desolvation forces. Running time: depends linearly on size of physical system and quadratically on precision of results; the included example executes in a few minutes.
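
    For orientation, the propagation scheme underlying such simulations, an overdamped Brownian dynamics step, can be sketched in Python/NumPy (a generic single-particle toy, not Browndye's implementation; all constants are illustrative):

        import numpy as np

        KBT = 4.11e-21      # thermal energy at ~298 K (J)
        D = 1.0e-10         # translational diffusion coefficient (m^2/s)
        DT = 1.0e-12        # time step (s)
        rng = np.random.default_rng(7)

        def bd_step(pos, force):
            """One overdamped Brownian step: drift (Einstein relation) plus random kick."""
            drift = (D / KBT) * force * DT                      # mobility = D / kT
            noise = rng.normal(0.0, np.sqrt(2.0 * D * DT), size=3)
            return pos + drift + noise

        pos = np.zeros(3)
        force = np.array([1.0e-10, 0.0, 0.0])    # constant external force (N), toy value
        for _ in range(1000):
            pos = bd_step(pos, force)
        print(pos)   # ~2.4 nm mean drift along +x with ~0.5 nm diffusive spread per axis

    A production code like Browndye adds position-dependent forces, hydrodynamics-aware diffusion tensors, and reaction criteria on top of this basic step.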

  20. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    Campbell, Michael T. [Illinois Rocstar LLC, Champaign, IL (United States)]; Safdari, Masoud [Illinois Rocstar LLC, Champaign, IL (United States)]; Kress, Jessica E. [Illinois Rocstar LLC, Champaign, IL (United States)]; Anderson, Michael J. [Illinois Rocstar LLC, Champaign, IL (United States)]; Horvath, Samantha [Illinois Rocstar LLC, Champaign, IL (United States)]; Brandyberry, Mark D. [Illinois Rocstar LLC, Champaign, IL (United States)]; Kim, Woohyun [Illinois Rocstar LLC, Champaign, IL (United States)]; Sarwal, Neil [Illinois Rocstar LLC, Champaign, IL (United States)]; Weisberg, Brian [Illinois Rocstar LLC, Champaign, IL (United States)]

    2016-10-15

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and places few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining the use and architecture of IMPACT are also attached as appendices to this document. Online HTML documentation is available through the GitHub site.

  1. Software Package for the Technical Support Centre

    Tomisa, T.; Skanata, D.; Sucic, B.

    2002-01-01

    The continued radiological surveillance system has been technically improved during the last two years by establishing 11 new automatic stations, so that there are currently 14 locations with installed gamma monitors for air radiation monitoring on the Croatian national territory. Given that the original system had been designed primarily for gathering data for off-line treatment for the purpose of statistical analyses, the contemporary Radiological Early Warning System (SPRU) approach demanded the development of new software by the Technical Support Centre (TPC) to allow operators to work interactively in emergency situations. The outcome of this development is a software package called DORAP (Automatic Radiological Station Remote Reading), which brings together automatic functions of continual data gathering, daily production of the standard report, distribution of the report by fax, SMS (Short Message Service), SMTP (Simple Mail Transfer Protocol) and FTP (File Transfer Protocol), as well as generation and distribution of alarms in the case of a failure in the system or exceeding of the set radiation intensity values. (author)

  2. Accuracy of Giovanni and Marksim Software Packages for ...

    Accuracy of Giovanni and Marksim Software Packages for Generating Daily Rainfall Data in ... using Giovanni software over Marksim, for areas receiving bimodal rainfall regimes similar to ...

  3. Adoption of open source digital library software packages: a survey

    Jose, Sanjo

    2007-01-01

    Open source digital library packages are gaining popularity nowadays. To build a digital library under economical conditions, open source software is preferable. This paper tries to identify the extent of adoption of open source digital library software packages in various organizations through an online survey, and presents the findings of the survey.

  4. The SAVI Vulnerability Analysis Software Package

    Mc Aniff, R.J.; Paulus, W.K.; Key, B.; Simpkins, B.

    1987-01-01

    SAVI (Systematic Analysis of Vulnerability to Intrusion) is a new PC-based software package for modeling Physical Protection Systems (PPS). SAVI utilizes a path analysis approach based on the Adversary Sequence Diagram (ASD) methodology. A highly interactive interface allows the user to accurately model complex facilities, maintain a library of these models on disk, and calculate the most vulnerable paths through any facility. Recommendations are provided to help the user choose facility upgrades which should reduce identified path vulnerabilities. Pop-up windows throughout SAVI are used for the input and display of information. A menu at the top of the screen presents all options to the user. These options are further explained on a message line directly below the menu. A diagram on the screen graphically represents the current protection system model. All input is checked for errors, and data are presented in a logical and clear manner. Print utilities provide the user with hard copies of all information and calculated results

  5. Software refactoring at the package level using clustering techniques

    Alkhalid, A.

    2011-01-01

    Enhancing, modifying or adapting software to new requirements increases its internal complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort. However, software refactoring becomes quite a challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach provides computer-aided support for identifying ill-structured packages and offers suggestions to the software designer for balancing intra-package cohesion against inter-package coupling. A comparative study is conducted applying three different clustering techniques to different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared the A-KNN technique with the other clustering techniques (viz. the single linkage algorithm, the complete linkage algorithm and the weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.

  6. International Inventory of Software Packages in the Information Field.

    Keren, Carl, Ed.; Sered, Irina, Ed.

    Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…

  7. Comparison of PV system design software packages for urban applications

    Gharakhani Siraki, Arbi; Pillay, Pragasen

    2010-09-15

    A large number of software packages are available for solar resource evaluation and PV system design. However, few of them are suitable for urban applications. In this paper a comparison has been made between two specifically designed solar tools, Ecotect 2010 and PVsyst 5.05. Conclusions are drawn on the proper use of these packages based on their specifications and strengths. Moreover, the calculations have been repeated with the HOMER software package (which is a generic tool) for the same location. The results suggest that a generic solar software tool should not be used for an urban application.

  8. PCG: A software package for the iterative solution of linear systems on scalar, vector and parallel computers

    Joubert, W. [Los Alamos National Lab., NM (United States)]; Carey, G.F. [Univ. of Texas, Austin, TX (United States)]

    1994-12-31

    A great need exists for high performance numerical software libraries transportable across parallel machines. This talk concerns the PCG package, which solves systems of linear equations by iterative methods on parallel computers. The features of the package are discussed, as well as the techniques used to obtain both high performance and transportability across architectures. Representative numerical results are presented for several machines including the Connection Machine CM-5, Intel Paragon and Cray T3D parallel computers.
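
    As a reference point for the class of methods such a library implements, here is a textbook conjugate gradient solver for symmetric positive definite systems in Python/NumPy (an illustration of one representative iterative method, not PCG's optimized parallel kernels):

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
            """Textbook CG for a symmetric positive definite matrix A."""
            x = np.zeros_like(b)
            r = b - A @ x                    # residual
            p = r.copy()                     # search direction
            rs = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs / (p @ Ap)        # step length along p
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p    # next A-conjugate direction
                rs = rs_new
            return x

        # SPD test system: A = M^T M + I is symmetric positive definite.
        rng = np.random.default_rng(2)
        M = rng.normal(size=(50, 50))
        A = M.T @ M + np.eye(50)
        b = rng.normal(size=50)
        print(np.allclose(A @ conjugate_gradient(A, b), b))

    Parallel libraries distribute the matrix-vector product and the dot products across processors, which is where the transportability challenges arise.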

  9. An Assessment of the Library Application Software Packages in ...

    The study examined the adopted software's security, compatibility/capabilities, ... The study found that most application packages available in the Nigerian automation marketplace are effective since they ...

  10. The experimental modification of a computer software package for ...

    No abstract available. South African Journal of Education Vol. 25(2) 2005: 61-68.

  11. Software package for analysis of completely randomized block design

    This study designs and develops a statistical software package, OYSP 1.0, which conveniently accommodates and analyzes large masses of data emanating from experimental designs, in particular the completely randomized block design. Visual Basic programming is used in the design. The statistical package OYSP 1.0 ...

  12. High-performance packaging for monolithic microwave and millimeter-wave integrated circuits

    Shalkhauser, K. A.; Li, K.; Shih, Y. C.

    1992-01-01

    Packaging schemes are developed that provide low-loss, hermetic enclosure for enhanced monolithic microwave and millimeter-wave integrated circuits. These package schemes are based on a fused quartz substrate material offering improved RF performance through 44 GHz. The small size and weight of the packages make them useful for a number of applications, including phased array antenna systems. As part of the packaging effort, a test fixture was developed to interface the single chip packages to conventional laboratory instrumentation for characterization of the packaged devices.

  13. A Characteristics Approach to the Evaluation of Economics Software Packages.

    Lumsden, Keith; Scott, Alex

    1988-01-01

    Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluating technique by appraising the much used software package "Running the British Economy." (KO)

  14. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools have traditionally been developed in sequential mode, with codes optimized for single-core computing only. However, the increasing complexity of power grid models requires more intensive computation, and traditional simulation tools will soon be unable to meet grid operation requirements; they therefore need to evolve to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large state estimation problems within one second and achieves a near-linear speedup of 9,800 with 10,000 cores for the contingency analysis application. A performance evaluation is presented to show its effectiveness.

  15. PIV/HPIV Film Analysis Software Package

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems: (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron i860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution to update various system parameters.
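
    The core computation, the 2-D spatial autocorrelation of a film subregion, can be sketched with FFTs via the Wiener-Khinchin relation (generic Python/NumPy, not the original PC/AT code):

        import numpy as np

        def autocorrelation_2d(window):
            """2-D spatial autocorrelation via the FFT (Wiener-Khinchin:
            autocorrelation equals the inverse FFT of the power spectrum)."""
            w = window - window.mean()            # remove mean intensity
            spectrum = np.fft.fft2(w)
            power = spectrum * np.conj(spectrum)
            corr = np.fft.ifft2(power).real
            return np.fft.fftshift(corr)          # move zero lag to the center

        # For double-exposure PIV film, the brightest off-center correlation
        # peak gives the particle displacement within the subregion.
        img = np.random.default_rng(3).random((64, 64))
        corr = autocorrelation_2d(img[:32, :32])
        peak = np.unravel_index(corr.argmax(), corr.shape)
        print(peak)                               # (16, 16): the zero-lag peak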

  16. A Lightweight, High-performance I/O Management Package for Data-intensive Computing

    Wang, Jun

    2011-06-22

    Our group has been working with ANL collaborators on the topic of bridging the gap between parallel file systems and local file systems during the course of this project period. We visited Argonne National Lab (Dr. Robert Ross's group) for one week in the summer of 2007, reviewed our project progress, and planned the activities for the coming years 2008-09. The PI met Dr. Robert Ross several times, such as at the HEC FSIO workshop 08, SC08 and SC10. We explored the opportunities to develop a production system by leveraging our current prototype (SOGP+PVFS) into a new PVFS version. We delivered the SOGP+PVFS codes to the ANL PVFS2 group in 2008. We also discussed a potential project on developing new parallel programming models and runtime systems for data-intensive scalable computing (DISC). The methodology is to evolve MPI towards DISC by incorporating some functions of the Google MapReduce parallel programming model. More recently, we have been jointly exploring how to leverage existing work to perform (1) coordination/aggregation of local I/O operations prior to movement over the WAN, (2) efficient bulk data movement over the WAN, and (3) latency-hiding techniques for latency-intensive operations. Since 2009, we have been applying Hadoop/MapReduce to some HEC applications with LANL scientists John Bent and Salman Habib. Another ongoing work is to improve checkpoint performance at the I/O forwarding layer for the Roadrunner supercomputer with James Nunez and Gary Grider at LANL. Two senior undergraduates from our research group did summer internships on high-performance file and storage system projects at LANL for three consecutive years starting in 2008. Both of them are now pursuing Ph.D. degrees in our group, will be in the fourth year of the Ph.D. program in Fall 2011, and will go to LANL to advance the two above-mentioned works during this winter break. Since 2009, we have been collaborating with several computer scientists (Gary Grider, John Bent, Parks Fields, James Nunez, Hsing

  17. GPS Software Packages Deliver Positioning Solutions

    2010-01-01

    "To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."

  18. Software Package for Optics Measurement and Correction in the LHC

    Aiba, M; Tomas, R; Vanbavinckhove, G

    2010-01-01

    A software package has been developed for the LHC on-line optics measurement and correction. This package includes several different algorithms to measure phase advance, beta functions, dispersion, coupling parameters and even some non-linear terms. A Graphical User Interface provides visualization tools to compare measurements to model predictions, fit analytical formula, localize error sources and compute and send corrections to the hardware.

  19. Comparison of four software packages applied to a scattering problem

    Albertsen, Niels Christian; Chesneaux, Jean-Marie; Christiansen, Søren

    1999-01-01

    We investigate characteristic features of four different software packages by applying them to the numerical solution of a non-trivial physical problem in computer simulation, viz., scattering of waves from a sinusoidal boundary. The numerical method used is based on boundary collocation...

  20. Automated load balancing in the ATLAS high-performance storage software

    Le Goff, Fabrice; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment collects proton-proton collision events delivered by the LHC accelerator at CERN. The ATLAS Trigger and Data Acquisition (TDAQ) system selects, transports and eventually records event data from the detector at several gigabytes per second. The data are recorded on transient storage before being delivered to permanent storage. The transient storage consists of high-performance direct-attached storage servers accounting for about 500 hard drives. The transient storage operates dedicated software in the form of a distributed multi-threaded application. The workload includes both CPU-demanding and IO-oriented tasks. This paper presents the original application threading model for this particular workload, discussing the load-sharing strategy among the available CPU cores. The limitations of this strategy were reached in 2016 due to changes in the trigger configuration involving a new data distribution pattern. We then describe a novel data-driven load-sharing strategy, designed to automatical...

  1. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Brightwell, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2017-05-01

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component to large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find that single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.
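
    The paper's tooling (libvirt and QEMU) can be driven programmatically; the sketch below shows the general shape of provisioning a VM through the libvirt Python bindings, assuming a qemu:///system connection and a prepared domain XML file. The XML path and domain settings are placeholders, not the authors' actual configuration.

    ```python
    import libvirt  # libvirt-python bindings

    # Connect to the local QEMU/KVM hypervisor (assumes libvirtd is running).
    conn = libvirt.open("qemu:///system")
    if conn is None:
        raise RuntimeError("failed to connect to qemu:///system")

    # Hypothetical domain XML describing a compute-node VM.
    with open("compute_vm.xml") as f:
        domain_xml = f.read()

    # Define the persistent domain and start it; createXML would instead
    # start a transient domain that disappears on shutdown.
    domain = conn.defineXML(domain_xml)
    domain.create()

    print(f"started VM: {domain.name()}, state: {domain.state()}")
    conn.close()
    ```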

  2. Impact of Recent Hardware and Software Trends on High Performance Transaction Processing and Analytics

    Mohan, C.

    In this paper, I briefly survey some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra-large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, system architects and designers are slowly beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.

  3. Description of the IV + V System Software Package.

    Microcomputers for Information Management: An International Journal for Library and Information Services, 1984

    1984-01-01

    Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principal program features and functions outlined include input/output, databank, text image, output, and…

  4. A software package for biomedical image processing and analysis

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been in use for more than a year and a half by users with different applications. It proved to be an excellent tool for helping users adapt to the system and for standardizing and exchanging software, while preserving the flexibility needed for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail

  5. A User-Friendly Software Package for HIFU Simulation

    Soneson, Joshua E.

    2009-04-01

    A freely-distributed, MATLAB (The Mathworks, Inc., Natick, MA)-based software package for simulating axisymmetric high-intensity focused ultrasound (HIFU) beams and their heating effects is discussed. The package (HIFU_Simulator) consists of a propagation module which solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and a heating module which solves Pennes' bioheat transfer (BHT) equation. The pressure, intensity, heating rate, temperature, and thermal dose fields are computed and plotted, and the output is released to the MATLAB workspace for further user analysis or postprocessing.
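
    For reference, the bioheat equation solved by the heating module is commonly written as below. This is the standard textbook form of Pennes' BHT, not a statement of HIFU_Simulator's exact discretization.

    ```latex
    % Pennes' bioheat transfer equation (standard form):
    % rho, C: tissue density and specific heat; kappa: thermal conductivity;
    % w_b, C_b: blood perfusion rate and blood specific heat;
    % T_a: arterial temperature; Q: heat deposition from the acoustic field.
    \rho C \frac{\partial T}{\partial t}
      = \kappa \nabla^{2} T
      - w_b C_b \left( T - T_a \right)
      + Q
    ```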

  6. Multi-platform Automated Software Building and Packaging

    Rodriguez, A Abad; Gomes Gouveia, V E; Meneses, D; Capannini, F; Aimar, A; Di Meglio, A

    2012-01-01

    One of the major goals of the EMI (European Middleware Initiative) project is the integration of several components of the pre-existing middleware (ARC, gLite, UNICORE and dCache) into a single consistent set of packages with uniform distributions and repositories. Those individual middleware projects have been developed over the last decade by tens of development teams and, before EMI, were all built and tested using different tools and dedicated services. The software, millions of lines of code, is written in several programming languages and supports multiple platforms. Therefore a viable solution ought to be able to build and test applications in multiple programming languages using common dependencies on all selected platforms. It should, in addition, package the resulting software in formats compatible with the popular Linux distributions, such as Fedora and Debian, and store them in repositories from which all EMI software can be accessed and installed in a uniform way. Despite this highly heterogeneous initial situation, a single common solution, with the aim of quickly automating the integration of the middleware products, had to be selected and implemented within a few months of the beginning of the EMI project. Because of the previous knowledge and the short time available in which to provide this common solution, the ETICS service, where the gLite middleware had already been built for years, was selected. This contribution describes how the team in charge of providing a common EMI build and packaging infrastructure to the whole project has developed a homogeneous solution for releasing and packaging the EMI components from the initial set of tools used by the earlier middleware projects. An important element of the presentation is the developers' experience and feedback on converging on ETICS, and the ongoing work to add more widely used and supported build and packaging solutions from the Linux platforms

  7. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    V. T. Kalugin

    2015-01-01

    In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested with respect to a given configuration of the aircraft and airbrake, and the results were compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  8. Development of a software system for spatial resolved trace analysis of high performance materials with SIMS

    Brunner, Ch. H.

    1997-09-01

    The following work is separated into two distinct parts. The first deals with the SIMSScan software project, an application system for secondary ion mass spectrometry. This application system primarily lays down the foundation for the research activity introduced in the second part of this work. SIMSScan is an application system designed to provide data acquisition routines for different requirements in the field of secondary ion mass spectrometry. The whole application package is divided into three major sections, each one dealing with specific measurement tasks. Various supporting clients and wizards, providing extended functionality to the main application, build the core of the software. The MassScan as well as the DepthScan module incorporate the SIMS in the direct imaging or stigmatic mode and feature the capabilities for mass spectra recording or depth profile analysis. In combination with an image recording facility, the DepthScan module features the capability of spatially resolved material analysis - 3D SIMS. The RasterScan module incorporates the SIMS in scanning mode and supports a fiber-optic link for optimized data transfer. The primary goal of this work is to introduce the basic ideas behind the implementation of the main application modules and the supporting clients. Furthermore, it is the intention to lay down the foundation for further developments. At the beginning, a short introduction into the paradigm of object-oriented programming as well as Windows™ programming is given. Besides explaining the basic ideas behind the Doc/View application architecture, the focus is mainly shifted to the routines controlling the SIMS hardware and the basic concepts of multithreaded programming. The elementary structures of the view and document objects are discussed in detail only for the MassScan module, because the ideas behind data abstraction and encapsulation are quite similar. The second part introduces the research activities

  9. Current status and future direction of the MONK software package

    Smith, Nigel; Armishaw, Malcolm; Cooper, Andrew

    2003-01-01

    The current status of the MONK criticality software package is summarized in terms of recent and current developments and envisaged directions for the future. The areas of discussion are physics modeling, geometry modeling, source modeling, nuclear data, validation, supporting tools and customer services. In its future development plan, MONK continues to focus on meeting the short- and long-term needs of the code user community. (J.P.N.)

  10. Chinshan living PRA model using NUPRA software package

    Cheng, S.-K.; Lin, T.-J.

    2004-01-01

    A living probabilistic risk assessment (PRA) model has been established for Chinshan Nuclear Power Station (BWR-4, MARK-I) using the NUPRA software package. The core damage frequencies due to internal events, seismic events and typhoons are evaluated in this model. The methodology and results, considering the recent implementation of the 5th emergency diesel generator and the automatic boron injection function, are presented. The dominant sequences of this PRA model are discussed, and some possible applications of this living model are proposed. (author)

  11. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Marković Nemanja; Stojić Dragoslav; Cvetković Radovan

    2014-01-01

    Reinforced concrete (AB) is characterized by large inhomogeneity resulting from the material characteristics of the concrete, and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is presented of methods such as: Concrete Damage Plasticity (CDP), Smeared Concrete Cr...

  12. High-performance polyimide nanocomposites with core-shell AgNWs@BN for electronic packagings

    Zhou, Yongcun; Liu, Feng, E-mail: liufeng@nwpu.edu.cn [State Key Laboratory of Solidification Processing, Northwestern Polytechnical University, Xi'an, Shaanxi 710072 (China)]

    2016-08-22

    The increasing density of electronic devices underscores the need for efficient thermal management. Silver nanowires (AgNWs), as one-dimensional nanostructures, possess a high aspect ratio and intrinsic thermal conductivity. However, the high electrical conductivity of AgNWs limits their application in electronic packaging. We synthesized boron nitride-coated silver nanowires (AgNWs@BN) using a flexible and fast method, followed by incorporation into synthetic polyimide (PI) to enhance the thermal conductivity and dielectric properties of the nanocomposites. The thin boron nitride intermediate nanolayer on the AgNWs not only alleviated the mismatch between AgNWs and PI but also enhanced their interfacial interaction. Hence, the maximum thermal conductivity of an AgNWs@BN/PI composite with a filler loading up to 20% by volume was increased to 4.33 W/m K, an enhancement of nearly 23.3 times compared with that of the PI matrix. The relative permittivity and dielectric loss were about 9.89 and 0.015 at 1 MHz, respectively. Compared with AgNWs@SiO₂/PI and Ag@BN/PI composites, boron nitride-coated core-shell structures effectively increased the thermal conductivity and reduced the permittivity of the nanocomposites. The underlying mechanism was studied and discussed. This study enables the identification of appropriate modifier fillers for polymer matrix nanocomposites.

  13. A novel conductive-polymer-based integration process for high-performance flip-chip packages

    Lohokare, Saurabh

    Conductive polymers have recently attracted considerable attention for low-temperature fabrication of lead-free, reworkable, and flexible flip-chip interconnects. Using these materials, I demonstrate in this thesis a process that enables low-cost and high-resolution flip-chip interconnects using conventional micro-fabrication techniques. This fabrication process offers improved performance as compared to conventional flip-chip techniques, such as screen-printing, and allows for definition of interconnects with excellent surface uniformity and control over the bump profile. In order to demonstrate the utility and wide applicability of this process, several test implementations that serve as case studies were investigated. Specifically, novel InGaAsSb avalanche photodiodes (APDs), operating around λ = 2 μm and targeted for free-space communication and biomedical spectroscopy applications, were fabricated and flip-chip-integrated to test the static electrical characteristics of the polymer bumps. Additionally, the dynamic electrical performance characteristics of the polymer bumps were studied by using AlGaAsSb/AlGaSb p-i-n photodetectors as a case study. The fabrication of these photodetectors, operating around λ = 1.55 μm and targeted for optical communication applications, was accomplished using a customized inductively coupled plasma (ICP) etch process that resulted in a low dark current and excellent speed (3 dB bandwidth of 10 GHz) and responsivity (60% external quantum efficiency) characteristics. Furthermore, flip-chip integration was used to demonstrate a three-dimensional, point-to-point micro-optical interconnect, which was 2.33 mm long in a system 15.27 mm³ in volume. Lastly, high-speed parallel optical interconnects were demonstrated using polymer-flip-chip-integrated 10 GHz vertical-cavity surface-emitting laser (VCSEL) and DOEs. Such interconnects offer the ability to alleviate the communication bottleneck that is projected to occur in future, high-performance

  14. STAR-GENERIS - a software package for information processing

    Felkel, L.

    1985-01-01

    Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants include, e.g., alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates specification and implementation of the engineering know-how necessary for sophisticated operator aids. (orig./HP)

  15. Determination of phthalates released from paper packaging materials by solid-phase extraction-high-performance liquid chromatography.

    Gao, Xin; Yang, Bofeng; Tang, Zhixu; Luo, Xin; Wang, Fengmei; Xu, Hui; Cai, Xue

    2014-01-01

    A solid-phase extraction (SPE) high-performance liquid chromatography (HPLC) method was developed for the simultaneous determination of 10 phthalic acid esters (dimethyl phthalate, diethyl phthalate, dipropyl phthalate, benzylbutyl phthalate, diisobutyl phthalate, dicyclohexyl phthalate, diamyl phthalate, di-n-hexyl phthalate, di-n-octyl phthalate and di-2-ethylhexyl phthalate) released from food paper packaging materials. Distilled water, 3% acetic acid (w/v), 10% ethanol (v/v) and 95% ethanol (v/v) were used in place of the different types of food to simulate the migration of the 10 phthalic acid esters from food paper packaging materials; the phthalic acid esters in the four food simulants were enriched and purified on a C18 SPE column with nitrogen blow-down, and quantified by HPLC with a diode array detector. The chromatographic and extraction conditions were optimized, and all 10 phthalate acid esters had a maximum absorbance at 224 nm. The method showed limits of detection in the range of 6.0-23.8 ng/mL; the correlation coefficients were greater than 0.9999 in all cases, recovery values ranged between 71.27 and 106.97% at spiking levels of 30, 60 and 90 ng/mL, and relative standard deviation values ranged from 0.86 to 8.00%. The method was considered to be simple, fast and reliable for a study of the migration of these 10 phthalic acid esters from food paper packaging materials into food.

  16. Installing and Setting Up Git Software Tool on Windows | High-Performance Computing | NREL

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get Git installed on Windows 7 and how to get things set up on NREL's

  17. PINT, A Modern Software Package for Pulsar Timing

    Luo, Jing; Ransom, Scott M.; Demorest, Paul; Ray, Paul S.; Stovall, Kevin; Jenet, Fredrick; Ellis, Justin; van Haasteren, Rutger; Bachetti, Matteo; NANOGrav PINT developer team

    2018-01-01

    Pulsar timing, first developed decades ago, has provided an extremely wide range of knowledge about our universe. It has been responsible for many important discoveries, such as the discovery of the first exoplanet and the orbital period decay of double neutron star systems. Currently, pulsar timing is the leading technique for detecting low-frequency (about 10^-9 Hz) gravitational waves (GW) using an array of pulsars as the detectors. To achieve this goal, high-precision pulsar timing data, at about the nanosecond level, are required. Most high-precision pulsar timing data are analyzed using the widely adopted software TEMPO/TEMPO2. But for a robust and believable GW detection, it is important to have independent software that can cross-check the results. In this poster we present the new-generation pulsar timing software PINT. This package will provide a robust system to cross-check high-precision timing results, completely independent of TEMPO and TEMPO2. In addition, PINT is designed to be a package that is easy to extend and modify, through use of a flexible code architecture and a modern programming language, Python, with modern technology and libraries.
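
    As a rough illustration of how such a Python timing package is typically driven, the sketch below follows PINT's documented high-level interface; treat the exact function names and file formats as assumptions rather than a guaranteed API, and note that the input files are hypothetical.

    ```python
    # Sketch of a typical PINT workflow; names follow PINT's documented
    # interface but should be treated as assumptions here.
    import pint.models
    import pint.residuals
    import pint.toa

    # Hypothetical input files: a timing model (.par) and TOAs (.tim).
    model = pint.models.get_model("pulsar.par")
    toas = pint.toa.get_TOAs("pulsar.tim")

    # Timing residuals: observed minus model-predicted arrival times.
    res = pint.residuals.Residuals(toas, model)
    print(res.time_resids.to("us"))  # residuals in microseconds
    ```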

  18. FRAMES Software System: Linking to the Statistical Package R

    Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.

    2006-12-11

    This document provides the requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES needed to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black-box testing) actually functions as designed, and the QA/QC protocol confirms that the software meets the client's needs.

  19. Development of New Low-Cost, High-Performance, PV Module Encapsulant/Packaging Materials: Final Technical Progress Report, 22 October 2002 - 15 November 2007

    Tucker, R.

    2008-04-01

    Report on objectives to work with U.S.-based PV module manufacturers (c-Si, a-Si, CIS, other thin films) to develop/qualify new low-cost, high-performance PV module encapsulant/packaging materials, and processes using the packaging materials.

  20. [Simultaneous determination of six fluorescent whitening agents in plastic and paper packaging materials by high performance liquid chromatography].

    Zhang, Juzhou; Ji, Shuilin; Cai, Huimei; Li, Jing; Wang, Yongxin; Wang, Jingqiu

    2017-11-08

    A novel analytical method was developed for the simultaneous determination of six fluorescent whitening agents (FWAs: FWA 135, FWA 184, FWA 185, FWA 199, FWA 378 and FWA 393) in paper and plastic food packaging materials by high performance liquid chromatography with fluorescence detection (HPLC-FLD). The sample was extracted with a mixed solution of chloroform and acetonitrile (3:7, v/v), then cleaned up on an HLB solid phase extraction column. Qualitative and quantitative analyses were carried out by HPLC. The sample was separated on a Phenomenex C18 column using acetonitrile and 5 mmol/L ammonium acetate aqueous solution as mobile phases. The results indicated that the linear range of FWA 393 was 15-1500 μg/L and the linear ranges of the other five FWAs were 5-500 μg/L, with correlation coefficients greater than 0.999. The recoveries in spiked samples were between 80.4% and 125.0%, with RSDs (n=6) of 1%-13%. Furthermore, this method was applied to analyze 12 samples from the market to verify its practicality. The method showed the advantages of simplicity, high recovery and good precision, and is suitable for the detection of the six fluorescent whitening agents in food packaging materials.

  1. SEDA: A software package for the Statistical Earthquake Data Analysis

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, a growing movement among scientists that is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less care has been taken over the graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
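
    For context, the ETAS (Epidemic-Type Aftershock Sequence) model that SEDA fits is conventionally defined by a conditional intensity of the following form. This is the standard temporal formulation from the literature, not necessarily SEDA's exact parameterization.

    ```latex
    % Standard temporal ETAS conditional intensity: a background rate mu plus
    % aftershock contributions from all earlier events i with origin time t_i
    % and magnitude M_i (M_c is the reference/cutoff magnitude).
    \lambda(t) = \mu + \sum_{i \,:\, t_i < t} K \,
      e^{\alpha (M_i - M_c)} \left( t - t_i + c \right)^{-p}
    ```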

  2. Scilab software package for the study of dynamical systems

    Bordeianu, C. C.; Beşliu, C.; Jipa, Al.; Felea, D.; Grossu, I. V.

    2008-05-01

    This work presents a new software package for the study of chaotic flows and maps. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamical systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability for users to insert their own ODEs. Program summary: Program title: Chaos. Catalogue identifier: AEAP_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 885. No. of bytes in distributed program, including test data, etc.: 5925. Distribution format: tar.gz. Programming language: Scilab 3.1.1. Computer: PC-compatible running Scilab on MS Windows or Linux. Operating system: Windows XP, Linux. RAM: below 100 Megabytes. Classification: 6.2. Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODEs). Solution method: Numerical solving of ordinary differential equations. The chaotic behavior of the nonlinear dynamical system is analyzed using Poincaré sections, phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropies. Restrictions: The package routines are normally able to handle ODE systems of high orders (up to order twelve and possibly higher), depending on the nature of the problem. Running time: 10 to 20 seconds for problems that do not
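
    To illustrate the kind of analysis such a package performs, here is a minimal Python sketch (not Scilab, and not part of the Chaos package) that estimates the Lyapunov exponent of the logistic map x_{n+1} = r x_n (1 - x_n):

    ```python
    import math

    def logistic_lyapunov(r: float, x0: float = 0.5,
                          n_transient: int = 1000,
                          n_iter: int = 100_000) -> float:
        """Estimate the Lyapunov exponent of the logistic map for parameter r."""
        x = x0
        # Discard transient iterations so the orbit settles onto the attractor.
        for _ in range(n_transient):
            x = r * x * (1.0 - x)
        # Average log of the absolute derivative |f'(x)| = |r (1 - 2x)|.
        total = 0.0
        for _ in range(n_iter):
            x = r * x * (1.0 - x)
            total += math.log(abs(r * (1.0 - 2.0 * x)))
        return total / n_iter

    # r = 4.0 is fully chaotic: the exponent should be close to ln 2 ~ 0.693.
    print(logistic_lyapunov(4.0))
    ```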

  3. Radiative transfer through terrestrial atmosphere and ocean: Software package SCIATRAN

    Rozanov, V.V.; Rozanov, A.V.; Kokhanovsky, A.A.; Burrows, J.P.

    2014-01-01

    SCIATRAN is a comprehensive software package for the modeling of radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm), including multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The software is capable of modeling spectral and angular distributions of the intensity or the Stokes vector of the transmitted, scattered, reflected, and emitted radiation assuming either a plane-parallel or a spherical atmosphere. Simulations are done either in the scalar or in the vector mode (i.e. accounting for the polarization) for observations by space-, air-, ship- and balloon-borne, ground-based, and underwater instruments in various viewing geometries (nadir, off-nadir, limb, occultation, zenith-sky, off-axis). All significant radiative transfer processes are accounted for. These are, e.g. the Rayleigh scattering, scattering by aerosol and cloud particles, absorption by gaseous components, and bidirectional reflection by an underlying surface including Fresnel reflection from a flat or roughened ocean surface. The software package contains several radiative transfer solvers including finite difference and discrete-ordinate techniques, an extensive database, and a specific module for solving inverse problems. In contrast to many other radiative transfer codes, SCIATRAN incorporates an efficient approach to calculate the so-called Jacobians, i.e. derivatives of the intensity with respect to various atmospheric and surface parameters. In this paper we discuss numerical methods used in SCIATRAN to solve the scalar and vector radiative transfer equation, describe databases of atmospheric, oceanic, and surface parameters incorporated in SCIATRAN, and demonstrate how to solve some selected radiative transfer problems using the SCIATRAN package. During the last decades, many studies have been published demonstrating that SCIATRAN is a valuable
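
    As background, the scalar radiative transfer equation that such solvers discretize can be written in the standard plane-parallel form below. This is the generic textbook equation (sign conventions vary), not SCIATRAN's specific formulation.

    ```latex
    % Plane-parallel scalar radiative transfer equation:
    % mu: cosine of the viewing zenith angle; tau: optical depth;
    % omega: single-scattering albedo; p: scattering phase function;
    % J: internal (e.g. thermal) source term.
    \mu \frac{dI(\tau,\mu,\phi)}{d\tau}
      = I(\tau,\mu,\phi)
      - \frac{\omega(\tau)}{4\pi}
        \int_{0}^{2\pi}\!\!\int_{-1}^{1}
          p(\tau;\mu,\phi;\mu',\phi')\, I(\tau,\mu',\phi')\,
          d\mu'\, d\phi'
      - J(\tau,\mu,\phi)
    ```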

  4. 33rd International School of Mathematics "G. Stampacchia": High Performance Algorithms and Software for Nonlinear Optimization, "Ettore Majorana"

    Murli, Almerico; High Performance Algorithms and Software for Nonlinear Optimization

    2003-01-01

    This volume contains the edited texts of the lectures presented at the Workshop on High Performance Algorithms and Software for Nonlinear Optimization held in Erice, Sicily, at the "G. Stampacchia" School of Mathematics of the "E. Majorana" Centre for Scientific Culture, June 30 - July 8, 2001. In the first year of the new century, the aim of the Workshop was to assess the past and to discuss the future of Nonlinear Optimization, and to highlight recent achievements and promising research trends in this field. An emphasis was requested on algorithmic and high performance software developments and on new computational experiences, as well as on theoretical advances. We believe that this goal was basically achieved. The Workshop was attended by 71 people from 22 countries. Although not all topics were covered, the presentations gave a wide overview of the field from different and complementary standpoints. Besides the lectures, several formal and informal discussions took place. We wish ...

  5. Distributed control software of high-performance control-loop algorithm

    Blanc, D

    1999-01-01

    The majority of industrial cooling and ventilation plants require the control of complex processes. All these processes are highly important for the operation of the machines. The stability and reliability of these processes are leading factors in identifying the quality of the service provided. The control system architecture and software structure are therefore required to have high dynamic performance and robust behaviour. Intelligent systems based on PID or RST controllers are used for their high level of stability and accuracy. The design and tuning of these complex controllers require the dynamic model of the plant to be known (generally obtained by identification) and the desired performance of the various control loops to be specified in order to achieve good performance. The concept of a distributed control algorithm software provides full automation facilities with well-adapted functionality and good performance, giving methodology, means and tools to master the dynamic process optimization an...
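
    Although the talk does not give code, the core of a PID control loop is compact; below is a minimal discrete-time PID sketch in Python, illustrative only, with hypothetical gains and without the anti-windup or derivative filtering a production controller would need.

    ```python
    class PID:
        """Minimal discrete-time PID controller (illustrative sketch)."""

        def __init__(self, kp: float, ki: float, kd: float, dt: float):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint: float, measurement: float) -> float:
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            # Control output = proportional + integral + derivative terms.
            return (self.kp * error
                    + self.ki * self.integral
                    + self.kd * derivative)

    # Hypothetical tuning for a slow thermal process, sampled once per second.
    controller = PID(kp=2.0, ki=0.1, kd=0.5, dt=1.0)
    valve_command = controller.update(setpoint=18.0, measurement=21.3)
    ```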

  6. Software Packages to Support Electrical Engineering Virtual Lab

    Manuel Travassos Valdez

    2012-03-01

    The use of Virtual Reality Systems (VRS) as a learning aid encourages the creation of tools that allow users/students to simulate educational environments on a computer. This article presents a way of building a VRS system with software packages to support Electrical Engineering virtual laboratories, to be used in the near future in the teaching of the curriculum unit of Circuit Theory. The steps required for the construction of a project are presented in this paper. The simulation is still under construction and intends to use a three-dimensional virtual laboratory environment for electric measurement, which will allow users/students to experiment with and test the modeled equipment. Therefore, there are still no links available for further examination. The result may demonstrate the future potential of applications of Virtual Reality Systems as an efficient and cost-effective learning system.

  7. Lenstronomy: Multi-purpose gravitational lens modeling software package

    Birrer, Simon; Amara, Adam

    2018-04-01

    Lenstronomy is a multi-purpose open-source gravitational lens modeling python package. Lenstronomy reconstructs the lens mass and surface brightness distributions of strong lensing systems using forward modelling and supports a wide range of analytic lens and light models in arbitrary combination. The software is also able to reconstruct complex extended sources as well as point sources. Lenstronomy is flexible and numerically accurate, with a clear user interface that could be deployed across different platforms. Lenstronomy has been used to derive constraints on dark matter properties in strong lenses, measure the expansion history of the universe with time-delay cosmography, measure cosmic shear with Einstein rings, and decompose quasar and host galaxy light.
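
    To give a flavor of the API, the snippet below maps an image-plane coordinate back to the source plane through a singular isothermal sphere lens; it follows lenstronomy's documented LensModel interface, though the exact call signatures should be treated as an assumption here.

    ```python
    # Sketch based on lenstronomy's documented LensModel interface;
    # treat the exact signatures as assumptions.
    from lenstronomy.LensModel.lens_model import LensModel

    # A single singular-isothermal-sphere (SIS) deflector.
    lens = LensModel(lens_model_list=["SIS"])
    kwargs_lens = [{"theta_E": 1.0, "center_x": 0.0, "center_y": 0.0}]

    # Ray-shoot an image-plane position (arcsec) back to the source plane.
    beta_x, beta_y = lens.ray_shooting(1.1, 0.3, kwargs_lens)
    print(beta_x, beta_y)
    ```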

  8. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Marković Nemanja

    2014-01-01

    Reinforced concrete (AB) is characterized by large inhomogeneity resulting from the material characteristics of the concrete, and by quasi-brittle behavior during failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is presented of methods such as: Concrete Damage Plasticity (CDP), Smeared Concrete Cracking (CSC), Cap Plasticity (CP) and the Drucker-Prager model (DPM). We performed a nonlinear analysis of a two-storey reinforced concrete frame by applying the CDP method for modeling the material nonlinearity of concrete. We analyzed damage zones, crack propagation and the load-deflection ratio.

  9. [Determination of formaldehyde and acetaldehyde in packaging paper by dansylhydrazine derivatization-high performance liquid chromatography-fluorescence detection].

    Gong, Shuguo; Liang, Yong; Tang, Liyun; Huang, Ping; Dai, Yunhui

    2017-07-08

    A high performance liquid chromatography with fluorescence detection (HPLC-FLD) method was developed for the simultaneous determination of formaldehyde and acetaldehyde in packaging paper by dansylhydrazine (DNSH) derivatization. The samples were extracted with the derivatization reagent for 30 min and derivatized for 24 h. After purification with a PSA/C18 cartridge, a Diamonsil® C18 column (150 mm×4.6 mm, 5 μm) was used as the stationary phase for separation, mixtures of acetic acid aqueous solution (pH 2.55) and acetonitrile were used as mobile phases in gradient elution, and the excitation and emission wavelengths were 330 nm and 484 nm, respectively. The results showed that the recoveries of formaldehyde and acetaldehyde spiked in the samples were 81.64%-106.78%, and the relative standard deviations (RSDs) were 2.02%-5.53% (n=5). The limits of detection of formaldehyde and acetaldehyde were 19.2 μg/kg and 20.7 μg/kg, respectively. The limits of quantification of formaldehyde and acetaldehyde were 63.9 μg/kg and 69.1 μg/kg, respectively. The method is simple, sensitive and reproducible. It provides a basic approach for the determination of trace formaldehyde and acetaldehyde.

  10. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet an organization's requirements is a difficult aspect of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR software provide a better basis on ranking-score records than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
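
    For readers unfamiliar with the ranking step, the sketch below implements the core of TOPSIS (ranking by relative closeness to an ideal solution) on a hypothetical decision matrix; the criteria, weights, and scores are invented for illustration and are not the study's data.

    ```python
    import numpy as np

    # Hypothetical decision matrix: rows = candidate packages,
    # columns = benefit criteria (higher is better). Invented data.
    scores = np.array([
        [7.0, 8.0, 6.5],
        [8.5, 7.0, 7.5],
        [6.0, 9.0, 5.5],
    ])
    weights = np.array([0.5, 0.3, 0.2])

    # 1. Vector-normalize each criterion, then apply the weights.
    norm = scores / np.linalg.norm(scores, axis=0)
    weighted = norm * weights

    # 2. Ideal and anti-ideal solutions (all criteria are benefits here).
    ideal = weighted.max(axis=0)
    anti_ideal = weighted.min(axis=0)

    # 3. Euclidean distances and relative closeness to the ideal.
    d_plus = np.linalg.norm(weighted - ideal, axis=1)
    d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
    closeness = d_minus / (d_plus + d_minus)

    print("ranking (best first):", np.argsort(-closeness))
    ```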

  11. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
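
    As a flavor of what GNU Radio flowgraphs look like at the code level, here is a minimal transmit-style example (a complex sinusoid written to a file). It uses standard GNU Radio blocks but is purely illustrative and unrelated to gr-MRI's actual pulse sequences; the file name and parameters are hypothetical.

    ```python
    # Minimal GNU Radio flowgraph: generate a complex sinusoid and
    # record a fixed number of samples to a file. Illustrative only.
    from gnuradio import gr, analog, blocks

    class ToneToFile(gr.top_block):
        def __init__(self, samp_rate=1e6, freq=100e3, n_samples=1_000_000):
            gr.top_block.__init__(self, "tone_to_file")
            src = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, freq, 1.0)
            head = blocks.head(gr.sizeof_gr_complex, n_samples)
            sink = blocks.file_sink(gr.sizeof_gr_complex, "tone.iq")
            self.connect(src, head, sink)

    if __name__ == "__main__":
        tb = ToneToFile()
        tb.run()  # blocks until `head` has passed n_samples
    ```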

  12. Software refactoring at the package level using clustering techniques

    Alkhalid, A.; Alshayeb, M.; Mahmoud, S. A.

    2011-01-01

    Enhancing, modifying or adapting the software to new requirements increases the internal software complexity. Software with high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence

  13. Development of an engine system simulation software package - ESIM

    Erlandsson, Olof

    2000-10-01

    A software package, ESIM, has been developed for simulating internal combustion engine systems, including models for the engine, manifolds, turbocharger, charge-air cooler (intercooler) and inlet air heater. This study focuses on the thermodynamic treatment and methods used in the models. It also includes some examples of system simulations made with these models for validation purposes. The engine model can be classified as a zero-dimensional, single-zone model. It includes calculation of the valve flow process, models for heat release and models for in-cylinder, exhaust port and manifold heat transfer. Models are developed for handling turbocharger performance and charge-air cooler characteristics. The main purpose of the project related to this work is to use the ESIM software to study the heat balance and performance of homogeneous charge compression ignition (HCCI) engine systems. A short description of the HCCI engine is therefore included, pointing out the difficulties, or challenges, regarding the HCCI engine from a system perspective. However, the relations given here, and the code itself, are quite general, making it possible to use these models to simulate spark-ignited as well as direct-injected engines.
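
    In zero-dimensional, single-zone engine models, heat release is commonly prescribed by a Wiebe function; the form below is the standard one from the literature and is shown for orientation, not as ESIM's exact model.

    ```latex
    % Wiebe function: cumulative mass fraction burned x_b as a function of
    % crank angle theta. theta_0: start of combustion; Delta theta: combustion
    % duration; a, m: efficiency and shape parameters (often a ~ 5, m ~ 2).
    x_b(\theta) = 1 - \exp\!\left[ -a
      \left( \frac{\theta - \theta_0}{\Delta\theta} \right)^{m+1} \right]
    ```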

  14. UES: an optimization software package for power and energy

    Vohryzek, J.; Havlena, V.; Findejs, J.; Jech, J.

    2004-01-01

    Unified Energy Solutions components are designed to meet the specific requirements of electric utilities, industrial power units, and district heating (combined heat and power) plants. The optimization objective is to operate the plant with maximum process efficiency and operational profit under the constraints imposed by technology and environmental impacts. Software applications for advanced control and real-time optimization may provide a low-cost, high-return alternative to expensive boiler retrofits for improving operational profit as well as reducing emissions. The Unified Energy Solutions (UES) software package is a portfolio of advanced control and optimization components running on top of the standard process regulatory and control system. The objective of the UES is to operate the plant with the maximum achievable profit (maximum efficiency) under the constraints imposed by technology (lifetime consumption, asset health) and environmental impacts (CO and NOx emissions). Fast responsiveness to varying economic conditions and integration of real-time optimization and operator decision support (off-line) features are critical for operation in a real-time economy. Optimization features are targeted at the combustion process, heat and power load allocation to parallel resources, electric power delivery and ancillary services. Optimization criteria include increased boiler thermal efficiency, maintaining emission limits, and economic load allocation of the heat and generation sources. State-of-the-art advanced control algorithms use model-based predictive control principles and provide superior response in transient states. Individual software modules support open control platforms and communication protocols. UES can be implemented on a wide range of distributed control systems. Typical achievable benefits include heat and power production cost savings, increased effective boiler operation range, optimized flue gas emissions, optimized production capacity utilization, optimized
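
    Economic load allocation of the kind described reduces, in its simplest linear form, to a small optimization problem; the sketch below allocates a heat demand across parallel boilers by marginal cost using scipy, with entirely hypothetical costs and capacities.

    ```python
    from scipy.optimize import linprog

    # Hypothetical marginal costs (per MWh) of three parallel boilers.
    costs = [22.0, 25.0, 31.0]
    # Min/max output (MW) of each boiler when committed.
    bounds = [(10, 80), (15, 60), (5, 50)]
    demand = 130.0  # total heat demand to cover (MW)

    # Minimize total cost subject to: sum of boiler outputs == demand.
    result = linprog(
        c=costs,
        A_eq=[[1.0, 1.0, 1.0]],
        b_eq=[demand],
        bounds=bounds,
    )
    print("allocation (MW):", result.x, "cost:", result.fun)
    ```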

  15. Optimized Architectural Approaches in Hardware and Software Enabling Very High Performance Shared Storage Systems

    CERN. Geneva

    2004-01-01

    There are issues encountered in high performance storage systems that normally lead to compromises in architecture. Compute clusters tend to have compute phases followed by an I/O phase that must move data from the entire cluster in one operation. That data may then be shared by a large number of clients, creating unpredictable read and write patterns. In some cases the aggregate performance of a server cluster must exceed 100 GB/s to minimize the time required for the I/O cycle, thus maximizing compute availability. Accessing the same content from multiple points in a shared file system leads to the classical problems of data "hot spots" on the disk drive side and access collisions on the data connectivity side. The traditional method for increasing apparent bandwidth usually includes data replication, which is costly in both storage and management. Scaling a model that includes replicated data presents additional management challenges as capacity and bandwidth expand asymmetrically while the system is scaled. ...

  16. Information technologies and software packages for education of specialists in materials science [In Russian]

    Krzhizhanovskaya, V.; Ryaboshuk, S.

    2009-01-01

    This paper presents methodological materials, interactive textbooks and software packages developed and extensively used for the education of specialists in materials science. These virtual laboratories for education and research are equipped with tutorials and a software environment for modeling complex

  17. Strategic Business-IT alignment of application software packages: Bridging the Information Technology gap

    Wandi Kruger

    2012-09-01

    An application software package implementation is a complex endeavour, and as such it requires a proper understanding, evaluation and redefinition of the current business processes to ensure that the implementation delivers on the objectives set at the start of the project. Numerous factors may contribute to the unsuccessful implementation of application software packages. However, the most significant contributor to the failure of an application software package implementation lies in the misalignment of the organisation's business processes with the functionality of the application software package. Misalignment is attributed to the gap that exists between an organisation's business processes and the functionality the application software package offers to translate those business processes into digital form when implementing and configuring the package. This gap is commonly referred to as the information technology (IT) gap. This study proposes to define and discuss the IT gap. Furthermore, this study makes recommendations for aligning the business processes with the functionality of the application software package (addressing the IT gap). The end result of adopting these recommendations will be more successful application software package implementations.

  18. IDES: Interactive Data Entry System: a generalized data acquisition software package

    Gasser, S.B.

    1980-04-01

    The Interactive Data Entry System (IDES) is a software package which greatly assists in designing and storing forms to be used for the directed acquisition of data. The objective of this package is to provide a viable man/machine interface to any comprehensive data base. This report provides a technical description of the software and can be used as a user's manual

  19. US Army Radiological Bioassay and Dosimetry: The RBD software package

    Eckerman, K.F.; Ward, R.C.; Maddox, L.B.

    1993-01-01

    The RBD (Radiological Bioassay and Dosimetry) software package was developed for the U.S. Army Material Command, Arlington, Virginia, to demonstrate compliance with the radiation protection guidance of 10 CFR Part 20 (ref. 1). Designed to be run interactively on an IBM-compatible personal computer, RBD consists of a data base module to manage bioassay data and a computational module that incorporates algorithms for estimating radionuclide intake from either acute or chronic exposures, based on measurement of the worker's rate of excretion of the radionuclide or the retained activity in the body. In estimating the intake, RBD uses a separate file for each radionuclide containing parametric representations of the retention and excretion functions. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent. For a given nuclide, if measurements exist for more than one type of assay, an auxiliary module, REPORT, estimates the intake by applying weights assigned in the nuclide file for each assay. Bioassay data and computed results (estimates of intake and committed dose equivalent) are stored in separate data bases, and the bioassay measurements used to compute a given result can be identified. The REPORT module creates a file containing the committed effective dose equivalent for each individual that can be combined with the individual's external exposure

  20. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  1. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
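
    As orientation for what a structure-based potential looks like, a commonly used form of the native-contact term is given below (the 12-10 contact potential widely used in SBMs). The precise functional forms in SMOG 2 are set by its template files, so treat this as a representative example rather than the package's definition.

    ```latex
    % Representative 12-10 contact potential between native-contact pairs (i,j),
    % with contact strength epsilon_ij and native pair distance sigma_ij:
    V_{\mathrm{contacts}} = \sum_{(i,j)} \epsilon_{ij}
      \left[ 5 \left( \frac{\sigma_{ij}}{r_{ij}} \right)^{12}
           - 6 \left( \frac{\sigma_{ij}}{r_{ij}} \right)^{10} \right]
    ```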

  2. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    Jeffrey K Noel

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.

  3. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  4. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to analyze a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software. They were developed mainly based on Bayesian or frequentist theory. Most types of software are easy to operate and master, calculate exactly, or produce excellent graphs. However, no single software performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that the user should choose the appropriate software according to personal programming experience, operational habits, and financial ability, and then consider combining BUGS and R (or Stata) to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  5. Deriving stellar parameters with the SME software package

    Piskunov, N.

    2017-09-01

    Photometry and spectroscopy are complementary tools for deriving accurate stellar parameters. Here I present one of the popular packages for stellar spectroscopy called SME with the emphasis on the latest developments and error assessment for the derived parameters.

  6. Using packaged software for solving two differential equation problems that arise in plasma physics

    Gaffney, P.W.

    1980-01-01

    Experience in using packaged numerical software for solving two related problems that arise in plasma physics is described. These problems are (i) the solution of the reduced resistive MHD equations and (ii) the solution of the Grad-Shafranov equation.
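
    As a hedged illustration of what "packaged software" buys here, the sketch below feeds a radially reduced, Solov'ev-like Grad-Shafranov equation to SciPy's general-purpose boundary-value solver and checks it against the closed form. The equation and coefficients are simplified assumptions, not the paper's actual problem.

```python
# Hedged sketch: solve psi'' - psi'/r = -c*r**2, psi(r0) = psi(r1) = 0,
# a toy radial reduction of the Grad-Shafranov equation, with scipy.
import numpy as np
from scipy.integrate import solve_bvp

c, r0, r1 = 1.0, 0.2, 1.0

def rhs(r, y):
    # y[0] = psi, y[1] = psi'
    return np.vstack([y[1], y[1] / r - c * r**2])

def bc(ya, yb):
    return np.array([ya[0], yb[0]])       # psi vanishes at both ends

r = np.linspace(r0, r1, 50)
sol = solve_bvp(rhs, bc, r, np.zeros((2, r.size)))

# Closed form for comparison: psi = A + B*r^2 - (c/8)*r^4 with both BCs.
B = c / 8.0 * (r1**4 - r0**4) / (r1**2 - r0**2)
A = c / 8.0 * r0**4 - B * r0**2
exact = A + B * r**2 - c / 8.0 * r**4
print("max abs error:", np.abs(sol.sol(r)[0] - exact).max())
```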

  7. Development of a new control software package for Pakistan Research Reactor-2

    Qazi, M.K.

    1993-05-01

    The development of a new control software package for Pakistan Research Reactor-2 is presented. The software operates in different modes, comprising surveillance, pre-operational self-tests, and operator, supervisor and robotic control. The control logic critically damps the system, minimizing power overshoots. The software handles multiple abnormal conditions, provides elaborate access control and maintains startup/shutdown records. The report describes the functional details and covers the operational aspects of the new control software. (author)
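
    The value of critical damping mentioned above can be seen in a generic second-order loop: a damping ratio of 1 gives the fastest response without overshoot. The sketch below is a textbook illustration in Python, not the PARR-2 control algorithm; the plant, gains, and step sizes are assumed.

```python
# Hedged sketch: step response of x'' + 2*zeta*wn*x' + wn^2*x = wn^2*r.
# At zeta = 1 (critical damping) the setpoint is reached without overshoot.
def step_response(zeta, wn=1.0, dt=1e-3, t_end=10.0, r=1.0):
    x, v, peak = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = wn**2 * (r - x) - 2.0 * zeta * wn * v   # closed-loop acceleration
        v += a * dt                                  # semi-implicit Euler
        x += v * dt
        peak = max(peak, x)
    return peak

for zeta in (0.5, 1.0, 2.0):
    peak = step_response(zeta)
    print(f"zeta={zeta}: peak={peak:.3f}, overshoot={100 * (peak - 1):.1f}%")
```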

  8. Computer aided piping layout design in radiochemical plants- an improved software package

    Raju, R.P.; Siddiqui, H.R.

    1995-01-01

    A software package was developed and successfully implemented for the piping layout design of the four process cells of the Kalpakkam Reprocessing Project. This paper discusses in detail the improvements and modifications being carried out in the package so that it becomes more meaningful and useful for implementation in forthcoming radiochemical plants.

  9. Western aeronautical test range real-time graphics software package MAGIC

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers and scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  10. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision-making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) Artificial neural networks. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
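
    As an illustration of the first tool in the list, the sketch below estimates a gambler's-ruin probability by Monte Carlo and checks it against the closed form. The capital, target, and win probability are made-up numbers, and the manual's actual algorithms may differ.

```python
# Hedged sketch: probability of going broke before reaching a target,
# by Monte Carlo and by the standard gambler's-ruin closed form.
import random

def ruin_prob_mc(start, target, p_win, trials=20_000):
    ruined = 0
    for _ in range(trials):
        capital = start
        while 0 < capital < target:
            capital += 1 if random.random() < p_win else -1
        ruined += capital == 0
    return ruined / trials

def ruin_prob_exact(start, target, p_win):
    if p_win == 0.5:
        return 1.0 - start / target
    q = (1.0 - p_win) / p_win              # loss/win odds ratio
    return (q**start - q**target) / (1.0 - q**target)

print(ruin_prob_mc(5, 20, 0.55), "vs exact", ruin_prob_exact(5, 20, 0.55))
```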

  11. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    Sang-Kyu Jung

    Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
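
    The two-image counting idea behind WormLifespan can be illustrated with simple frame differencing: pixels that change between the two time-lapse frames are thresholded and grouped into blobs. The sketch below (numpy/SciPy) is a toy version of that idea, not QuantWorm's Java implementation; thresholds and sizes are arbitrary assumptions.

```python
# Hedged sketch: count moving objects from two time-lapse frames by
# differencing, thresholding, and connected-component labeling.
import numpy as np
from scipy import ndimage

def count_moving_blobs(frame1, frame2, diff_thresh=30, min_pixels=20):
    """Count 8-connected blobs of pixels that changed between two frames."""
    diff = np.abs(frame1.astype(int) - frame2.astype(int)) > diff_thresh
    labels, n = ndimage.label(diff, structure=np.ones((3, 3)))
    sizes = np.asarray(ndimage.sum(diff, labels, index=range(1, n + 1)))
    return int(np.sum(sizes >= min_pixels))   # ignore specks of noise

# Synthetic 8-bit frames: one 6x6 "worm" shifts diagonally by 4 pixels.
f1 = np.zeros((100, 100), dtype=np.uint8)
f2 = f1.copy()
f1[10:16, 10:16] = 200       # worm at its old position
f2[14:20, 14:20] = 200       # worm at its new position
# Note: a worm that moves completely clear of its old footprint leaves two
# blobs (vacated + occupied); a real pipeline must account for that.
print(count_moving_blobs(f1, f2), "moving blob(s) detected")
```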

  12. Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures

    Brust, Frederick W. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Punch, Edward F. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Twombly, Elizabeth Kurth [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Kalyanam, Suresh [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Kennedy, James [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Hattery, Garty R. [Engineering Mechanics Corporation of Columbus, Columbus, OH (United States); Dodds, Robert H. [Professional Consulting Services, Inc., Lisle, IL (United States); Mach, Justin C [Caterpillar, Peoria, IL (United States); Chalker, Alan [Ohio Supercomputer Center (OSC), Columbus, OH (United States); Nicklas, Jeremy [Ohio Supercomputer Center (OSC), Columbus, OH (United States); Gohar, Basil M [Ohio Supercomputer Center (OSC), Columbus, OH (United States); Hudak, David [Ohio Supercomputer Center (OSC), Columbus, OH (United States)

    2016-12-30

    This report summarizes the final product developed for the US DOE Small Business Innovation Research (SBIR) Phase II grant made to Engineering Mechanics Corporation of Columbus (Emc2) between April 16, 2014 and August 31, 2016, titled 'Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures'. Many US companies have moved fabrication and production facilities off shore because of cheaper labor costs. A key aspect in bringing these jobs back to the US is the use of technology to render US-made fabrications more cost-efficient overall with higher quality. One significant advantage that has emerged in the US over the last two decades is the use of virtual design for fabrication of small and large structures in weld fabrication industries. Industries that use virtual design and analysis tools have reduced material part size, developed environmentally-friendly fabrication processes, improved product quality and performance, and reduced manufacturing costs. Indeed, Caterpillar Inc. (CAT), one of the partners in this effort, continues to have a large fabrication presence in the US because of the use of weld fabrication modeling to optimize fabrications by controlling weld residual stresses and distortions and improving fatigue, corrosion, and fracture performance. This report describes Emc2's DOE SBIR Phase II final results to extend an existing, state-of-the-art software code, Virtual Fabrication Technology (VFT®), currently used to design and model large welded structures prior to fabrication, to a broader range of products with widespread applications for small and medium-sized enterprises (SMEs). VFT® helps control distortion, can minimize and/or control residual stresses, control welding microstructure, and pre-determine welding parameters such as weld-sequencing, pre-bending, and thermal-tensioning. VFT® uses material properties, consumable properties, etc. as inputs.

  13. Effective organizational solutions for implementation of DBMS software packages

    Jones, D.

    1984-01-01

    The space telescope management information system development effort serves as a guideline for discussing effective organizational solutions used in implementing DBMS software. The focus is on the importance of strategic planning. The value of constructing an information system architecture that conforms to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring for production software control, production data control, and software enhancements are also discussed.

  14. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  15. QuickDirect - Payload Control Software Template Package, Phase I

    National Aeronautics and Space Administration — To address the need to quickly, cost-effectively and reliably develop software to control science instruments deployed on spacecraft, QuickFlex proposes to create a...

  16. BEANS - a software package for distributed Big Data analysis

    Hypki, Arkadiusz

    2018-03-01

    BEANS is a new web-based, easy-to-install-and-maintain tool to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software answers a growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and is ready to use in any other research field. It can also serve as a building block for other open source software.
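
    The query/filter/aggregate/plot workflow that BEANS offers through its web interface can be sketched in pandas on a made-up star-cluster-like table. BEANS itself is distributed and web-based, so this is only an analogy, and it assumes pandas and matplotlib are installed.

```python
# Hedged sketch: filter a dataset, aggregate per snapshot, and plot,
# mirroring the kind of workflow BEANS provides at scale.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
snapshots = pd.DataFrame({
    "time_myr": rng.choice([0, 100, 200, 300], size=1000),
    "mass_msun": rng.lognormal(mean=-0.5, sigma=0.6, size=1000),
    "radius_pc": rng.exponential(scale=2.0, size=1000),
})

# Filter, then aggregate: mean stellar mass inside 1 pc, per snapshot time.
core = snapshots[snapshots["radius_pc"] < 1.0]
summary = core.groupby("time_myr")["mass_msun"].agg(["count", "mean"])
print(summary)

ax = summary["mean"].plot(marker="o")      # quick-look plot of the aggregate
ax.set_ylabel("mean core mass [Msun]")
ax.figure.savefig("core_mass_vs_time.png")
```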

  17. Creating a simulation model of software testing using Simulink package

    V. M. Dubovoi

    2016-12-01

    Determining a model of the software testing process that allows prediction of both the whole process and its specific stages is a pressing problem for the IT industry, and this article focuses on solving it. The aim is to predict the duration and improve the quality of software testing. Analysis of the software testing process shows that it can be classified as a branched cyclic technological process, because it is cyclical with decision-making on control operations. The investigation builds on the authors' previous work and a software-testing process method based on a Markov model. The proposed method enables prediction for each software module, which leads to better decision-making in each controlled sub-operation of all processes. A Simulink simulation model demonstrates the implementation and verifies the results of the proposed technique. The results of the research have been put into practice in the IT industry.
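
    To make the Markov-model idea concrete, the sketch below treats testing as an absorbing Markov chain and computes the expected number of steps to release from the fundamental matrix N = (I - Q)^-1. The states and transition probabilities are invented, and the authors' actual model is richer.

```python
# Hedged sketch: expected duration of a cyclic test/fix process modeled
# as an absorbing Markov chain.
import numpy as np

# States: 0=test, 1=fix-defect, 2=released (absorbing)
# P[i, j] = probability of moving from state i to state j per step
P = np.array([[0.00, 0.40, 0.60],   # test: 40% find a defect, 60% pass
              [0.90, 0.10, 0.00],   # fix: back to test, or keep fixing
              [0.00, 0.00, 1.00]])

Q = P[:2, :2]                        # transitions among transient states
N = np.linalg.inv(np.eye(2) - Q)     # fundamental matrix
steps = N.sum(axis=1)                # expected steps before absorption
print(f"expected steps from 'test': {steps[0]:.2f}")
print(f"expected steps from 'fix':  {steps[1]:.2f}")
```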

  18. Investigating the effects of different factors on development of open source enterprise resources planning software packages

    Mehdi Ghorbaninia

    2014-08-01

    This paper investigates the effects of different factors on the development of open source enterprise resource planning software packages. The study designed a Likert-scale questionnaire and distributed it among 210 experts in the field of open source software package development. Cronbach's alpha was calculated as 0.93, well above the minimum acceptable level. Using Pearson correlation as well as stepwise regression analysis, the study identifies the three most important groups of factors: fundamental issues, and factors arising during and after implementation of open source software development. The study also finds a positive and strong relationship between fundamental factors and after-implementation factors (r = 0.9006, Sig. = 0.000).

  19. An Ada Linear-Algebra Software Package Modeled After HAL/S

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    This package enables new avionics software to be written more easily. It extends the Ada programming language to include linear-algebra capabilities similar to those of the HAL/S programming language, and is designed for such avionics applications as Space Station flight software. In addition to the built-in functions of HAL/S, the package incorporates the quaternion functions used in the Space Shuttle and Galileo projects and routines from LINPACK for solving systems of equations involving general square matrices. It contains two generic programs: one for floating-point computations and one for integer computations. Written on an IBM/AT personal computer running under PC DOS, v.3.1.
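
    The quaternion routines such a package provides boil down to the Hamilton product and the rotation q v q*. The Python sketch below (with an assumed (w, x, y, z) component order) illustrates both; it is not the Ada package's API.

```python
# Hedged sketch: Hamilton product and quaternion rotation of a vector.
import numpy as np

def qmul(a, b):
    """Hamilton product a*b, each quaternion as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * (0, v) * conj(q)."""
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])       # conjugate
    return qmul(qmul(q, np.concatenate(([0.0], v))), qc)[1:]

# 90-degree rotation about z: q = (cos 45, 0, 0, sin 45) sends x-hat to y-hat.
q = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(np.round(rotate(q, np.array([1.0, 0.0, 0.0])), 6))
```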

  20. High Performance Microaccelerometer with Wafer-level Hermetic Packaged Sensing Element and Continuous-time BiCMOS Interface Circuit

    Ko, Hyoungho; Park, Sangjun; Paik, Seung-Joon; Choi, Byoung-doo; Park, Yonghwa; Lee, Sangmin; Kim, Sungwook; Lee, Sang Chul; Lee, Ahra; Yoo, Kwangho; Lim, Jaesang; Cho, Dong-il

    2006-01-01

    A microaccelerometer with highly reliable, wafer-level packaged MEMS sensing element and fully differential, continuous time, low noise, BiCMOS interface circuit is fabricated. The MEMS sensing element is fabricated on a (111)-oriented SOI wafer by using the SBM (Sacrificial/Bulk Micromachining) process. To protect the silicon structure of the sensing element and enhance the reliability, a wafer level hermetic packaging process is performed by using a silicon-glass anodic bonding process. The interface circuit is fabricated using 0.8 μm BiCMOS process. The capacitance change of the MEMS sensing element is amplified by the continuous-time, fully-differential transconductance input amplifier. A chopper-stabilization architecture is adopted to reduce low-frequency noise including 1/f noise. The fabricated microaccelerometer has the total noise equivalent acceleration of 0.89 μg/√Hz, the bias instability of 490 μg, the input range of ±10 g, and the output nonlinearity of ±0.5 %FSO

  1. SIMODIS - a software package for simulating nuclear reactor components

    Guimaraes, Lamartine; Borges, Eduardo M.

    2000-01-01

    This paper presents the initial development effort in building a nuclear reactor component simulation package. The package was developed for use in the MATLAB simulation environment. It combines the graphical capabilities of MATLAB with the advantages of compiled languages such as FORTRAN and C++. From MATLAB it takes the facilities for displaying calculated results; from the compiled languages it takes processing speed. So far, models of the reactor core, UTSG and OTSG have been developed, together with a series of user-friendly graphical interfaces for these models. As a by-product, a set of water and sodium thermophysical property routines has been developed; these may be called directly as MATLAB functions or from within a model as part of its calculations. The whole set was named SIMODIS, which stands for SIstema MODular Integrado de Simulacao. (author)

  2. The quality and testing PH-SFT infrastructure for the external LHC software packages deployment

    CERN. Geneva; MENDEZ LORENZO, Patricia; MATO VILA, Pere

    2015-01-01

    The PH-SFT group is responsible for the build, test, and deployment of the set of external software packages used by the LHC experiments. This set includes ca. 170 packages, including Grid packages and Monte Carlo generators provided in different versions. A complete build structure has been established to guarantee the quality of the packages provided by the group. This structure includes an experimental build and three daily nightly builds, each dedicated to a specific ROOT version: v6.02, v6.04, and the master. While the former build is dedicated to testing new packages, versions and dependencies (basically for SFT-internal use), the latter three are responsible for deployment to AFS of the set of stable, well-tested packages requested by the LHC experiments, so they can apply their own builds on top. In all cases, a c...

  3. A process control software package for the SRS

    Atkins, V.R.; Poole, D.E.; Rawlinson, W.R.

    1980-03-01

    The development of software to give high-level access from application programs for monitoring and control of the Daresbury Synchrotron Radiation Source on a network-wide basis is described. The design and implementation of the control system database, a special supervisor call and 'executive'-type task handling of all process input/output services for the 7/32 (which runs under OS/32-MT), and process control 'device driver' software for the 7/16 (run under L5/16-MT) are included. (UK)

  4. Evaluation of open source data mining software packages

    Bonnie Ruefenacht; Greg Liknes; Andrew J. Lister; Haans Fisk; Dan Wendt

    2009-01-01

    Since 2001, the USDA Forest Service (USFS) has used classification and regression-tree technology to map USFS Forest Inventory and Analysis (FIA) biomass, forest type, forest type groups, and National Forest vegetation. This prior work used Cubist/See5 software for the analyses. The objective of this project, sponsored by the Remote Sensing Steering Committee (RSSC),...

  5. Analysis of isothiazolinone biocides in paper for food packaging by ultra-high-performance liquid chromatography-tandem mass spectrometry.

    Lin, Q-B; Wang, T-J; Song, H; Li, B

    2010-12-01

    A novel and simple method to detect isothiazolinone-type biocides (2-methyl-3-isothiazolinone (MI), 5-chloro-2-methyl-3-isothiazolinone (CMI), 1,2-benzisothiazolinone (BIT) and 2-octyl-3-isothiazolinone (OIT)) in paper used for food packaging by ultrasonic extraction coupled with UPLC-MS/MS was developed. Parameters affecting process efficiency, such as extraction solvents, UPLC mobile phase, gradient elution procedure and MS/MS conditions, were studied to optimise the operating conditions. Using the optimised gradient elution procedure, the retention time was less than 6 min. The limits of detection (LODs) were found to be between 0.001 and 0.010 mg kg⁻¹, which was validated using actual concentrations. After diluting the standard solution with a blank matrix, the linear calibration ranges were 0.002-1.000 mg kg⁻¹ for BIT and OIT, 0.005-1.000 mg kg⁻¹ for MI, and 0.020-1.000 mg kg⁻¹ for CMI, with correlation coefficients higher than 0.9985 (n = 6). Good precision was obtained, with mean recoveries greater than 81.3% and relative standard deviations (RSD) below 6.2%. A methodology has thus been proposed for the analysis of isothiazolinones in paper.
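
    The calibration arithmetic behind figures like these (linear range, correlation coefficient, LOD) can be sketched in a few lines of numpy. The standards and peak areas below are synthetic, and the 3.3*sigma/slope rule is just one common LOD convention, not necessarily the one used in the study.

```python
# Hedged sketch: linear calibration curve, correlation coefficient, and a
# residual-based LOD estimate on made-up data.
import numpy as np

conc = np.array([0.002, 0.01, 0.05, 0.2, 0.5, 1.0])   # standards, mg/kg
area = np.array([41.0, 205.0, 1010.0, 4050.0, 10100.0, 20300.0])  # peak areas

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]

# One common LOD convention: 3.3 * (residual SD) / slope.
resid = area - (slope * conc + intercept)
sd = resid.std(ddof=2)
print(f"slope={slope:.1f}, r={r:.5f}, LOD ~ {3.3 * sd / slope:.4f} mg/kg")
```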

  6. A Relative Comparison of Leading Supply Chain Management Software Packages

    Zhongxian Wang; Ruiliang Yan; Kimberly Hollister; Ruben Xing

    2009-01-01

    Supply Chain Management (SCM) has proven to be an effective tool that aids companies in the development of competitive advantages. SCM systems are relied on to manage warehouses, transportation, trade logistics and various other issues concerning the coordinated movement of products and services from suppliers to customers. Although numerous supply chain solution tools are readily available to companies in today's fast-paced business environment, choosing the right SCM software is not an easy task...

  7. PALSfit3: A software package for analysing positron lifetime spectra

    Kirkegaard, Peter; Olsen, Jens V.; Eldrup, Morten Mostgaard

    The present report describes a Windows-based computer program called PALSfit3. The purpose of the program is to carry out analyses of spectra that have been measured by positron annihilation lifetime spectroscopy (PALS). PALSfit3 is based on the well-tested PATFIT and PALSfit programs, which hav... in a text window. PALSfit3 is verified on Windows XP and Windows 7, 8 and 10. The PALSfit3 software can be acquired from the Technical University of Denmark (http://PALSfit.dk)...
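
    At its core, PALS analysis fits a measured spectrum with a sum of decaying exponential lifetime components. The sketch below fits a two-component synthetic decay with SciPy's curve_fit, ignoring the instrument resolution function and source corrections that PALSfit3 actually handles; all numbers are invented.

```python
# Hedged sketch: fit a two-component exponential lifetime spectrum.
import numpy as np
from scipy.optimize import curve_fit

def spectrum(t, i1, tau1, i2, tau2):
    """Two decaying exponentials: intensities i, lifetimes tau (ns)."""
    return i1 * np.exp(-t / tau1) + i2 * np.exp(-t / tau2)

t = np.linspace(0, 5.0, 200)                      # time axis, ns
rng = np.random.default_rng(2)
truth = (8000.0, 0.18, 2000.0, 0.40)              # synthetic "true" spectrum
counts = spectrum(t, *truth) + rng.normal(0, 20, t.size)

p0 = (5000.0, 0.1, 1000.0, 0.5)                   # starting guesses
popt, pcov = curve_fit(spectrum, t, counts, p0=p0)
print("fitted lifetimes (ns):", popt[1], popt[3])
```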

  8. Development of a software package for solid-angle calculations using the Monte Carlo method

    Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai

    2014-01-01

    Solid-angle calculations play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources, and they are often complicated. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, into which a new type of variance reduction technique has been integrated. The package, developed under Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface with a visualization function built on OpenGL. One advantage of the proposed software package is that it can calculate, without difficulty, the solid angle subtended by a detector with different geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) at a point, circular or cylindrical source. The results obtained from the proposed software package were compared with those of previous studies and with Geant4 calculations, showing that the package produces accurate solid-angle values at a greater computation speed than Geant4. -- Highlights: • This software package (SAC) gives accurate solid-angle values. • SAC calculates solid angles using the Monte Carlo method and has higher computation speed than Geant4. • A simple but effective variance reduction technique put forward by the authors is applied in SAC. • A visualization function and a graphical user interface are also integrated in SAC.
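
    The basic Monte Carlo estimate that such a package generalizes can be shown in a few lines: sample isotropic directions from a point source and count the fraction that hit a coaxial disk detector. The sketch below checks the estimate against the on-axis closed form; it is a toy version without the paper's variance reduction.

```python
# Hedged sketch: Monte Carlo solid angle of a coaxial disk detector
# seen from an on-axis point source, vs the closed-form answer.
import numpy as np

def solid_angle_disk_mc(radius, distance, n=1_000_000, seed=3):
    rng = np.random.default_rng(seed)
    cos_t = rng.uniform(-1.0, 1.0, n)        # isotropic: cos(theta) uniform
    fwd = cos_t > 0                          # rays heading toward the disk
    sin_t = np.sqrt(1.0 - cos_t[fwd] ** 2)
    r_plane = distance * sin_t / cos_t[fwd]  # radial offset at the disk plane
    hits = np.count_nonzero(r_plane <= radius)
    return 4.0 * np.pi * hits / n            # hit fraction of the full sphere

exact = 2.0 * np.pi * (1.0 - 1.0 / np.sqrt(1.0 + (2.0 / 5.0) ** 2))
print(f"MC: {solid_angle_disk_mc(2.0, 5.0):.4f} sr, exact: {exact:.4f} sr")
```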

  9. InterFace: A software package for face image warping, averaging, and principal components analysis.

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
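
    The "face space" produced by PCA can be sketched with plain numpy: center the vectorized images, take an SVD, and treat the right singular vectors as axes. Random arrays stand in for aligned face images below; InterFace's actual pipeline (landmark-based warping before PCA) is more involved.

```python
# Hedged sketch: PCA "face space" via SVD on vectorized images.
import numpy as np

rng = np.random.default_rng(4)
faces = rng.normal(size=(50, 32 * 32))     # 50 images, flattened to vectors

mean_face = faces.mean(axis=0)
centered = faces - mean_face
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

components = Vt                  # principal axes of the "face space"
coords = centered @ Vt.T         # each face's coordinates in that space

# Reconstruct a face from its first k coordinates.
k = 10
approx = mean_face + coords[0, :k] @ components[:k]
err = np.linalg.norm(faces[0] - approx) / np.linalg.norm(faces[0])
print(f"relative reconstruction error with {k} components: {err:.3f}")
```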

  10. Maximize Your Investment 10 Key Strategies for Effective Packaged Software Implementations

    Beaubouef, Grady Brett

    2009-01-01

    This is a handbook covering ten principles for packaged software implementations that project managers, business owners, and IT developers should pay attention to. The book also has practical real-world coverage including a sample agenda for conducting business solution modeling, customer case studies, and a road map to implement guiding principles. This book is aimed at enterprise architects, development leads, project managers, business systems analysts, business systems owners, and anyone who wants to implement packaged software effectively. If you are a customer looking to implement COTS s

  11. Development of 'Enhance reconstruction package' software for whole-body PET

    Mizuta, Tetsuro; Imanishi, Tatsuru; Ishikawa, Akihiro

    2011-01-01

    We have developed the 'Enhance Reconstruction Package' software for our Eminence series of whole-body positron emission tomography (PET) scanners. This package improves image quality and streamlines the workflow in clinical PET and PET/CT studies. The present paper describes an outline of the applications for data collection, normalization, etc., and also reports some PET images obtained by the software. The signal-to-noise ratio was optimized in the phantom study, leading to improved image quality. The real-time display tool and the remote control tool enhance operability in the routine workflow. (author)

  12. Vertical bone measurements from cone beam computed tomography images using different software packages

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

    This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  14. Overview of the MCU Monte Carlo software package

    Kalugin, M.A.; Oleynik, D.S.; Shkarovsky, D.A.

    2013-01-01

    MCU (Monte Carlo Universal) is a project on the development and practical use of a universal computer code for simulation of particle transport (neutrons, photons, electrons, positrons) in three-dimensional systems by means of the Monte Carlo method. This paper provides information on the current state of the project. The developed libraries of constants are briefly described, and the capabilities of the MCU-5 package modules and the executable codes compiled from them are characterized. Examples of important problems of reactor physics solved with the code are presented. It is shown that the MCU constructor tool is able to assemble a full-scale 3D model from templates describing single components, using a simple and intuitive graphical user interface. The templates are prepared by a skilled user and stored in the constructor's template library; an ordinary user works with the graphical user interface and does not deal with MCU input data directly. At present, there are template libraries for several types of reactors.

  15. In-field inspection support software: A status report on the Common Inspection On-site Software Package (CIOSP) project

    Novatchev, Dimitre; Titov, Pavel; Siradjov, Bakhtiiar; Vlad, Ioan; Xiao Jing

    2001-01-01

    IAEA has invested much thought and effort into developing software that can assist inspectors during their inspection work. Experience with such applications has been steadily growing, and IAEA has recently commissioned a next-generation software package. Software of this kind must accommodate inspection tasks that vary substantially in function depending on the type of installation being inspected, while ensuring that the resulting package has a wide range of usability and precludes excessive development of plant-specific applications. The Common Inspection On-site Software Package is being developed in the Department of Safeguards to address the limitations of the existing software and to expand its coverage of the inspection process. CIOSP is 'common' in that it aims to support as many facilities as possible with minimum re-configuration, while catering to the varying needs of individual facilities and the different instrumentation and verification methods used. A component-based approach was taken to tackle the challenges that the development of this software presented. CIOSP consists of the following major components: a framework into which individual plug-ins supporting various inspection activities can integrate at run-time; a central data store containing all facility configuration data and all data collected during inspections; a local data store, residing on the inspector's computer, where the current inspection's data is stored; and a set of services used by all plug-ins (i.e. data transformation, authentication, replication services etc.). This architecture allows for incremental development and extension of the software with plug-ins that support individual inspection activities. The core set of components, along with the framework and the Inventory Verification, Book Examination and Records and Reports Comparison plug-ins, have been developed. The development of the Short Notice Random

  16. PyPedal, an open source software package for pedigree analysis

    The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...

  17. Comparison of four software packages for CT lung volumetry in healthy individuals

    Nemec, Stefan F. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Molinari, Francesco [Centre Hospitalier Regional Universitaire de Lille, Department of Radiology, Lille (France); Dufresne, Valerie [CHU de Charleroi - Hopital Vesale, Pneumologie, Montigny-le-Tilleul (Belgium); Gosset, Natacha [CHU Tivoli, Service d' Imagerie Medicale, La Louviere (Belgium); Silva, Mario; Bankier, Alexander A. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States)

    2015-06-01

    To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. (orig.)

  18. Software package for modeling spin-orbit motion in storage rings

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  20. MEASURE/ANOMTEST. Anomaly detection software package for the Dodewaard power plant facility. Supplement 1. Extension of measurement analysis part, addition of plot package

    Schoonewelle, H.

    1995-01-01

    The anomaly detection software package installed at the Dodewaard nuclear power plant has been revised with respect to its measurement-analysis part, and a plot package has been added. Signals in which an anomaly has been detected are automatically plotted, including their uncertainty margins. This report describes the revised measurement-analysis part and the plot package. Each new routine of the plot package is described briefly, and the new input and output files are given. (orig.)

  1. Determination of Polymer Additives-Antioxidants, Ultraviolet Stabilizers, Plasticizers and Photoinitiators in Plastic Food Package by Accelerated Solvent Extraction Coupled with High-Performance Liquid Chromatography.

    Li, Bo; Wang, Zhi-Wei; Lin, Qin-Bao; Hu, Chang-Ying; Su, Qi-Zhi; Wu, Yu-Mei

    2015-07-01

    An analytical method for the quantitative determination of 4 antioxidants, 9 ultraviolet (UV) stabilizers, 12 phthalate plasticizers and 2 photoinitiators in plastic food packaging using accelerated solvent extraction (ASE) coupled with high-performance liquid chromatography-photodiode array detection (HPLC-PDA) has been developed. Parameters affecting process efficiency, such as extraction and chromatographic conditions, were studied in order to determine the operating conditions. The ASE-HPLC method showed good linearity with good correlation coefficients (R ≥ 0.9833). The limits of detection and quantification were between 0.03 and 0.30 µg mL⁻¹ and between 0.10 and 1.00 µg mL⁻¹ for the 27 analytes. Average spiked recoveries for most analytes were >70.4% at the 10, 20 and 40 µg g⁻¹ spiked levels, except for UV-9 and Irganox 1010 (58.6 and 64.0%, respectively, at the 10 µg g⁻¹ level); relative standard deviations ranged from 0.4 to 15.4%. The methodology has been proposed for the analysis of 27 polymer additives in plastic food packaging.

  2. A high performance micro-pressure sensor based on a double-ended quartz tuning fork and silicon diaphragm in atmospheric packaging

    Cheng, Rongjun; Li, Cun; Zhao, Yulong; Li, Bo; Tian, Bian

    2015-01-01

    A resonant micro-pressure sensor based on a double-ended quartz tuning fork (DEQTF) and a bossed silicon diaphragm in atmospheric packaging is presented. To achieve vacuum-free packaging with a high quality factor, the DEQTF is designed to resonate in an in-plane anti-phase vibration mode, which is subject only to slide-film damping. The feasibility is demonstrated by theoretical analysis and finite element simulation. The dimensions of the DEQTF and diaphragm are optimized to improve sensitivity and minimize energy dissipation. The sensor chip is fabricated using quartz and silicon micromachining technologies, and simply packaged in a stainless steel shell at standard atmosphere. An experimental setup is established for calibration, in which an additional sensor prototype without a pressure port is introduced as a frequency reference. By detecting the frequency difference between the tested sensor and the reference sensor, the influence of environmental factors such as temperature and shocks on measuring accuracy is effectively eliminated. Under the action of a self-excitation circuit, static performance is obtained. The sensitivity of the sensor is 299 kHz kPa⁻¹ in the operating range of 0-10 kPa at room temperature. Testing results show a nonlinearity of 0.0278% FS, a hysteresis of 0.0207% FS and a repeatability of 0.0375% FS. The results indicate that the proposed sensor has favorable features, providing a cost-effective and high-performance approach for low pressure measurement. (paper)

  3. NEAMS Software Licensing, Release, and Distribution: Implications for FY2013 Work Package Planning

    Bernholdt, David E.

    2012-01-01

    The vision of the NEAMS program is to bring truly predictive modeling and simulation (M&S) capabilities to the nuclear engineering community in order to enable a new approach to the analysis of nuclear systems. NEAMS anticipates issuing in FY 2018 a full release of its computational 'Fermi Toolkit' aimed at advanced reactors and fuel cycles. The NEAMS toolkit involves extensive software development activities, some of which have already been underway for several years; however, the Advanced Modeling and Simulation Office (AMSO), which sponsors the NEAMS program, has not yet issued any official guidance regarding software licensing, release, and distribution policies. This motivated an FY12 task in the Capability Transfer work package to develop and recommend an appropriate set of policies. The current preliminary report is intended to provide awareness of issues with implications for work package planning for FY13. We anticipate a small amount of effort associated with putting in place formal licenses and contributor agreements for NEAMS software that does not already have them. We do not anticipate any additional effort or costs associated with software release procedures or schedules beyond those dictated by the quality expectations for the software. The largest potential costs we anticipate are associated with the setup and maintenance of shared code repositories for development and early access to NEAMS software products. We also anticipate an opportunity, with modest associated costs, to work with the Radiation Safety Information Computational Center (RSICC) to clarify export control assessment policies for software under development.

  4. System Software and Tools for High Performance Computing Environments: A report on the findings of the Pasadena Workshop, April 14--16, 1992

    Sterling, T. [Universities Space Research Association, Washington, DC (United States); Messina, P. [Jet Propulsion Lab., Pasadena, CA (United States); Chen, M. [Yale Univ., New Haven, CT (United States)] [and others

    1993-04-01

    The Pasadena Workshop on System Software and Tools for High Performance Computing Environments was held at the Jet Propulsion Laboratory from April 14 through April 16, 1992. The workshop was sponsored by a number of Federal agencies committed to the advancement of high performance computing (HPC) both as a means to advance their respective missions and as a national resource to enhance American productivity and competitiveness. Over a hundred experts in related fields from industry, academia, and government were invited to participate in this effort to assess the current status of software technology in support of HPC systems. The overall objectives of the workshop were to understand the requirements and current limitations of HPC software technology and to contribute to a basis for establishing new directions in research and development for software technology in HPC environments. This report includes reports written by the participants of the workshop's seven working groups. Materials presented at the workshop are reproduced in appendices. Additional chapters summarize the findings and analyze their implications for future directions in HPC software technology development.

  5. Computational Fluid Dynamics (CFD) Computations With Zonal Navier-Stokes Flow Solver (ZNSFLOW) Common High Performance Computing Scalable Software Initiative (CHSSI) Software

    Edge, Harris

    1999-01-01

    This report describes the Common High Performance Computing Scalable Software Initiative (CHSSI) computational fluid dynamics (CFD) 6 project. Under the project, a proven zonal Navier-Stokes solver was rewritten for scalable parallel performance on both shared memory and distributed memory high performance computers...

  6. Global review of open access risk assessment software packages valid for global or continental scale analysis

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcano, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined; commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and open access. This process was used to select a subset of 31 models, comprising 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models, for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models against over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. Volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS), and leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user...

  7. Strategy and Software Application of Fresh Produce Package Design to Attain Optimal Modified Atmosphere

    Dong Sun Lee

    2014-01-01

    Modified atmosphere packaging of fresh produce relies on attaining the desired gas concentrations inside the package, which result from product respiration and the package's gas transfer. A systematic package-design method to achieve the target modified atmosphere was developed and implemented as software, in terms of selecting the most appropriate film, microperforations, and/or CO2 scavenger. It incorporates modeling and/or database construction for produce respiration, gas transfer across the plastic film and microperforations, and CO2 absorption by the scavenger. The optimization algorithm first selects the packaging film and/or microperforations to reach the target O2 concentration in response to respiration, and then tunes the CO2 concentration with the CO2 absorber when it rises above its tolerance limit. The method was tested on green pepper, strawberry, and king oyster mushroom packages and shown to be effective for package design; the results were consistent with the literature and with the experimental atmospheres.
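
    The design calculation at the heart of such software is a steady-state gas balance: at equilibrium, the product's O2 consumption equals O2 ingress through the film. The sketch below solves that balance for the in-pack O2 level; all property values are invented placeholders, not the paper's data or model.

```python
# Hedged sketch: steady-state O2 balance for modified-atmosphere packaging,
#   respiration * weight = permeance * area * (O2_out - O2_in)
# solved for the equilibrium in-pack O2 concentration.

def equilibrium_o2(r_o2, weight_kg, permeance, area_m2, o2_out=20.9):
    """In-pack O2 (%) when respiration (mL/kg/h) balances film ingress.

    permeance: film O2 transmission per unit area and per % O2 gradient,
    in mL/(m^2 * h * %O2), with film thickness already folded in.
    """
    o2_in = o2_out - (r_o2 * weight_kg) / (permeance * area_m2)
    return max(o2_in, 0.0)   # a gradient above ~21% means anaerobic risk

# A 0.3 kg pack of produce respiring at 15 mL O2/kg/h in a 0.05 m^2 pouch:
for permeance in (10.0, 25.0, 100.0):
    print(f"permeance {permeance:5.1f}: equilibrium O2 ~ "
          f"{equilibrium_o2(15.0, 0.3, permeance, 0.05):.1f}%")
```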

  8. Recent developments on PLASMAKIN - a software package to model the kinetics in gas discharges

    Pinhao, N R

    2009-01-01

    PLASMAKIN is a user-friendly software package for handling the physical and chemical data used in plasma physics modeling and for computing the production and destruction terms in fluid-model equations. These terms account for particle or energy production and loss rates due to gas-phase and gas-surface reactions. The package has been restructured and expanded to (a) allow the simulation of atomic emission spectra taking into account line broadening processes and radiation trapping; (b) include a library to compute the electron kinetics; (c) include a database of species properties and reactions; and (d) include a Python interface to allow access from scripts and integration with other scientific software tools.

  9. Three dimensional field computation software package DE3D and its applications

    Fan Mingwu; Zhang Tianjue; Yan Weili

    1992-07-01

    A software package, DE3D, that runs on a PC for three-dimensional electrostatic and magnetostatic field analysis has been developed at CIAE (China Institute of Atomic Energy). A two-scalar-potential method and special numerical techniques give the code high precision. It can be used for electrostatic and magnetostatic field computations with complex boundary conditions. In most cases the accuracy of the results is better than 1% compared with measurements; in some situations the results are more acceptable than those of other codes because of special handling of the current integral. Typical examples given in the paper, the design of a cyclotron magnet and of magnetic elements on its beam transport line, show how the program helps the designer improve the product. The software package could bring advantages to producers and designers.

  10. Novel applications of the x-ray tracing software package McXtrace

    Bergbäck Knudsen, Erik; Nielsen, Martin Meedom; Haldrup, Kristoffer

    2014-01-01

    We will present examples of applying the X-ray tracing software package McXtrace to different kinds of X-ray scattering experiments, focusing in particular on time-resolved experiments. Simulations of full-scale experiments are particularly useful for this kind, especially when... some of the issues encountered. Generally more than one or all of these effects are present at once. Simulations can in these cases be used to identify distinct footprints of such distortions and thus give the experimenter a means of deconvolving them from the signal. We will present a study... of this kind along with the newest developments of the McXtrace software package...

  11. The GeoSteiner software package for computing Steiner trees in the plane

    Juhl, Daniel; Warme, David M.; Winter, Pawel

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner... approach, allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances... from the 2000 study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base and the commercial GeoSteiner 4.0 code base...

  12. ACEMAN (II): a PDP-11 software package for acoustic emission analysis

    Tobias, A.

    1976-01-01

    A powerful but easy-to-use software package (ACEMAN) for acoustic emission analysis has been developed at Berkeley Nuclear Laboratories. The system is based on a PDP-11 minicomputer with 24K of memory, an RK05 disk drive and a Tektronix 4010 graphics terminal. The operation of the system is described in detail in terms of the functions performed in response to the various command mnemonics. The ACEMAN software package offers many useful facilities not found on other acoustic emission monitoring systems. Its main features, many of which are unique, are summarised. The ACEMAN system automatically handles arrays of up to 12 sensors in real-time operation, during which data are acquired, analysed, stored on the computer disk for future analysis and displayed on the terminal if required. (author)

  13. WannierTools: An open-source software package for novel topological materials

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface-state spectrum that is detected in angle-resolved photoemission (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
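
    The kind of calculation WannierTools automates for realistic Wannier Hamiltonians can be illustrated on a two-band toy model: the sketch below computes a Chern number with the discretized Berry-curvature (Fukui-Hatsugai-Suzuki) link-variable method, using the Qi-Wu-Zhang model as an assumed stand-in for a real material.

```python
# Hedged sketch: Chern number of the lower band of the Qi-Wu-Zhang model
# via the Fukui-Hatsugai-Suzuki plaquette method on a discrete BZ grid.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def h(kx, ky, u=-1.0):
    return np.sin(kx) * sx + np.sin(ky) * sy + (u + np.cos(kx) + np.cos(ky)) * sz

def lower_band_state(kx, ky):
    _, vecs = np.linalg.eigh(h(kx, ky))
    return vecs[:, 0]                     # eigenvector of the lower band

N = 40
ks = np.linspace(0, 2 * np.pi, N, endpoint=False)
states = [[lower_band_state(kx, ky) for ky in ks] for kx in ks]

chern = 0.0
for i in range(N):
    for j in range(N):
        a = states[i][j]
        b = states[(i + 1) % N][j]
        c = states[(i + 1) % N][(j + 1) % N]
        d = states[i][(j + 1) % N]
        # Gauge-invariant Berry phase around one plaquette from link products
        chern += np.angle(np.vdot(a, b) * np.vdot(b, c)
                          * np.vdot(c, d) * np.vdot(d, a))
print("Chern number:", round(chern / (2 * np.pi)))   # expect +/-1 for 0<|u|<2
```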

  14. Determination of stress-strain state of the wooden church log walls with software package

    Chulkova Anastasia

    2016-01-01

    Restoration of architectural monuments is under way all over the world. Its main aim is to restore the stable functioning of building structures. In this article, we use a software package to determine the bearing capacity of the log walls of the Church of the Transfiguration on Kizhi Island. The results show that determination of the stress-strain state with a software package is necessary for bearing-capacity computation, alongside field tests.

  15. PsyToolkit: a software package for programming psychological experiments using Linux.

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the GNU General Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light-emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  16. Evaluation of Different Software Packages in Flow Modeling under Bridge Structures

    Mohammad Taghi Dastorani

    2007-01-01

    Full Text Available This study is an independent, comparative investigation of the accuracy, capability and suitability of three well-known hydraulic river modeling packages, ISIS, MIKE 11 and HEC-RAS, for modeling flow through bridges. The research project was designed to assess the ability of each software package to model the flow through bridge structures. It was carried out using data from experiments performed in a 22-meter laboratory flume at the University of Birmingham. The flume has a compound cross section containing a main channel and two flood plains on either side; a smooth main channel and smooth floodplains have been assumed. Two types of bridges were modeled in this research: a multiple-opening semi-circular arch bridge and a single-opening straight-deck bridge. For each bridge, two different simulations were carried out using two different upstream boundary conditions, giving low-flow and high-flow simulations. According to the results, all three packages were able to model the arch and US BPR bridges, but in some cases they produced different results. The highest water elevation upstream of the bridge (maximum afflux) was the main parameter compared with the measured values. ISIS and HEC-RAS (especially HEC-RAS) appear to be more efficient for modeling the arch bridge. However, in some cases, MIKE 11 produced considerably higher results than the other two packages. For the US BPR bridge, all three packages produced reasonable results, with the HEC-RAS results agreeing best with the experimental data.

  17. Quantitation of magnetic resonance spectroscopy signals: the jMRUI software package

    Stefan, D.; Di Cesare, F.; Andrasescu, A.; Popa, E.; Lazariev, A.; Vescovo, E.; Štrbák, Oliver; Williams, S.; Starčuk jr., Zenon; Cabanas, M.; van Ormondt, D.; Graveron-Demilly, D.

    2009-01-01

    Vol. 20, No. 10 (2009), 104035:1-9. ISSN 0957-0233. Grant - others: EC 6FP(XE) MRTN-CT-2006-035801. Source of funding: R - EC framework project. Keywords: MR spectroscopy; MRS; MRSI; HRMAS-NMR; jMRUI software package; Java; plug-ins; quantitation. Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering. Impact factor: 1.317, year: 2009

  18. Implementation of the INSPECT software package for statistical calculation in nuclear material accountability

    Marzo, M.A.S.

    1986-01-01

    The INSPECT software package was developed at the Pacific Northwest Laboratory for statistical calculations in nuclear material accountability. The programs apply the inspection and evaluation methodology described in Part of the Safeguards Technical Manual. In this paper, the implementation of INSPECT at the Safeguards Division of CNEN is described, together with the main characteristics of INSPECT. The potential applications of INSPECT to nuclear material accountability are presented. (Author) [pt

  19. Software packages for simulating groundwater flow and the spreading of soluble and insoluble admixtures in aquifers

    Roshal, A.A.; Klein, I.S.; Svishchov, A.M.

    1993-01-01

    Software packages are described that are designed for solving hydrogeological and environmental problems related to the analysis and prediction of groundwater flow and of the spreading of solutes and insolubles in saturated zones. The software package GWFS (Ground Water Flow Simulation) allows simulating steady-state and unsteady-state flow in confined, unconfined, and confined-unconfined multi-layer and quasi-3D isotropic and anisotropic aquifer systems. It considers intra-layer sources and sinks, infiltration, inter-layer leakage, interrelationships with surface reservoirs and streams, interrelationships with drains, and aquifer discharge to surface sources. The MTS (Mass Transport Simulation) package is designed for solving solute transport problems. It takes into account convective transport, hydrodynamic dispersion and diffusion, and linear equilibrium sorption. The method of characteristics is implemented using a "particles-in-cells" scheme in which the transport is modeled with the help of tracers. The software package OWFS (Oil-Water Flow Simulation) is designed for the simulation of hydrocarbon (oil-water) migration in aquifers.
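    For orientation, the solute transport problem that MTS addresses is commonly written as the advection-dispersion equation with linear equilibrium sorption folded into a retardation factor (a standard textbook form, not a quotation from the package documentation):

        \[
        R\,\frac{\partial C}{\partial t} = \nabla \cdot (\mathbf{D}\,\nabla C) - \mathbf{v} \cdot \nabla C, \qquad
        R = 1 + \frac{\rho_b K_d}{\theta},
        \]

    where C is the solute concentration, v the pore velocity, D the hydrodynamic dispersion tensor (including diffusion), ρ_b the bulk density, K_d the linear sorption coefficient and θ the porosity. The method of characteristics follows the advective term with moving tracer particles, leaving the dispersive term to be handled on the grid.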

  20. MEGADOCK 4.0: an ultra-high-performance protein-protein docking software for heterogeneous supercomputers.

    Ohue, Masahito; Shimoda, Takehiro; Suzuki, Shuji; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka

    2014-11-15

    The application of protein-protein docking in large-scale interactome analysis is a major challenge in structural bioinformatics and requires huge computing resources. In this work, we present MEGADOCK 4.0, an FFT-based docking software that makes extensive use of recent heterogeneous supercomputers and shows powerful, scalable performance of >97% strong scaling. MEGADOCK 4.0 is written in C++ with OpenMPI and NVIDIA CUDA 5.0 (or later) and is freely available to all academic and non-profit users at: http://www.bi.cs.titech.ac.jp/megadock. akiyama@cs.titech.ac.jp Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
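    The FFT-based scoring underlying docking codes of this family follows the grid-correlation idea of Katchalski-Katzir et al.: both molecules are discretized onto 3D grids, and the correlation theorem evaluates the overlap score for all relative translations at once. A minimal NumPy sketch of that general technique follows (illustrative only; this is not MEGADOCK's C++/CUDA implementation):

      import numpy as np

      def fft_correlation_scores(receptor_grid, ligand_grid):
          """Score every relative translation of two voxel grids at once.

          A naive translation scan is O(N^6) for N^3 voxels; the
          correlation theorem reduces it to three FFTs.
          """
          R = np.fft.fftn(receptor_grid)
          L = np.fft.fftn(ligand_grid)
          return np.real(np.fft.ifftn(R * np.conj(L)))

      # Toy usage: random grids standing in for discretized molecules.
      scores = fft_correlation_scores(np.random.rand(32, 32, 32),
                                      np.random.rand(32, 32, 32))
      best_shift = np.unravel_index(np.argmax(scores), scores.shape)

    In real docking, sampled rotations of the ligand are layered on top of this translational scan.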

  1. An open-source software package for multivariate modeling and clustering: applications to air quality management.

    Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong

    2015-09-01

    This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.

  2. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  3. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Rozanov, V.V.; Dinter, T.; Rozanov, A.V.; Wolanin, A.; Bracher, A.; Burrows, J.P.

    2017-01-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean–atmosphere radiative transfer solver presented by Rozanov et al., we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen (http://www.iup.physik.uni-bremen.de). - Highlights: • A new version of the software package SCIATRAN is presented. • Inelastic scattering in water and atmosphere is implemented in SCIATRAN. • Raman scattering and fluorescence can be included in radiative transfer calculations. • Comparisons to other radiative transfer models show excellent agreement. • Comparisons to observations show consistent results.

  4. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
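    The theoretical isotope distribution modelling mentioned in the abstract can be illustrated with a toy calculation: the isotope envelope of a molecule is the convolution of per-element isotope patterns. This is a generic sketch of the idea (not Decon2LS's C# code; the natural-abundance values are approximate):

      import numpy as np

      def isotope_envelope(composition):
          """Convolve per-element isotope patterns into a molecular envelope.

          composition: list of (atom_count, [abundance at +0 Da, +1 Da, ...]).
          """
          dist = np.array([1.0])
          for n_atoms, pattern in composition:
              p = np.asarray(pattern, dtype=float)
              for _ in range(n_atoms):
                  dist = np.convolve(dist, p)
          return dist / dist.sum()

      # A 50-carbon, 10-nitrogen fragment already shows the familiar
      # monoisotopic / +1 / +2 peak pattern that deisotoping inverts.
      env = isotope_envelope([(50, [0.9893, 0.0107]), (10, [0.99636, 0.00364])])
      print(env[:4])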

  5. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  6. Software package evaluation for the TJ-II Data Acquisition System

    Cremy, C.; Sanchez, E.; Portas, A.; Vega, J.

    1996-01-01

    The TJ-II Data Acquisition System (DAS) has to provide a user interface which allows setup of the sampling channels, discharge signal visualization and reduced-data processing, all at run time. On the other hand, the DAS has to provide high-level software capabilities for signal analysis, processing and data visualization, either at run time or off line. A set of software packages, including Builder Xcessory, X-Designer, Ilog Builder, Toolmaster, AVS 5, AVS/Express, PV-WAVE and Iris Explorer, has been evaluated by the Data Acquisition Group of the Fusion Division. The software evaluation, summarized in this paper, resulted in a global solution that meets all of the DAS requirements. (Author)

  7. SWISTRACK - AN OPEN SOURCE, SOFTWARE PACKAGE APPLICABLE TO TRACKING OF FISH LOCOMOTION AND BEHAVIOUR

    Steffensen, John Fleng

    2010-01-01

    including swimming speed, acceleration and directionality of movements, as well as the examination of locomotory patterns during swimming. Swistrack, a free and downloadable software package (available from www.sourceforge.com), is widely used for tracking robots, humans and other animals. Accordingly, Swistrack can be easily adopted for the tracking of fish. Benefits associated with the free software include: • Contrast- or marker-based tracking, enabling tracking of either the whole animal or tagged marks placed upon the animal • The ability to track multiple tags placed upon an individual animal • Highly effective background subtraction algorithms and filters ensuring smooth tracking of fish • Application of tags of different colours, enabling the software to track multiple fish without the problem of track exchange between individuals • Low processing requirements, enabling tracking in real time • Further...

  8. Software package for the design and analysis of DNA origami structures

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high-resolution AFM imaging. A high yield of DNA dolphins was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license. The open architecture of the editor makes it easy for the scientific community to contribute new tools and functionalities. Documentation, tutorials and software will be made available online.

  9. Improving package structure of object-oriented software using multi-objective optimization and weighted class connections

    Amarjeet

    2017-07-01

    Full Text Available Software maintenance activities performed without following the original design decisions about the package structure usually deteriorate the quality of software modularization, leading to decay of the quality of the system. One of the main reasons for such structural deterioration is inappropriate grouping of source-code classes in software packages. To improve such grouping (the modular structure), previous researchers formulated the software remodularization problem as an optimization problem and solved it using search-based meta-heuristic techniques. These optimization approaches aimed at improving the quality metric values of the structure without considering the original package design decisions, often resulting in a totally new software modularization. An entirely changed software modularization is costly to realize and difficult for the developers/maintainers to understand. To alleviate this issue, we propose a multi-objective optimization approach that improves the modularization quality of an object-oriented system with the minimum possible movement of classes between the existing packages of the original modularization. The optimization is performed using NSGA-II, a widely accepted multi-objective evolutionary algorithm. To ensure minimum modification of the original package structure, a new approach of computing class relations using weighted strengths is proposed, as sketched below. The weights of relations among different classes are computed on the basis of the original package structure. A new objective function formulated from these weighted class relations drives the optimization toward better modularization quality while ensuring preservation of the original structure. To evaluate the proposed approach, a series of experiments was conducted over four real-world and two random software applications. The experimental results clearly indicate the effectiveness of the proposed approach.
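    A minimal sketch of the weighting idea (hypothetical names and a simplified objective; the paper's actual formulation is more elaborate): relations between classes that shared a package in the original modularization receive a larger weight, so a candidate modularization that keeps them together scores higher.

      def weight_relations(relations, original_package, boost=2.0):
          """relations: iterable of (class_a, class_b, raw_strength);
          original_package: dict mapping class name -> original package."""
          return {(a, b): s * (boost if original_package[a] == original_package[b] else 1.0)
                  for a, b, s in relations}

      def structure_preservation_score(weighted, candidate_package):
          # Sum of weighted relations kept inside one package in the candidate;
          # NSGA-II would trade this off against cohesion/coupling objectives.
          return sum(w for (a, b), w in weighted.items()
                     if candidate_package[a] == candidate_package[b])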

  10. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61], we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  11. GMATA: An Integrated Software Package for Genome-Scale SSR Mining, Marker Development and Viewing.

    Wang, Xuewen; Wang, Le

    2016-01-01

    Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNAs that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed the novel software package Genome-wide Microsatellite Analyzing Tool Package (GMATA), which integrates SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability assessment, and simultaneously displays SSR markers with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allows it to calculate faster and provide more accurate results than existing tools. The package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requires only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrated the application of GMATA to plant genomes and reveal a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the GA/TC dimer, the A/T monomer and the GCG/CGC trimer, rather than G/C-rich motifs. We also revealed that SSR count is linear in the chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar.
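    The core SSR-mining step is conceptually a tandem-repeat pattern search. A hedged Python sketch of that underlying idea (not GMATA's own code; real tools also merge overlapping hits and handle ambiguity codes):

      import re

      def find_ssrs(seq, min_motif=1, max_motif=6, min_repeats=5):
          """Report (start, motif, repeat_count) for tandem repeats of
          1-6 bp motifs repeated at least min_repeats times."""
          pattern = re.compile(
              r"(?=(([ACGT]{%d,%d})\2{%d,}))" % (min_motif, max_motif, min_repeats - 1))
          hits = []
          for m in pattern.finditer(seq.upper()):
              run, motif = m.group(1), m.group(2)
              hits.append((m.start(), motif, len(run) // len(motif)))
          return hits  # overlapping detections are kept in this sketch

      print(find_ssrs("ATGATGATGATGATGCCGAGAGAGAGAGA"))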

  12. A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.

    Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.

    The SCIATRAN 3.0 package is a result of further development of the SCIATRAN 2.x software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. A major improvement was achieved in comparison to previous software versions by adding a vector mode to the radiative transfer model. Thus, the well-established Discrete Ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. Similar to the scalar version, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new vector radiative transfer model, will be given, including remarks on its availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high-quality results.
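    For reference, the four quantities propagated in the vector mode form the Stokes vector (standard definition):

        \[
        \mathbf{S} = (I,\, Q,\, U,\, V)^{\mathsf{T}},
        \]

    where I is the total intensity, Q and U describe linear polarization and V circular polarization; in the vector radiative transfer equation the scalar phase function is replaced by a 4x4 scattering matrix acting on S.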

  13. The EQ3/6 software package for geochemical modeling: Current status

    Wolery, T.J.; Jackson, K.J.; Bourcier, W.L.; Bruton, C.J.; Viani, B.E.; Knauss, K.G.; Delany, J.M.

    1988-07-01

    EQ3/6 is a software package for modeling chemical and mineralogic interactions in aqueous geochemical systems. The major components of the package are EQ3NR (a speciation-solubility code), EQ6 (a reaction path code), EQLIB (a supporting library), and a supporting thermodynamic data base. EQ3NR calculates aqueous speciation and saturation indices from analytical data. It can also be used to calculate compositions of buffer solutions for use in laboratory experiments. EQ6 computes reaction path models of both equilibrium step processes and kinetic reaction processes. These models can be computed for closed systems and relatively simple open systems. EQ3/6 is useful in making purely theoretical calculations, in designing, interpreting, and extrapolating laboratory experiments, and in testing and developing submodels and supporting data used in these codes. The thermodynamic data base supports calculations over the range 0-300°C. 60 refs., 2 figs.
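    The saturation indices reported by EQ3NR follow the usual geochemical definition (standard usage, not a quotation from the code):

        \[
        \mathrm{SI} = \log_{10} \frac{\mathrm{IAP}}{K},
        \]

    where IAP is the ion activity product of the mineral dissolution reaction and K its equilibrium constant, so that SI < 0 indicates undersaturation, SI = 0 equilibrium, and SI > 0 supersaturation.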

  14. The EQ3/6 software package for geochemical modeling: Current status

    Wolery, T.J.; Jackson, K.J.; Bourcier, W.L.; Bruton, C.J.; Viani, B.E.; Knauss, K.G.; Delany, J.M.

    1988-07-01

    EQ3/6 is a software package for modeling chemical and mineralogic interactions in aqueous geochemical systems. The major components of the package are EQ3NR (a speciation-solubility code), EQ6 (a reaction path code), EQLIB (a supporting library), and a supporting thermodynamic data base. EQ3NR calculates aqueous speciation and saturation indices from analytical data. It can also be used to calculate compositions of buffer solutions for use in laboratory experiments. EQ6 computes reaction path models of both equilibrium step processes and kinetic reaction processes. These models can be computed for closed systems and relatively simple open systems. EQ3/6 is useful in making purely theoretical calculations, in designing, interpreting, and extrapolating laboratory experiments, and in testing and developing submodels and supporting data used in these codes. The thermodynamic data base supports calculations over the range 0-300°C. 60 refs., 2 figs.

  15. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016. The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.

  16. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    Ashraf, H.; Bach, K.S.; Hansen, H. [Copenhagen University, Department of Radiology, Gentofte Hospital, Hellerup (Denmark); Hoop, B. de [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Shaker, S.B.; Dirksen, A. [Copenhagen University, Department of Respiratory Medicine, Gentofte Hospital, Hellerup (Denmark); Prokop, M. [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Radboud University Nijmegen, Department of Radiology, Nijmegen (Netherlands); Pedersen, J.H. [Copenhagen University, Department of Cardiothoracic Surgery RT, Rigshospitalet, Copenhagen (Denmark)

    2010-08-15

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)

  17. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    Ashraf, H.; Bach, K.S.; Hansen, H.; Hoop, B. de; Shaker, S.B.; Dirksen, A.; Prokop, M.; Pedersen, J.H.

    2010-01-01

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)

  18. A software package to process an INIS magnetic tape on the VAX computer

    Omar, A.A.; Mohamed, F.A.

    1991-01-01

    This paper presents a software package whose function is to process, on VAX computers, the magnetic tapes distributed by the International Atomic Energy Agency. These tapes contain abstracts of papers in the different branches of the nuclear field and are supplied by the International Nuclear Information System (INIS). This paper has two goals. First, it gives a procedure for processing any foreign magnetic tape on VAX computers. Second, it solves the problem of reading the INIS tapes on a non-IBM computer, thus allowing specialists to benefit from the large amount of information contained in these tapes. 11 fig

  19. SEJV2 software package for radiation monitoring system of WWER 440 NPP

    Kapisovsky, V.; Jancik, O.; Kubik, I.; Bena, J.

    1993-01-01

    The main part of the radiation monitoring system at a WWER-440 (213 reactor type) nuclear power plant is the centralized 400-channel monitoring system 'SEJVAL' servicing twin reactor units. The SEJV2 software package, developed to run on a PC with an IFS2 interface to the SEJVAL radiation monitoring system, is described. It provides enhanced data presentation, record keeping and report generation, thus improving the efficiency of the health physics shift. The system was first implemented at the Jaslovske Bohunice V-2 nuclear power plant, with encouraging results. (Z.S.) 3 refs

  20. VIPEX (Vital-area Identification Package EXpert) Software Verification and Validation

    Jung, Woo Sik; Suh, Jae Seung

    2010-06-01

    The purposes of this report are (1) to perform a Verification and Validation (V and V) test for the VIPEX (Vital-area Identification Package EXpert) software and (2) to improve the software quality through the V and V test. VIPEX was developed at the Korea Atomic Energy Research Institute (KAERI) for the Vital Area Identification (VAI) of nuclear power plants. The distributed version of VIPEX is 3.2.0.0. VIPEX was revised based on the first V and V test, and a second V and V test was then performed. The following tasks were performed for the V and V test on the Windows XP and Vista operating systems: • testing basic functions, including fault tree editing • testing all kinds of functions • investigating the update from Visual BASIC 6.0 to Visual BASIC 2008

  1. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness-of-fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate that the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that the exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
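    Under the usual single-hit Poisson model for limiting dilution data, a well seeded with d cells is positive with probability 1 - exp(-λd), where λ is the target-cell concentration. A minimal sketch of the plain MLE under that model follows (for orientation only; SLDAssay's R implementation adds the bias correction and the exact intervals discussed above):

      import numpy as np
      from scipy.optimize import minimize_scalar

      def sld_mle(cells_per_well, n_wells, n_positive):
          """Maximum likelihood estimate of the target-cell concentration."""
          d = np.asarray(cells_per_well, dtype=float)
          n = np.asarray(n_wells, dtype=float)
          x = np.asarray(n_positive, dtype=float)

          def nll(log_lam):  # negative log-likelihood (binomial wells), in log(lambda)
              p = 1.0 - np.exp(-np.exp(log_lam) * d)
              p = np.clip(p, 1e-12, 1.0 - 1e-12)  # guard the logs at the edges
              return -np.sum(x * np.log(p) + (n - x) * np.log1p(-p))

          res = minimize_scalar(nll, bounds=(-30.0, 5.0), method="bounded")
          return np.exp(res.x)

      # Example: three 5-fold dilutions with 12 replicate wells each.
      print(sld_mle([1e6, 2e5, 4e4], [12, 12, 12], [11, 6, 2]))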

  2. ATK-ForceField: a new generation molecular dynamics software package

    Schneider, Julian; Hamaekers, Jan; Chill, Samuel T.; Smidstrup, Søren; Bulin, Johannes; Thesen, Ralph; Blom, Anders; Stokbro, Kurt

    2017-12-01

    ATK-ForceField is a software package for atomistic simulations using classical interatomic potentials. It is implemented as part of the Atomistix ToolKit (ATK), a Python programming environment that makes it easy to create and analyze both standard and highly customized simulations. This paper focuses on the atomic interaction potentials, molecular dynamics, and geometry optimization features of the software; however, many more advanced modeling features are available. The implementation details of these algorithms and their computational performance are shown. We present three illustrative examples of the types of calculations that are possible with ATK-ForceField: modeling thermal transport properties in a silicon-germanium crystal, vapor deposition of selenium molecules on a selenium surface, and a simulation of creep in a copper polycrystal.

  3. TensorPack: a Maple-based software package for the manipulation of algebraic expressions of tensors in general relativity

    Huf, P A; Carminati, J

    2015-01-01

    In this paper we: (1) introduce TensorPack, a software package for the algebraic manipulation of tensors in covariant index format in Maple; (2) briefly demonstrate the use of the package with an orthonormal tensor proof of the shearfree conjecture for dust. TensorPack is based on the Riemann and Canon tensor software packages and uses their functions to express tensors in an indexed covariant format. TensorPack uses a string representation as input and provides functions for output in index form. It extends the functionality to basic algebra of tensors, substitution, covariant differentiation, contraction, raising/lowering indices, symmetry functions and other accessory functions. The output can be merged with text in the Maple environment to create a full working document with embedded dynamic functionality. The package offers potential for manipulation of indexed algebraic tensor expressions in a flexible software environment. (paper)
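    As an example of the operations TensorPack automates, the covariant derivative of a rank-2 covariant tensor expands by the standard formula (a general-relativity convention, independent of the package):

        \[
        \nabla_{c} T_{ab} = \partial_{c} T_{ab} - \Gamma^{d}{}_{ca}\, T_{db} - \Gamma^{d}{}_{cb}\, T_{ad},
        \]

    with one Christoffel term per covariant index; raising and lowering indices are likewise contractions with the metric g^{ab} and g_{ab}.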

  4. User’s Manual for the Simulation of Energy Consumption and Emissions from Rail Traffic Software Package

    Cordiero, Tiago M.; Lindgreen, Erik Bjørn Grønning; Sorenson, Spencer C

    2005-01-01

    The ARTEMIS rail emissions model was implemented in a Microsoft Excel software package that includes data from the GISCO database on railway traffic. This report is the user's manual for the aforementioned software; it includes information on how to run the program and an overview of how the software is organized as a set of Excel macros (Visual Basic) and database sheets included in one Excel file.

  5. Ignominy: a tool for software dependency and metric analysis with examples from large HEP packages

    Tuura, L.A.; Taylor, L.

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help the authors better ensure the quality of their own software, and in particular to warn about possible structural problems early on; as part of this activity it is now a standard part of their release procedure. The authors also use it to evaluate and study the quality of external packages they plan to make use of. They describe what Ignominy can find out and how it can be used to visualise and assess a software structure, and discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is on illustrating these issues through the analysis results for several sizable HEP software projects.
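    The kind of raw information such a dependency scanner distills can be sketched in a few lines of Python (a generic illustration, not Ignominy's implementation): collect include directives per source file and derive simple fan-in/fan-out coupling counts.

      import os, re
      from collections import defaultdict

      INCLUDE = re.compile(r'#include\s+["<]([^">]+)[">]')

      def scan_includes(root):
          """Map each C/C++ source file under root to the headers it includes."""
          deps = {}
          for dirpath, _, files in os.walk(root):
              for name in files:
                  if name.endswith((".h", ".hpp", ".cc", ".cpp")):
                      path = os.path.join(dirpath, name)
                      with open(path, errors="ignore") as f:
                          deps[path] = set(INCLUDE.findall(f.read()))
          return deps

      def coupling_metrics(deps):
          """Fan-in (how many files include a header) and fan-out per file."""
          fan_in = defaultdict(int)
          for headers in deps.values():
              for h in headers:
                  fan_in[h] += 1
          fan_out = {path: len(headers) for path, headers in deps.items()}
          return fan_in, fan_out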

  6. Cross-Platform Learning Media Development of Software Installation on Computer Engineering and Networking Expertise Package

    Afis Pratama

    2018-03-01

    Full Text Available Software installation is one of the important lessons that must be mastered by students of the computer and network engineering expertise package. However, students show a lack of attention and concentration during the teaching and learning process in the software installation subject, a problem that needs an immediate solution. This research builds on steadily advancing technology, which can be used as a tool to support learning activities. Currently, all grade 10 students in public vocational high school (SMK) 8 Semarang, Indonesia, already have a gadget, either a smartphone or a laptop, and the intensity of usage is high. Based on this phenomenon, this research aims to create cross-platform learning media for software installation. The media are practical and can be carried easily on a smartphone and a laptop with different operating systems, and are thus expected to improve the learning outcomes, understanding and enthusiasm of the students in the software installation lesson.

  7. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  8. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  9. Diffusion tensor imaging of the median nerve: intra-, inter-reader agreement, and agreement between two software packages

    Guggenberger, Roman; Nanz, Daniel; Puippe, Gilbert; Andreisek, Gustav; Rufibach, Kaspar; White, Lawrence M.; Sussman, Marshall S.

    2012-01-01

    To assess intra-reader and inter-reader agreement, and the agreement between two software packages, for magnetic resonance diffusion tensor imaging (DTI) measurements of the median nerve. Fifteen healthy volunteers (seven men, eight women; mean age, 31.2 years) underwent DTI of both wrists at 1.5 T. Fractional anisotropy (FA) and the apparent diffusion coefficient (ADC) of the median nerve were measured by three readers using two commonly used software packages. Measurements were repeated by two readers after 6 weeks. Intraclass correlation coefficients (ICC) and Bland-Altman analysis were used for statistical analysis. ICCs for intra-reader agreement ranged from 0.87 to 0.99, for inter-reader agreement from 0.62 to 0.83, and between the two software packages from 0.63 to 0.82. Bland-Altman analysis showed no differences for intra- and inter-reader agreement or for agreement between software packages. Intra-reader and inter-reader agreement and the agreement between software packages for DTI measurements of the median nerve were moderate to substantial, suggesting that user- and software-dependent factors contribute little to the variance in DTI measurements. (orig.)
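    The two quantities compared are standard functions of the diffusion-tensor eigenvalues λ1, λ2, λ3 (textbook definitions, given here for reference):

        \[
        \mathrm{ADC} = \bar{\lambda} = \frac{\lambda_1 + \lambda_2 + \lambda_3}{3}, \qquad
        \mathrm{FA} = \sqrt{\frac{3}{2}}\,
        \frac{\sqrt{(\lambda_1 - \bar{\lambda})^2 + (\lambda_2 - \bar{\lambda})^2 + (\lambda_3 - \bar{\lambda})^2}}
             {\sqrt{\lambda_1^2 + \lambda_2^2 + \lambda_3^2}},
        \]

    so FA ranges from 0 (isotropic diffusion) to 1 (diffusion along a single direction).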

  10. ldr: An R Software Package for Likelihood-Based Sufficient Dimension Reduction

    Kofi Placid Adragni

    2014-11-01

    Full Text Available In regression settings, a sufficient dimension reduction (SDR) method seeks the core information in a p-vector predictor that completely captures its relationship with a response. The reduced predictor may reside in a lower dimension d < p, improving ability to visualize data and predict future observations, and mitigating dimensionality issues when carrying out further analysis. We introduce ldr, a new R software package that implements three recently proposed likelihood-based methods for SDR: covariance reduction, likelihood acquired directions, and principal fitted components. All three methods reduce the dimensionality of the data by projection into lower dimensional subspaces. The package also implements a variable screening method built upon principal fitted components which makes use of flexible basis functions to capture the dependencies between the predictors and the response. Examples are given to demonstrate likelihood-based SDR analyses using ldr, including estimation of the dimension of reduction subspaces and selection of basis functions. The ldr package provides a framework that we hope to grow into a comprehensive library of likelihood-based SDR methodologies.
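    The property shared by all three likelihood-based methods in ldr is the defining condition of a sufficient reduction (standard SDR notation):

        \[
        Y \perp\!\!\!\perp X \mid \Gamma^{\mathsf{T}} X, \qquad \Gamma \in \mathbb{R}^{p \times d},\ d < p,
        \]

    i.e. replacing the p-vector predictor X by the d-dimensional projection Γ^T X loses no information about the regression of Y on X.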

  11. SeDA: A software package for the statistical analysis of the instrument drift

    Lee, H. J.; Jang, S. C.; Lim, T. J.

    2006-01-01

    The setpoints for safety-related equipment are affected by many sources of uncertainty. ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2] suggested statistical approaches for ensuring that safety-related instrument setpoints are established and maintained within the technical specification limits [3]. However, Jang et al. [4] indicated that the preceding methodologies for setpoint drift analysis might be insufficient to manage the setpoint drift of an instrumentation device, and proposed new statistical analysis procedures for the management of setpoint drift, based on plant-specific as-found/as-left data. Although IHPA (Instrument History Performance Analysis) is a widely known commercial software package for analyzing instrument setpoint drift, several steps in the new procedure cannot be performed with it because it is based on the statistical approaches suggested in ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2]. In this paper we present a software package (SeDA: Setpoint Drift Analysis) that implements the new methodologies and is easy to use, as it is accompanied by powerful graphical tools. (authors)
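    The basic drift statistic behind such as-found/as-left analyses can be sketched as follows (an illustrative fragment with hypothetical data, not SeDA's code, which adds tolerance limits, trend tests and outlier handling): the drift over a surveillance interval is the current as-found value minus the previous as-left value.

      import numpy as np
      from scipy import stats

      def drift_series(as_found, as_left):
          """Drift per interval: current as-found minus previous as-left."""
          af = np.asarray(as_found, dtype=float)
          al = np.asarray(as_left, dtype=float)
          return af[1:] - al[:-1]

      d = drift_series(as_found=[0.12, 0.18, 0.09, 0.20, 0.15],
                       as_left=[0.10, 0.11, 0.10, 0.12, 0.11])
      print(d.mean(), d.std(ddof=1))   # drift mean and spread
      print(stats.shapiro(d))          # normality check before setting drift limits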

  12. Graphical representation of ribosomal RNA probe accessibility data using ARB software package

    Amann Rudolf

    2005-03-01

    Full Text Available Abstract. Background: Taxon-specific hybridization probes in combination with a variety of commonly used hybridization formats are nowadays standard tools in microbial identification. A frequently applied technology, fluorescence in situ hybridization (FISH), allows, besides single-cell identification, the localization and functional studies of the microbial community composition. Careful in silico design and evaluation of potential oligonucleotide probe targets is therefore crucial for performing successful hybridization experiments. Results: The PROBE Design tools of the ARB software package take into consideration several criteria, such as number, position and quality of diagnostic sequence differences, while designing oligonucleotide probes. Additionally, new visualization tools were developed to enable the user to easily examine further sequence-associated criteria such as higher-order structure, conservation, G+C content, transition-transversion profiles and in situ target accessibility patterns. The different types of sequence-associated information (SAI) can be visualized by user-defined background colors within the ARB primary and secondary structure editors as well as in the PROBE Match tool. Conclusion: Using this tool, in silico probe design and evaluation can be performed with respect to in situ probe accessibility data. The evaluation of proposed probe targets with respect to higher-order rRNA structure is important for successful design and performance of in situ hybridization experiments. The entire ARB software package along with the probe accessibility data is available from the ARB home page http://www.arb-home.de.

  13. PHYLUCE is a software package for the analysis of conserved genomic loci.

    Faircloth, Brant C

    2016-03-01

    Targeted enrichment of conserved and ultraconserved genomic elements allows universal collection of phylogenomic data from hundreds of species at multiple time scales (<300 Ma). Prior to downstream inference, data from these types of targeted enrichment studies must undergo preprocessing to assemble contigs from sequence data; identify targeted, enriched loci from the off-target background data; align enriched contigs representing conserved loci to one another; and prepare and manipulate these alignments for subsequent phylogenomic inference. PHYLUCE is an efficient and easy-to-install software package that accomplishes these tasks across hundreds of taxa and thousands of enriched loci. PHYLUCE is written for Python 2.7. PHYLUCE is supported on OSX and Linux (RedHat/CentOS) operating systems. PHYLUCE source code is distributed under a BSD-style license from https://www.github.com/faircloth-lab/phyluce/ PHYLUCE is also available as a package (https://binstar.org/faircloth-lab/phyluce) for the Anaconda Python distribution that installs all dependencies, and users can request a PHYLUCE instance on iPlant Atmosphere (tag: phyluce). The software manual and a tutorial are available from http://phyluce.readthedocs.org/en/latest/ and test data are available from doi: 10.6084/m9.figshare.1284521. brant@faircloth-lab.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software

    Abel Soares Siqueira

    2016-04-01

    Full Text Available A very important area of research in the field of Mathematical Optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information such as CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore, tools to better process and understand optimization benchmark data have been developed. One of the most widespread tools is the performance profile graphic proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open-source software that creates performance profile graphics. This software produces graphics in PDF using LaTeX with the PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plot libraries. It is implemented in Python 3 with support for internationalization, and is released under the General Public License Version 3 (GPLv3).
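    The performance profile of Dolan and Moré [2] that perprof-py draws is defined as follows: if t_{p,s} is the cost (e.g. CPU time) of solver s on problem p, then the performance ratio and profile are

        \[
        r_{p,s} = \frac{t_{p,s}}{\min\{\, t_{p,s'} : s' \in S \,\}}, \qquad
        \rho_{s}(\tau) = \frac{\bigl|\{\, p \in P : r_{p,s} \le \tau \,\}\bigr|}{|P|},
        \]

    so ρ_s(1) is the fraction of problems on which solver s is the fastest, and ρ_s(τ) for large τ measures robustness.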

  15. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type of analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, among other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
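    The Monte-Carlo idea can be conveyed with a toy detection model (entirely illustrative, not one of MORTICIA's models): sample the atmospheric visibility, attenuate the inherent target contrast with the Koschmieder relation, and count the trials in which the apparent contrast still exceeds a detection threshold.

      import numpy as np

      rng = np.random.default_rng(0)

      def p_detect(range_km, n_trials=10_000, c0=0.5, threshold=0.02):
          """Toy Monte-Carlo detection probability at a given range.

          Visibility V (km) is sampled lognormally; apparent contrast
          follows Koschmieder: C(R) = C0 * exp(-3.912 * R / V).
          """
          vis = rng.lognormal(mean=np.log(15.0), sigma=0.5, size=n_trials)
          contrast = c0 * np.exp(-3.912 * range_km / vis)
          return float(np.mean(contrast > threshold))

      for r_km in (2, 5, 10, 20):
          print(r_km, p_detect(r_km))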

  16. A PC-based software package for modeling DOE mixed-waste management options

    Abashian, M.S.; Carney, C.; Schum, K.

    1995-02-01

    The U.S. Department of Energy (DOE) Headquarters and associated contractors have developed an IBM PC-based software package that estimates costs, schedules, and public and occupational health risks for a range of mixed-waste management options. A key application of the software package is the comparison of various waste-treatment options documented in the draft Site Treatment Plans prepared in accordance with the requirements of the Federal Facility Compliance Act of 1992. This automated Systems Analysis Methodology consists of a user interface for configuring complexwide or site-specific waste-management options; calculational algorithms for cost, schedule and risk; and user-selected graphical or tabular output of results. The mixed-waste management activities modeled in the automated Systems Analysis Methodology include waste storage, characterization, handling, transportation, treatment, and disposal. Analyses of treatment options identified in the draft Site Treatment Plans suggest potential cost and schedule savings from consolidation of proposed treatment facilities. This paper presents an overview of the automated Systems Analysis Methodology

  17. REIDAC. A software package for retrospective dose assessment in internal contamination with radionuclides

    Kurihara, Osamu; Kanai, Katsuta; Takada, Chie; Takasaki, Koji; Ito, Kimio; Momose, Takumaro; Hato, Shinji; Ikeda, Hiroshi; Oeda, Mikihiro; Kurosawa, Naohiro; Fukutsu, Kumiko; Yamada, Yuji; Akashi, Makoto

    2007-01-01

    For cases of internal contamination with radionuclides, it is necessary to perform an internal dose assessment to facilitate radiation protection. For this purpose, the ICRP has supplied the dose coefficients and the retention and excretion rates for various radionuclides. However, these dosimetric quantities are calculated under typical conditions and are not necessarily detailed enough for dose assessment situations in which specific information on the incident and/or individual biokinetic characteristics could or should be taken into account retrospectively. This paper describes a newly developed PC-based software package called the Retrospective Internal Dose Assessment Code (REIDAC) that meets the needs of retrospective dose assessment. REIDAC is made up of a series of calculation programs and a supporting software package: the former calculates the dosimetric quantities for any radionuclide being assessed, and the latter provides the user with a graphical user interface (GUI) for executing the programs, editing parameter values and displaying results. The accuracy of REIDAC was verified by comparisons with dosimetric quantities given in the ICRP publications. This paper presents the basic structure of REIDAC and its calculation methods. Sensitivity analysis of the aerosol size for 239Pu compounds and provisional calculations for wound contamination with 241Am were performed as examples of the practical application of REIDAC. (author)
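
    The retrospective step that such a tool automates can be sketched in a few lines; the retention function and dose coefficient below are invented placeholders, not values from REIDAC or the ICRP publications:

```python
# Toy retrospective assessment: intake from one bioassay point, then dose.
import numpy as np

def retention_fraction(t_days):
    # hypothetical two-component retention per unit intake, m(t)
    return 0.6 * np.exp(-np.log(2) * t_days / 10.0) \
         + 0.4 * np.exp(-np.log(2) * t_days / 200.0)

measured_bq = 120.0    # whole-body measurement (Bq), invented
t_days = 30.0          # time between intake and measurement
dose_coeff = 1.6e-6    # committed dose per unit intake (Sv/Bq), invented

intake = measured_bq / retention_fraction(t_days)   # I = M / m(t)
dose_msv = intake * dose_coeff * 1e3
print(f"intake = {intake:.0f} Bq, committed dose = {dose_msv:.3f} mSv")
```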

  18. METEOR v1.0 - Design and structure of the software package

    Palomo, E.

    1994-01-01

    This report describes the structure and the separate modules of the software package METEOR for the statistical analysis of meteorological data series. It contains a systematic description of the subroutines of METEOR and of the required format for input and output files. The original version of METEOR was developed by Elena Palomo, Ph.D., CIEMAT-IER, GIMASE. It is built by linking programs and routines written in FORTRAN 77, and it adds the graphical capabilities of GNUPLOT. This toolbox was designed following the criteria of modularity, flexibility and agility. All the input, output and analysis options are structured in three main menus: i) the first evaluates the quality of the data set; ii) the second is for pre-processing of the data; and iii) the third performs the statistical analyses and creates the graphical outputs. At present, the documentation of METEOR consists of three documents written in Spanish: 1) METEOR v1.0: User's guide; 2) METEOR v1.0: A usage example; 3) METEOR v1.0: Design and structure of the software package. (Author)

  19. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: II. Algorithms.

    Appel, R D; Vargas, J R; Palagi, P M; Walther, D; Hochstrasser, D F

    1997-12-01

    After two generations of software systems for the analysis of two-dimensional electrophoresis (2-DE) images, a third generation of such software packages has recently emerged that combines state-of-the-art graphical user interfaces with comprehensive spot data analysis capabilities. A key characteristic common to most of these software packages is that many of their tools are implementations of algorithms that resulted from research areas such as image processing, vision, artificial intelligence or machine learning. This article presents the main algorithms implemented in the Melanie II 2-D PAGE software package. The applications of these algorithms, embodied as the features of the program, are explained in an accompanying article (R. D. Appel et al., Electrophoresis 1997, 18, 2724-2734).

  20. Counting radon tracks in Makrofol detectors with the 'image reduction and analysis facility' (IRAF) software package

    Hernandez, F.; Gonzalez-Manrique, S.; Karlsson, L.; Hernandez-Armas, J.; Aparicio, A.

    2007-01-01

    Makrofol detectors are commonly used for long-term radon (222Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires the counting of the nuclear tracks produced by alpha particles on the detecting material. The 'image reduction and analysis facility' (IRAF) software package is commonly used in astronomical applications. It allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity that exists between counting stars in a dark sky and counting tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. The proposed semi-automatic method and its performance, in comparison to ocular counting, are described in detail here. In addition, the calibration factor for this procedure, 2.97 ± 0.07 kBq m⁻³ h track⁻¹ cm², has been calculated based on the results obtained from exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m⁻³ and, in two of them, above 400 Bq m⁻³. Further studies should be performed at those schools following the European Union recommendations on radon concentrations in buildings.
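
    With the quoted calibration factor, the conversion from counted tracks to a mean radon concentration reduces to a few lines; the counts, scanned area and exposure time below are invented for illustration:

```python
# Track density -> integrated exposure -> mean radon concentration.
cal_factor = 2.97        # kBq m^-3 h per (track cm^-2), from the paper
tracks = 850             # net tracks counted (invented)
area_cm2 = 2.0           # scanned detector area (invented)
exposure_h = 90 * 24     # 90-day exposure (invented)

track_density = tracks / area_cm2                        # tracks per cm^2
exposure_kbq_h = track_density * cal_factor              # kBq m^-3 h
concentration_bq = exposure_kbq_h / exposure_h * 1000.0  # Bq m^-3
print(f"mean radon concentration: {concentration_bq:.0f} Bq/m^3")
```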

  1. The Caviar software package for the astrometric reduction of Cassini ISS images: description and examples

    Cooper, N. J.; Lainey, V.; Meunier, L.-E.; Murray, C. D.; Zhang, Q.-F.; Baillie, K.; Evans, M. W.; Thuillot, W.; Vienne, A.

    2018-02-01

    Aims: Caviar is a software package designed for the astrometric measurement of natural satellite positions in images taken using the Imaging Science Subsystem (ISS) of the Cassini spacecraft. Aspects of the structure, functionality, and use of the software are described, and examples are provided. The integrity of the software is demonstrated by generating new measurements of the positions of selected major satellites of Saturn, 2013-2016, along with their observed minus computed (O-C) residuals relative to published ephemerides. Methods: Satellite positions were estimated by fitting a model to the imaged limbs of the target satellites. Corrections to the nominal spacecraft pointing were computed using background star positions based on the UCAC5 and Tycho2 star catalogues. UCAC5 is currently used in preference to Gaia-DR1 because of the availability of proper motion information in UCAC5. Results: The Caviar package is available for free download. A total of 256 new astrometric observations of the Saturnian moons Mimas (44), Tethys (58), Dione (55), Rhea (33), Iapetus (63), and Hyperion (3) have been made, in addition to opportunistic detections of Pandora (20), Enceladus (4), Janus (2), and Helene (5), giving an overall total of 287 new detections. Mean observed-minus-computed residuals for the main moons relative to the JPL SAT375 ephemeris were -0.66 ± 1.30 pixels in the line direction and 0.05 ± 1.47 pixels in the sample direction. Mean residuals relative to the IMCCE NOE-6-2015-MAIN-coorb2 ephemeris were -0.34 ± 0.91 pixels in the line direction and 0.15 ± 1.65 pixels in the sample direction. The reduced astrometric data are provided in the form of satellite positions for each image. The reference star positions are included in order to allow reprocessing at some later date using improved star catalogues, such as later releases of Gaia, without the need to re-estimate the imaged star positions. The Caviar software is available for free download from: ftp://ftp.imcce.fr/pub/softwares

  2. Integrated software package for nuclear material safeguards in a MOX fuel fabrication facility

    Schreiber, H.J.; Piana, M.; Moussalli, G.; Saukkonen, H.

    2000-01-01

    Since computerized data processing was introduced to Safeguards at large bulk-handling facilities, a large number of individual software applications have been developed for nuclear material Safeguards implementation. Facility inventory and flow data are provided in computerized format for performing stratification, sample size calculation and selection of samples for destructive and non-destructive assay. Data are collected from nuclear measurement systems running in attended or unattended mode and, more recently, from remotely controlled monitoring systems. Data sets from various sources have to be evaluated for Safeguards purposes, such as raw data, processed data and conclusions drawn from data evaluation results. They are reported in computerized format to the International Atomic Energy Agency headquarters, and feedback from the Agency's mainframe computer system is used to prepare and support Safeguards inspection activities. The integration of all such data originating from various sources cannot be ensured without the existence of a common data format and a database system. This paper describes the fundamental relations between data streams, individual data processing tools, data evaluation results and requirements for an integrated software solution to facilitate nuclear material Safeguards at a bulk-handling facility. The paper also explains the basis for designing a software package to manage data streams from various data sources and for incorporating diverse data processing tools that until now have been used independently of each other and under different computer operating systems. (author)

  3. LipiDex: An Integrated Software Package for High-Confidence Lipid Identification.

    Hutchins, Paul D; Russell, Jason D; Coon, Joshua J

    2018-04-17

    State-of-the-art proteomics software routinely quantifies thousands of peptides per experiment with minimal need for manual validation or processing of data. For the emerging field of discovery lipidomics via liquid chromatography-tandem mass spectrometry (LC-MS/MS), comparably mature informatics tools do not exist. Here, we introduce LipiDex, a freely available software suite that unifies and automates all stages of lipid identification, reducing hands-on processing time from hours to minutes for even the most expansive datasets. LipiDex utilizes flexible in silico fragmentation templates and lipid-optimized MS/MS spectral matching routines to confidently identify and track hundreds of lipid species and unknown compounds from diverse sample matrices. Unique spectral and chromatographic peak purity algorithms accurately quantify co-isolation and co-elution of isobaric lipids, generating identifications that match the structural resolution afforded by the LC-MS/MS experiment. During final data filtering, ionization artifacts are removed to significantly reduce dataset redundancy. LipiDex interfaces with several LC-MS/MS software packages, enabling robust lipid identification to be readily incorporated into pre-existing data workflows. Copyright © 2018 Elsevier Inc. All rights reserved.
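
    As a generic illustration of MS/MS spectral matching of the kind such suites rely on (not LipiDex's actual scoring code; the fragment lists are invented), a normalized dot product between a query spectrum and a library spectrum can be computed as follows:

```python
# Normalized dot-product score between two centroided MS/MS spectra.
import math

def dot_product_score(query, library):
    """Spectra are {m/z: intensity} dicts; returns a score in [0, 1]."""
    keys = set(query) | set(library)
    q = [query.get(k, 0.0) for k in keys]
    l = [library.get(k, 0.0) for k in keys]
    num = sum(a * b for a, b in zip(q, l))
    den = math.sqrt(sum(a * a for a in q)) * math.sqrt(sum(b * b for b in l))
    return num / den if den else 0.0

observed  = {184.07: 100.0, 104.11: 35.0, 760.59: 60.0}   # invented peaks
candidate = {184.07: 100.0, 104.11: 30.0, 760.59: 55.0}   # invented template
print(f"match score: {dot_product_score(observed, candidate):.3f}")
```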

  4. GENES - a software package for analysis in experimental statistics and quantitative genetics

    Cosme Damião Cruz

    2013-06-01

    GENES is a software package used for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation to analyze biological phenomena and is fundamental for the decision-making process and predictions of success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated with the programs MS Word, MS Excel and Paint, ensuring simplicity and effectiveness in data import and export of results, figures and data. It is also compatible with the free software R and Matlab, through the supply of useful scripts available for complementary analyses in different areas, including genome-wide selection, prediction of breeding values and use of neural networks in genetic improvement.

  5. MAPPIX: A software package for off-line micro-PIXE single particle aerosol analysis

    Ceccato, D.

    2009-01-01

    In the framework of a multiannual experiment performed at Baia Terra Nova, Antarctica, size-segregated aerosol samples were collected using a 12-stage SDI impactor (Hillamo design). Approximately 2800 particles, belonging to the first four supermicrometric SDI stages - 8.39, 4.08, 2.68 and 1.66 μm dynamic aerosol diameter cuts - were analyzed at the INFN-LNL micro-PIXE facility, a three-lens Oxford Microprobe (OM) product installed in the early nineties. Four regions on each of the 12 sub-samples were measured; 60 aerosol particles were detected on average in each of the analyzed regions. The off-line single aerosol particle (SAP) analysis of such a large amount of data required software able to rapidly handle the acquired data, with a simple and fast area selection procedure; subsequent automated PIXE spectrum analysis with a specialized code was also needed. The MAPPIX 2.0 software was designed to make the user's work easier and faster during SAP analysis. The package is composed of two separate routines: the first is devoted to data format conversion (OM-LMF file format to MAPPIX format), while the second is devoted to the graphical presentation of micro-PIXE maps and the aerosol particle selection procedure. The MAPPIX data format and software features are discussed, and a short report on the speed performance is presented.

  6. RpeakChrom: Novel R package for the automated characterization and optimization of column efficiency in high-performance liquid chromatography analysis.

    Peris-Díaz, Manuel David; Alcoriza-Balaguer, Maria Isabel; García-Cañaveras, Juan Carlos; Santonja, Francisco; Sentandreu, Enrique; Lahoz, Agustín

    2017-11-01

    Characterization of chromatographic columns using the traditional van Deemter method is limited by the necessity of calculating extra-column variance, an issue particularly relevant when modeling asymmetrical peaks eluted from monolithic columns. A novel R package has been developed that implements the Parabolic Variance Modified Gaussian approach for accurate peak modeling, the van Deemter equation, and two alternative approaches based on van Deemter, to calculate the height equivalent to a theoretical plate (HETP). To assess the package's capabilities, conventional packed reverse-phase and monolithic HPLC columns were characterized. Peaks eluted from the monolithic column showed a high asymmetry factor due, in part, to the contribution of extra-column factors. Such deviation can be circumvented by the two alternative approaches implemented in the R package. Furthermore, increased values of the eddy diffusion and mass transfer kinetics terms in HETP were observed for the packed column, while accuracy was below 9% in all cases. These results show the usefulness of the R package for both modeling chromatographic peaks and assessing column efficiency. The RpeakChrom package could become a helpful tool for testing new stationary phases during column development and for evaluating a column during its lifetime. This R tool is freely available from CRAN (https://CRAN.R-project.org/package=RpeakChrom). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
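
    The underlying van Deemter analysis is easy to sketch (in Python rather than the package's R, with invented data points): fit H(u) = A + B/u + C·u and locate the velocity that minimizes the plate height:

```python
# Van Deemter fit H(u) = A + B/u + C*u on invented column data.
import numpy as np
from scipy.optimize import curve_fit

def van_deemter(u, A, B, C):
    return A + B / u + C * u

u = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])     # linear velocity (mm/s)
H = np.array([12.0, 8.5, 7.9, 8.1, 9.4, 11.0])   # plate height (um), invented

(A, B, C), _ = curve_fit(van_deemter, u, H, p0=(5.0, 3.0, 1.0))
u_opt = np.sqrt(B / C)                           # where dH/du = 0
print(f"A={A:.2f}, B={B:.2f}, C={C:.2f}; "
      f"optimal u = {u_opt:.2f} mm/s, Hmin = {van_deemter(u_opt, A, B, C):.2f} um")
```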

  7. Package

    Arsić Zoran

    2013-01-01

    It is the duty of the seller to pack the goods in a manner which assures their safe arrival and enables their handling in transit and at the place of destination. The problem of packing is relevant in two main respects. First of all, the buyer is in certain circumstances entitled to refuse acceptance of the goods if they are not properly packed. Second, the package is relevant to the calculation of price and freight based on weight. In the case of export trade, the package should conform to the legislation of the country of destination. The impact of packaging on the environment is regulated by the environmental protection legislation of the Republic of Serbia.

  18. DISPL: a software package for one and two spatially dimensioned kinetics-diffusion problems. [FORTRAN for IBM computers]

    Leaf, G K; Minkoff, M; Byrne, G D; Sorensen, D; Bleakney, T; Saltzman, J

    1978-11-01

    DISPL is a software package for solving some second-order nonlinear systems of partial differential equations, including parabolic, elliptic, hyperbolic, and some mixed types such as parabolic-elliptic equations. Fairly general nonlinear boundary conditions are allowed, as well as interface conditions for problems in inhomogeneous media. The spatial domain is one- or two-dimensional with Cartesian, cylindrical, or spherical (in one dimension only) geometry. The numerical method is based on the use of Galerkin's procedure combined with B-splines to reduce the system of PDEs to a system of ODEs. The latter system is then solved with a sophisticated ODE software package. Software features include extensive dump/restart facilities, free-format input, moderate printed output capability, dynamic storage allocation, and three graphics packages. 17 figures, 9 tables.
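
    The reduction strategy can be illustrated in miniature. The sketch below substitutes simple finite differences for DISPL's B-spline Galerkin discretization, but follows the same pattern: a parabolic PDE becomes a stiff ODE system handed to an off-the-shelf ODE solver.

```python
# Method-of-lines sketch: heat equation u_t = u_xx -> ODE system -> BDF solver.
import numpy as np
from scipy.integrate import solve_ivp

n = 50
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
u0 = np.sin(np.pi * x)                  # initial condition, u = 0 at both ends

def rhs(t, u):
    dudt = np.zeros_like(u)
    dudt[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2  # interior Laplacian
    return dudt                          # Dirichlet boundaries stay at zero

sol = solve_ivp(rhs, (0.0, 0.1), u0, method="BDF")  # stiff ODE back end
print(f"max u at t=0.1: {sol.y[:, -1].max():.4f} "
      f"(exact: {np.exp(-np.pi**2 * 0.1):.4f})")
```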

  9. Quantitation of magnetic resonance spectroscopy signals: the jMRUI software package

    Stefan, D; Andrasescu, A; Cesare, F Di; Popa, E; Lazariev, A; Graveron-Demilly, D; Vescovo, E; Williams, S; Strbak, O; Starcuk, Z; Cabanas, M; Van Ormondt, D

    2009-01-01

    The software package jMRUI, with a Java-based graphical user interface, enables user-friendly time-domain analysis of magnetic resonance spectroscopy (MRS), spectroscopic imaging (MRSI) and HRMAS-NMR signals. Version 3.x has been distributed to more than 1200 groups or hospitals worldwide. The new version 4.x is a plug-in platform enabling users to add their own algorithms. Moreover, it offers new functionalities compared to versions 3.x. The quantum-mechanical simulator based on NMR-SCOPE, the quantitation algorithm QUEST and the main MRSI functionalities are described. Quantitation results of signals obtained in vivo from a mouse and a human brain are given.

  10. VISUAL: a software package for plotting data in the RADHEAT-V4 code system

    Sasaki, Toshihiko; Yamano, Naoki

    1984-03-01

    In this report, the features, capabilities and constitution of the VISUAL software package are presented. One of its features is that VISUAL provides a versatile graphic display tool to plot a wide variety of data from the RADHEAT-V4 code system; another is that it enables a user to easily handle execution data in the Conversational Management Mode, named ''CMM''. The program adopts an adjustable dimension system to increase its flexibility. VISUAL generates two-dimensional drawings, contour-line maps and three-dimensional drawings on TSS (Time Sharing System) digital graphic equipment, NLP (Nihongo Laser Printer) or COM (Computer Output Microfilm). It is easily possible to display calculated and experimental data in a DATA-POOL by using these functions. The purpose of this report is to provide sufficient information to enable a user to use VISUAL profitably. (author)

  11. Software package to automate the design and production of translucent building structures made of PVC

    Petrova Irina Yur’evna

    2016-08-01

    The article describes the features of the design and production of translucent building structures made of PVC. The automation systems for this process currently on the market are analyzed, and their advantages and disadvantages are identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent building structures made of PVC is formulated, and the basic entities involved in those business processes are identified. The necessary functions for the main application and for the dealers' application are specified. The main application is based on the technological platform 1C: Enterprise 8.2. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these software products have client versions that are free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The features of the implementation of the developed software complex are described, and the relevant charts are given. The scheme of system deployment and the protocols of data exchange between the 1C server, the 1C client and the dealer are presented. The functions supported by the 1C module and the .NET module are also described. The article describes the content of the class library developed for the .NET module. The specification of the integration of the two applications in a single software package is given. The features of the GUI organization are described, and the corresponding screenshots are given. Possible ways of further development of the described software complex are presented, and a conclusion is drawn about its competitiveness and the expediency of further research.

  12. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil-producing companies are concerned with increasing the resolution of seismic data for complex oil-and-gas-bearing deposits associated with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature: salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on the multiplication of large matrices (reaching hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of the TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, allowing us to significantly improve the performance of the TWSM software package, which is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks, such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, and in some inverse tasks, such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.
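
    The computational core, products of large matrices on the GPU, can be sketched in a few lines. The example below uses CuPy purely as a convenient Python front end to CUDA's cuBLAS; it is illustrative only and not TWSM code (a CUDA-capable GPU is required to run it):

```python
# GPU matrix product via CuPy/cuBLAS; requires a CUDA-capable GPU.
import cupy as cp

n = 4096
a = cp.random.standard_normal((n, n), dtype=cp.float32)
b = cp.random.standard_normal((n, n), dtype=cp.float32)

c = a @ b                            # dispatched to cuBLAS on the device
cp.cuda.Stream.null.synchronize()    # wait for the kernel to finish
print(float(c[0, 0]))
```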

  13. Development of the processing software package for RPV neutron fluence determination methodology

    Belousov, S.; Kirilova, K.; Ilieva, K.

    2001-01-01

    According to the INRNE methodology, the neutron transport calculation is carried out in two steps. In the first step, a reactor core eigenvalue calculation is performed. This calculation is used for determination of the fixed source for the next step, the calculation of neutron transport from the reactor core to the RPV. Both calculation steps are performed by state-of-the-art, tested codes. The interface software package DOSRC, developed at INRNE, is used as a link between these two calculations. The package transforms reactor core calculation results into neutron source input data in a format appropriate for the neutron transport codes (DORT, TORT and ASYNT) based on the discrete ordinates method. These codes are applied for calculation of the RPV neutron flux and its responses - induced activity, radiation damage, neutron fluence, etc. For more precise estimation of the neutron fluence, the INRNE methodology has been supplemented by the following improvements: implementation of more advanced codes (PYTHIA/DERAB) for neutron-physics parameter calculations; more detailed neutron source presentation; and verification of the neutron fluence by statistically treated experimental data. (author)

  14. A software package for evaluating the performance of a star sensor operation

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-02-01

    We have developed a low-cost, off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from the star patterns it records in images. The star sensor implements a centroiding algorithm to find the centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package developed to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case, where sky background and instrument errors are omitted, and a more realistic case, where noise and camera parameters are added to the simulated images. We evaluate performance parameters of the algorithms such as attitude accuracy, calculation time, required memory, star catalog size and sky coverage, and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy and distortion effects, and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the code into functions written in C, keeping in view its easy implementation on any star sensor electronics hardware.
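
    The centroiding step, for instance, amounts to an intensity-weighted centre of mass over a small pixel window. The sketch below recovers the centroid of a synthetic Gaussian star; it is illustrative only (the package itself is written in MATLAB):

```python
# Centre-of-mass centroiding on a synthetic 7x7 Gaussian star image.
import numpy as np

def centroid(window):
    """Intensity-weighted centre of mass of a pixel window."""
    total = window.sum()
    ys, xs = np.indices(window.shape)
    return (xs * window).sum() / total, (ys * window).sum() / total

ys, xs = np.indices((7, 7))
true_x, true_y = 3.4, 2.8
star = np.exp(-((xs - true_x) ** 2 + (ys - true_y) ** 2) / (2 * 1.2 ** 2))

cx, cy = centroid(star)
print(f"recovered centroid: ({cx:.2f}, {cy:.2f})")  # close to (3.40, 2.80)
```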

  15. Advances in the development of the PIXEKLM-TPI software package

    Uzonyi, I.; Szabo, Gy.

    2005-01-01

    Complete text of publication follows. During the past decade great effort has been devoted to the development of various local analytical methods capable of analyzing small volumes of a sample (in the range of a few μm³) with high lateral and/or depth resolution. Among the Ion Beam Analytical (IBA) methods, Particle-Induced X-ray Emission (PIXE) analysis has long been used for qualitative elemental imaging. Nevertheless, the production of quantitative images is still a challenging and generally unresolved problem. Ryan and his co-workers were the first to develop a software package (GeoPIXE) for on-line quantitative mapping, which is capable of analyzing especially thick samples. Some years ago we also started to develop quantitative PIXE imaging software and suggested a different approach for the compensation of matrix effects and sample thickness. It is based on the rapid matrix transform method called Dynamic Analysis, which directly converts the spectrum vector (S) into the concentration vector (C) in terms of the matrix Γ. We modified the earlier version of the PIXEKLM program in order to calculate the Γ matrix for materials of any thickness. Furthermore, we have developed a Windows-based program (True PIXE Imaging, TPI) which calculates elemental distributions on a pixel-by-pixel basis and creates so-called elemental images from them in bitmap form using colour bars. The basic part of the new program package was published in 2005. During the past year much effort has been devoted to developing various new options, such as visualization of spectrum components, in order to make the program more user-friendly and applicable. In the figure below the decomposed PIXE spectrum of an industrial material is visualized. (author)
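
    The Dynamic Analysis step itself is a single matrix product per pixel, C = ΓS, which vectorizes naturally over a whole scan; every number in the sketch below is an invented stand-in:

```python
# Per-pixel concentration maps via the matrix transform C = Gamma @ S.
import numpy as np

n_channels, n_elements, side = 1024, 5, 64
rng = np.random.default_rng(1)
gamma = rng.random((n_elements, n_channels)) * 1e-4   # stand-in for Gamma

# one PIXE spectrum per pixel (Poisson counts, invented)
spectra = rng.poisson(5.0, size=(side * side, n_channels))

concentrations = spectra @ gamma.T                     # C = Gamma S, all pixels
maps = concentrations.reshape(side, side, n_elements)  # one map per element
print(maps.shape)                                      # (64, 64, 5)
```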

  16. ImagePy: an open-source, Python-based and platform-independent software package for bioimage analysis.

    Wang, Anliang; Yan, Xiaolong; Wei, Zhijun

    2018-04-27

    This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution is concentrated on facilitating extensibility and interoperability of the software through decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to be applied in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.

  17. INSPECT: A graphical user interface software package for IDARC-2D

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern-day Performance-Based Earthquake Engineering (PBEE) pivots on nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software package for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in DOS/Unix systems and requires the user to create elaborate text-based input files. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and an SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  18. Mirion--a software package for automatic processing of mass spectrometric images.

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the life sciences. In recent years, the development of new instruments employing ion sources tailored for spatial scanning has allowed the acquisition of large data sets. Subsequent data processing, however, is still a bottleneck in the analytical process, as manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds has turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and more easily than plain numbers. Image generation is thus a time-consuming and complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images allows direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.

  19. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be

  20. Calculation of chemical equilibrium between aqueous solution and minerals: the EQ3/6 software package

    Wolery, T.J.

    1979-01-01

    The newly developed EQ3/6 software package computes equilibrium models of aqueous geochemical systems. The package contains two principal programs: EQ3 performs distribution-of-species calculations for natural water compositions; EQ6 uses the results of EQ3 to predict the consequences of heating and cooling aqueous solutions and of irreversible reaction in rock-water systems. The programs are valuable for studying such phenomena as the formation of ore bodies, scaling and plugging in geothermal development, and the long-term disposal of nuclear waste. EQ3 and EQ6 are compared with such well-known geochemical codes as SOLMNEQ, WATEQ, REDEQL, MINEQL, and PATHI. The data base allows calculations in the temperature interval 0 to 350 °C, at either 1 atm-steam saturation pressures or a constant 500 bars. The activity coefficient approximations for aqueous solutes limit modeling to solutions of ionic strength less than about one molal. The mathematical derivations and numerical techniques used in EQ6 are presented in detail. The program uses the Newton-Raphson method to solve the governing equations of chemical equilibrium for a system of specified elemental composition at fixed temperature and pressure. Convergence is aided by optimizing starting estimates and by under-relaxation techniques. The minerals present in the stable phase assemblage are found by several empirical methods. Reaction path models may be generated by using this approach in conjunction with finite differences. This method is analogous to applying high-order predictor-corrector methods to integrate a corresponding set of ordinary differential equations, but avoids propagation of error (drift). 8 figures, 9 tables
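
    The Newton-Raphson kernel of such a speciation calculation can be shown on a toy system: a weak acid in water, solved in log10 concentrations, with a clipped step standing in for EQ6's under-relaxation. The constants are illustrative, not taken from the EQ3/6 data base:

```python
# Newton-Raphson speciation for a weak acid HA in water (toy constants).
import numpy as np

Ka, Kw, C_total = 10.0**-4.76, 10.0**-14.0, 1.0e-3

def residuals(logc):
    h, oh, ha, a = 10.0 ** logc
    return np.array([
        ha + a - C_total,     # mass balance on A
        h - oh - a,           # charge balance
        h * a - Ka * ha,      # acid dissociation equilibrium
        h * oh - Kw,          # water autoionization
    ])

logc = np.log10([1e-4, 1e-10, 1e-3, 1e-4])   # starting estimate
for _ in range(50):
    f = residuals(logc)
    if np.max(np.abs(f)) < 1e-12:
        break
    J = np.empty((4, 4))                     # numerical Jacobian
    for j in range(4):
        d = np.zeros(4); d[j] = 1e-6
        J[:, j] = (residuals(logc + d) - f) / 1e-6
    step = np.linalg.solve(J, -f)
    logc = logc + np.clip(step, -0.5, 0.5)   # damped step (under-relaxation)

print(f"pH = {-logc[0]:.2f}")                # about 3.9 for these constants
```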

  1. ANALYSIS OF CELLULAR REACTION TO IFN-γ STIMULATION BY A SOFTWARE PACKAGE GeneExpressionAnalyser

    A. V. Saetchnikov

    2014-01-01

    The software package GeneExpressionAnalyser for the analysis of DNA microarray experimental data has been developed. The algorithms for data analysis, differentially expressed genes and biological functions of the cell are described. The efficiency of the developed package is tested on published experimental data devoted to time-course research of the changes in human melanoma cells under the influence of IFN-γ. The developed software has a number of advantages over existing software: it is free, has a simple and intuitive graphical interface, allows analysis of different types of DNA microarrays, contains a set of methods for complete data analysis, and performs effective gene annotation for a selected list of genes.

  2. A Versatile Software Package for Inter-subject Correlation Based Analyses of fMRI

    Jukka-Pekka Kauppi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modelling the stimulus, and thus ISC is a potential method to analyze fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC based approach to analyze complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC based analyses are data and computation intensive, and the ISC toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and we demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/.

  4. Establishing the Common Community Physics Package by Transitioning the GFS Physics to a Collaborative Software Framework

    Xue, L.; Firl, G.; Zhang, M.; Jimenez, P. A.; Gill, D.; Carson, L.; Bernardet, L.; Brown, T.; Dudhia, J.; Nance, L. B.; Stark, D. R.

    2017-12-01

    The Global Model Test Bed (GMTB) has been established to support the evolution of atmospheric physical parameterizations in NCEP global modeling applications. To accelerate the transition to the Next Generation Global Prediction System (NGGPS), a collaborative model development framework known as the Common Community Physics Package (CCPP) has been created within the GMTB to facilitate engagement from the broad community in physics experimentation and development. A key component of this Research-to-Operations (R2O) software framework is the Interoperable Physics Driver (IPD), which connects the physics parameterizations on one end to the dynamical cores on the other with minimal implementation effort. To initiate the CCPP, scientists and engineers from the GMTB separated and refactored the GFS physics. This exercise demonstrated the process of creating IPD-compliant code and can serve as an example for other physics schemes to do the same and be considered for inclusion in the CCPP. Further benefits of this process include run-time physics suite configuration and considerably reduced effort for testing modifications to physics suites through GMTB's physics test harness. The implementation will be described and preliminary results will be presented at the conference.

  5. Efficient Calculation of Exact Exchange Within the Quantum Espresso Software Package

    Barnes, Taylor; Kurth, Thorsten; Carrier, Pierre; Wichmann, Nathan; Prendergast, David; Kent, Paul; Deslippe, Jack

    Accurate simulation of condensed matter at the nanoscale requires careful treatment of the exchange interaction between electrons. In the context of plane-wave DFT, these interactions are typically represented through the use of approximate functionals. Greater accuracy can often be obtained through the use of functionals that incorporate some fraction of exact exchange; however, evaluation of the exact exchange potential is often prohibitively expensive. We present an improved algorithm for the parallel computation of exact exchange in Quantum Espresso, an open-source software package for plane-wave DFT simulation. Through the use of aggressive load balancing and on-the-fly transformation of internal data structures, our code exhibits speedups of approximately an order of magnitude for practical calculations. Additional optimizations are presented targeting the many-core Intel Xeon-Phi "Knights Landing" architecture, which largely powers NERSC's new Cori system. We demonstrate the successful application of the code to difficult problems, including simulation of water at a platinum interface and computation of the X-ray absorption spectra of transition metal oxides.

  6. QUENCH: A software package for the determination of quenching curves in Liquid Scintillation counting.

    Cassette, Philippe

    2016-03-01

    In Liquid Scintillation Counting (LSC), the scintillating source is part of the measurement system, and its detection efficiency varies with the scintillator used, the vial, the volume and the chemistry of the sample. The detection efficiency is generally determined using a quenching curve, describing, for a specific radionuclide, the relationship between a quenching index given by the counter and the detection efficiency. A set of quenched LS standard sources is prepared by adding a quenching agent, and the quenching index and detection efficiency are determined for each source. Then a simple formula is fitted to the experimental points to define the quenching curve function. The paper describes a software package specifically devoted to the determination of quenching curves with uncertainties. The experimental measurements are described by their quenching index and detection efficiency, with uncertainties on both quantities. Random Gaussian fluctuations of these experimental measurements are sampled, and a polynomial or logarithmic function is fitted to each fluctuation by χ² minimization. This Monte Carlo procedure is repeated many times, and eventually the arithmetic mean and the experimental standard deviation of each parameter are calculated, together with the covariances between these parameters. Using these parameters, the detection efficiency corresponding to an arbitrary quenching index within the measured range can be calculated. The associated uncertainty is calculated with the law of propagation of variances, including the covariance terms. Copyright © 2015 Elsevier Ltd. All rights reserved.
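
    The Monte Carlo procedure described above is straightforward to sketch; the calibration points and uncertainties below are invented, and this is not the QUENCH program itself:

```python
# Monte-Carlo fit of a quenching curve with parameter covariances.
import numpy as np

rng = np.random.default_rng(42)
q   = np.array([0.30, 0.45, 0.60, 0.75, 0.90])  # quenching index (invented)
eff = np.array([0.55, 0.70, 0.82, 0.90, 0.95])  # detection efficiency (invented)
sq, seff = 0.01, 0.01                           # standard uncertainties

params = []
for _ in range(2000):
    qi = rng.normal(q, sq)                      # Gaussian fluctuation, x axis
    ei = rng.normal(eff, seff)                  # Gaussian fluctuation, y axis
    params.append(np.polyfit(qi, ei, deg=2))    # quadratic chi^2 fit
params = np.array(params)

mean, cov = params.mean(axis=0), np.cov(params.T)
q0 = 0.68                                       # arbitrary quenching index
e0 = np.polyval(mean, q0)
grad = np.array([q0**2, q0, 1.0])               # sensitivity to parameters
u0 = np.sqrt(grad @ cov @ grad)                 # propagation with covariances
print(f"efficiency at q = {q0}: {e0:.4f} +/- {u0:.4f}")
```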

  7. MulRF: a software package for phylogenetic analysis using multi-copy gene trees.

    Chaudhary, Ruchi; Fernández-Baca, David; Burleigh, John Gordon

    2015-02-01

    MulRF is a platform-independent software package for phylogenetic analysis using multi-copy gene trees. It seeks the species tree that minimizes the Robinson-Foulds (RF) distance to the input trees using a generalization of the RF distance to multi-labeled trees. The underlying generic tree distance measure and fast running time make MulRF useful for inferring phylogenies from large collections of gene trees, in which multiple evolutionary processes as well as phylogenetic error may contribute to gene tree discord. MulRF implements several features for customizing the species tree search and assessing the results, and it provides a user-friendly graphical user interface (GUI) with tree visualization. The species tree search is implemented in C++ and the GUI in Java Swing. MulRF's executable as well as sample datasets and manual are available at http://genome.cs.iastate.edu/CBL/MulRF/, and the source code is available at https://github.com/ruchiherself/MulRFRepo. ruchic@ufl.edu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. EPILAB: a software package for studies on the prediction of epileptic seizures.

    Teixeira, C A; Direito, B; Feldwisch-Drentrup, H; Valderrama, M; Costa, R P; Alvarado-Rojas, C; Nikolopoulos, S; Le Van Quyen, M; Timmer, J; Schelter, B; Dourado, A

    2011-09-15

    A Matlab®-based software package, EPILAB, was developed to support researchers in performing studies on the prediction of epileptic seizures. It provides an intuitive and convenient graphical user interface. Fundamental concepts that are crucial for epileptic seizure prediction studies were implemented. This includes, for example, the development and statistical validation of prediction methodologies in long-term continuous recordings. Seizure prediction is usually based on electroencephalography (EEG) and electrocardiography (ECG) signals. EPILAB is able to process both EEG and ECG data stored in different formats. More than 35 time- and frequency-domain measures (features) can be extracted based on univariate and multivariate data analysis. These features can be post-processed and used for prediction purposes. The predictions may be conducted based on optimized thresholds or by applying classification methods such as artificial neural networks, cellular neural networks, and support vector machines. EPILAB proved to be an efficient tool for seizure prediction, and aims to be a way to communicate, evaluate, and compare results and data within the seizure prediction community. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. VIBA-LAB2: a virtual ion beam analysis laboratory software package incorporating elemental map simulations

    Zhou, S.J.; Orlic, I.; Sanchez, J.L.; Watt, F.

    1999-01-01

    The software package VIBA-lab1, which incorporates PIXE and RBS energy spectrum simulation, has now been extended to include the simulation of elemental maps from 3D structures. VIBA-lab1 allows the user to define a wide variety of experimental parameters, e.g. energy and species of incident ions, excitation and detection geometry, etc. When the relevant experimental parameters as well as the target composition are defined, the program can simulate the corresponding PIXE and RBS spectra. VIBA-LAB2 has been written with applications in nuclear microscopy in mind. A set of drag-and-drop tools has been incorporated to allow the user to define a three-dimensional sample object of mixed elemental composition. PIXE energy spectrum simulations are then carried out on a pixel-by-pixel basis, and the corresponding intensity distributions or elemental maps can be computed. Several simulated intensity distributions for 3D objects are demonstrated, and simulations obtained from a simple IC are compared with experimental results.

  10. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker, Charles L. III; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-01-01

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model, as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project no standard model had been developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementation of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  11. Pixelman: a multi-platform data acquisition and processing software package for Medipix2, Timepix and Medipix3 detectors

    Turecek, D; Holy, T; Jakubek, J; Pospisil, S; Vykydal, Z

    2011-01-01

    The semiconductor pixel detectors Medipix2, Timepix and Medipix3 (256x256 square pixels, 55x55 μm each) are superior imaging devices in terms of spatial resolution, linearity and dynamic range. This makes them suitable for various applications such as radiography, neutronography, micro-tomography and X-ray dynamic defectoscopy. In order to control and manage such complex measurements a multi-platform software package for acquisition and data processing with a Java graphical user interface has been developed. The functionality of the original version of Pixelman package has been upgraded and extended to include the new medipix devices. The software package can be run on Microsoft Windows, Linux and Mac OS X operating systems. The architecture is very flexible and the functionality can be extended by plugins in C++, Java or combinations of both. The software package may be used as a distributed acquisition system using computers with different operating systems over a local network or the Internet.

  13. Multi-Language Programming Environments for High Performance Java Computing

    Vladimir Getov; Paul Gray; Sava Mintchev; Vaidy Sunderam

    1999-01-01

    Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community’s reliance on established scientific packages. As a consequence, programmers of high‐performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java‐to‐C Interface (JCI) tool which provides ...

  15. Beyond filtered backprojection: A reconstruction software package for ion beam microtomography data

    Habchi, C.; Gordillo, N.; Bourret, S.; Barberet, Ph.; Jovet, C.; Moretto, Ph.; Seznec, H.

    2013-01-01

    A new version of the TomoRebuild data reduction software package is presented, for the reconstruction of scanning transmission ion microscopy tomography (STIMT) and particle induced X-ray emission tomography (PIXET) images. First, we present a state of the art of the reconstruction codes available for ion beam microtomography. The algorithm proposed here brings several advantages. It is a portable, multi-platform code, designed in C++ with well-separated classes for easier use and evolution. Data reduction is separated into different steps, and the intermediate results may be checked if necessary. Although no additional graphic library or numerical tool is required to run the program from the command line, a user-friendly interface was designed in Java as an ImageJ plugin. All experimental and reconstruction parameters may be entered either through this plugin or directly in text format files. A simple standard format is proposed for the input of experimental data. Optional graphic applications using the ROOT interface may be used separately to display and fit energy spectra. Regarding the reconstruction process, the filtered backprojection (FBP) algorithm, already present in the previous version of the code, was optimized so that it now runs about 10 times faster. In addition, the Maximum Likelihood Expectation Maximization (MLEM) algorithm and its accelerated version, Ordered Subsets Expectation Maximization (OSEM), were implemented. A detailed user guide in English is available. A reconstruction example using experimental data from a biological sample is given. It shows the capability of the code to reduce noise in the sinograms and to deal with incomplete data, which puts a new perspective on tomography with a low number of projections or a limited angular range.
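
    For readers unfamiliar with the iterative methods named above, the sketch below shows the basic MLEM update on a toy system matrix (Python, invented data; TomoRebuild itself is a C++ code operating on real sinograms, and OSEM applies the same update over ordered subsets of the projections).

        import numpy as np

        def mlem(A, b, n_iter=50):
            """Toy MLEM reconstruction: A is the (rays x voxels) system matrix,
            b the measured projections; returns a non-negative image estimate."""
            x = np.ones(A.shape[1])           # flat initial image
            sens = A.T @ np.ones(A.shape[0])  # sensitivity image (column sums)
            for _ in range(n_iter):
                proj = A @ x                               # forward projection
                ratio = b / np.maximum(proj, 1e-12)        # measured / estimated
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x

        # Tiny example: 2 voxels seen by 3 rays.
        A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        b = A @ np.array([2.0, 3.0])          # noiseless projections of [2, 3]
        print(mlem(A, b))                     # converges toward [2, 3]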

  16. The software package for solving problems of mathematical modeling of isothermal curing process

    S. G. Tikhomirov

    2016-01-01

    On the basis of the general laws of sulfur vulcanization of diene rubbers, the principles of effective cross-linking using multi-component agents are discussed. It is noted that describing the mechanism of action of complex cross-linking systems is complicated by the diversity of interactions among the components and the influence of each of them on the curing kinetics, which leads to a variety of complications in real production and affects the quality and the technical and economic indicators of rubber goods manufacture. Based on known theoretical approaches, a system analysis of the isothermal curing process was performed, integrating different techniques and methods into a single set. The analysis of vulcanization kinetics showed that the parameters of spatial network formation in vulcanizates depend on many factors, the assessment of which requires special mathematical and algorithmic support. As a result of the stratification of the object, the following major subsystems were identified. A software package for solving direct and inverse kinetic problems of the isothermal curing process was developed. The information support "Isothermal vulcanization" is a set of applications for mathematical modeling of isothermal curing, intended for direct and inverse kinetic problems. When solving the problem of clarifying the general scheme of chemical transformations, a universal mechanism including secondary chemical reactions was used. A functional minimization algorithm with constraints on the unknown parameters was used for solving the inverse kinetic problem. A flowchart of the program is shown, and an example of solving the inverse kinetic problem with the program is presented. The dataware was implemented in the programming language C++. A universal dependence to determine the initial concentration of the curing agent was applied, allowing the use of a model with different properties of multicomponent
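
    A minimal sketch of an inverse kinetic problem of this kind is shown below, assuming a stand-in nth-order cure model rather than the paper's full reaction scheme (the package itself is written in C++; the data values are invented).

        import numpy as np
        from scipy.optimize import least_squares

        def cure(t, k, n):
            """Conversion alpha(t) for d(alpha)/dt = k*(1 - alpha)**n, alpha(0) = 0."""
            if abs(n - 1.0) < 1e-9:
                return 1.0 - np.exp(-k * t)
            return 1.0 - (1.0 + (n - 1.0) * k * t) ** (1.0 / (1.0 - n))

        t_data = np.array([0.0, 2.0, 4.0, 8.0, 12.0, 20.0])     # time, min
        a_data = np.array([0.0, 0.28, 0.48, 0.72, 0.84, 0.95])  # measured conversion

        # Inverse problem: minimize the residual with constraints (bounds) on the
        # unknown parameters, as the abstract describes.
        fit = least_squares(lambda p: cure(t_data, p[0], p[1]) - a_data,
                            x0=[0.1, 1.2], bounds=([1e-6, 0.5], [10.0, 3.0]))
        k_fit, n_fit = fit.x
        print(f"k = {k_fit:.3f} 1/min, n = {n_fit:.2f}")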

  17. MOlecular MAterials Property Prediction Package (MOMAP) 1.0: a software package for predicting the luminescent properties and mobility of organic functional materials

    Niu, Yingli; Li, Wenqiang; Peng, Qian; Geng, Hua; Yi, Yuanping; Wang, Linjun; Nan, Guangjun; Wang, Dong; Shuai, Zhigang

    2018-04-01

    MOlecular MAterials Property Prediction Package (MOMAP) is a software toolkit for molecular materials property prediction. It focuses on luminescent properties and charge mobility properties. This article contains a brief descriptive introduction to the key features, theoretical models and algorithms of the software, together with examples that illustrate its performance. First, we present the theoretical models and algorithms for calculating molecular luminescent properties, including the excited-state radiative/non-radiative decay rate constants and the optical spectra. Then, a multi-scale simulation approach and its algorithm for molecular charge mobility are described. This approach is based on the hopping model, combined with kinetic Monte Carlo and molecular dynamics simulations, and is especially applicable to a large category of organic semiconductors whose inter-molecular electronic coupling is much smaller than the intra-molecular charge reorganisation energy.
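
    As a sketch of the hopping picture referred to above, the snippet below evaluates the standard Marcus charge-transfer rate and converts it to a mobility through a one-dimensional Einstein relation (illustrative parameter values, not MOMAP's multi-scale implementation, which obtains couplings and reorganisation energies from quantum chemistry and samples transport by kinetic Monte Carlo).

        import numpy as np

        hbar = 1.0545718e-34    # J s
        kB   = 1.380649e-23     # J/K
        e    = 1.602176634e-19  # C

        def marcus_rate(V_eV, lam_eV, dG_eV=0.0, T=300.0):
            """Marcus hopping rate (1/s) for coupling V and reorganisation energy lam."""
            V, lam, dG = V_eV * e, lam_eV * e, dG_eV * e
            pref = (2 * np.pi / hbar) * V**2 / np.sqrt(4 * np.pi * lam * kB * T)
            return pref * np.exp(-(dG + lam) ** 2 / (4 * lam * kB * T))

        # Invented numbers with coupling << reorganisation energy, as in the text.
        k_hop = marcus_rate(V_eV=0.02, lam_eV=0.20)
        d  = 4.0e-10                      # hop distance, m
        D  = d**2 * k_hop / 2.0           # 1D Einstein diffusion coefficient
        mu = e * D / (kB * 300.0)         # mobility, m^2/(V s)
        print(f"rate {k_hop:.3e} 1/s, mobility {mu * 1e4:.3f} cm^2/(V s)")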

  18. A new version of Scilab software package for the study of dynamical systems

    Bordeianu, C. C.; Felea, D.; Beşliu, C.; Jipa, Al.; Grossu, I. V.

    2009-11-01

    This work presents a new version of a software package for the study of chaotic flows, maps and fractals [1]. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamics systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability of the users inserting their own ODE or iterative equations. New version program summary - Program title: Chaos v2.0. Catalogue identifier: AEAP_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 1275. No. of bytes in distributed program, including test data, etc.: 7135. Distribution format: tar.gz. Programming language: Scilab 5.1.1 (Scilab 5.1.1 should be installed before running the program; information about the installation can be found at http://wiki.scilab.org/howto/install/windows). Computer: PC-compatible running Scilab on MS Windows or Linux. Operating system: Windows XP, Linux. RAM: below 150 Megabytes. Classification: 6.2. Catalogue identifier of previous version: AEAP_v1_0. Journal reference of previous version: Comput. Phys. Comm. 178 (2008) 788. Does the new version supersede the previous version?: Yes. Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE). Solution method: Numerical solving of
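
    As a small illustration of one quantity mentioned above, the largest Lyapunov exponent of the logistic map can be estimated from the time average of ln|f'(x)|; the sketch below uses Python, whereas the package itself is written in Scilab.

        import numpy as np

        def lyapunov_logistic(r, n=100_000, burn=1000):
            """Largest Lyapunov exponent of x -> r*x*(1-x) from the time
            average of ln|f'(x)| = ln|r*(1 - 2x)|."""
            x = 0.4
            for _ in range(burn):                 # discard the transient
                x = r * x * (1 - x)
            s = 0.0
            for _ in range(n):
                x = r * x * (1 - x)
                s += np.log(abs(r * (1 - 2 * x)))
            return s / n

        print(lyapunov_logistic(3.5))   # negative: periodic regime
        print(lyapunov_logistic(4.0))   # close to ln 2 ~ 0.693: chaotic regime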

  19. SBIR PHASE I FINAL REPORT: Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures

    Brust, Frederick W. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Punch, Edward F. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Kurth, Elizabeth A. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Kennedy, James C. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States)

    2013-12-02

    fabrication costs. VFT currently is tied to a commercial solver, which makes it prohibitively expensive for use by SMEs, as there is a significant licensing cost for the solver - over and above the relatively minimal cost of VFT. Emc2 developed this software code over a number of years in close cooperation with CAT (Peoria, IL), which currently uses this code exclusively for worldwide fabrication, product design and development activities. The use of VFT has allowed CAT to move directly from design to product fabrication and has helped eliminate (to a large extent) new product prototyping and subsequent testing. Additionally, CAT has been able to eliminate or reduce costly one-of-a-kind appliances used to reduce distortion effects due to fabrication. In this context, SMEs can realize the same kind of improved product quality and reduced cost through adoption of the adapted version of VFT for the design and subsequent manufacture of new products. Emc2's DOE SBIR Phase I effort successfully adapted VFT so that SMEs have access to this sophisticated and proven methodology, which is quick, accurate and cost-effective and available on demand to address weld-simulation and fabrication problems prior to manufacture. The open-source code WARP3D, a high-performance finite element code mainly used in fracture and damage assessment of structures, was modified so that computational weld problems can be solved efficiently on multiple processors and threads with VFT. The thermal solver for VFT, based on a series of closed-form solution approximations, was enhanced for solution on multiple processors, greatly increasing overall speed. In addition, the graphical user interface (GUI) has been tailored to integrate these solutions with WARP3D. The GUI is used to define all the weld pass descriptions, number of passes, material properties, consumable properties, weld speed, etc. for the structure to be modeled. The GUI was improved to make it user-friendly for engineers that are not experts in finite

  20. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction

    Jon Hill

    2014-03-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity research. No easy-to-use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses a well-defined XML schema to manage both data and metadata. It builds on previous versions by (1) including new processing steps, such as Safe Taxonomic Reduction, (2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and (3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  1. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction.

    Hill, Jon; Davis, Katie E

    2014-01-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity research. No easy-to-use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses a well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  2. EQ3/6, a software package for geochemical modeling of aqueous systems: Package overview and installation guide (Version 7.0)

    Wolery, T.J.

    1992-01-01

    EQ3/6 is a software package for geochemical modeling of aqueous systems. This report describes version 7.0. The major components of the package include: EQ3NR, a speciation-solubility code; EQ6, a reaction path code which models water/rock interaction or fluid mixing in either a pure reaction progress mode or a time mode; EQPT, a data file preprocessor; EQLIB, a supporting software library; and five supporting thermodynamic data files. The software deals with the concepts of thermodynamic equilibrium, thermodynamic disequilibrium, and reaction kinetics. The five supporting data files contain both standard state and activity coefficient-related data. Three support the use of the Davies or B-dot equations for the activity coefficients; the other two support the use of Pitzer's equations. The temperature range of the thermodynamic data on the data files varies from 25 °C only to 0-300 °C. EQPT takes a formatted data file (a data0 file) and writes an unformatted near-equivalent called a data1 file, which is actually the form read by EQ3NR and EQ6. EQ3NR is useful for analyzing groundwater chemistry data, calculating solubility limits, and determining whether certain reactions are in states of partial equilibrium or disequilibrium. It is also required to initialize an EQ6 calculation. EQ6 models the consequences of reacting an aqueous solution with a set of reactants which react irreversibly. It can also model fluid mixing and the consequences of changes in temperature. This code operates both in a pure reaction progress frame and in a time frame.

  3. EQ3/6, a software package for geochemical modeling of aqueous systems: Package overview and installation guide (Version 7.0)

    Wolery, T.J.

    1992-09-14

    EQ3/6 is a software package for geochemical modeling of aqueous systems. This report describes version 7.0. The major components of the package include: EQ3NR, a speciation-solubility code; EQ6, a reaction path code which models water/rock interaction or fluid mixing in either a pure reaction progress mode or a time mode; EQPT, a data file preprocessor; EQLIB, a supporting software library; and five supporting thermodynamic data files. The software deals with the concepts of thermodynamic equilibrium, thermodynamic disequilibrium, and reaction kinetics. The five supporting data files contain both standard state and activity coefficient-related data. Three support the use of the Davies or B-dot equations for the activity coefficients; the other two support the use of Pitzer's equations. The temperature range of the thermodynamic data on the data files varies from 25 °C only to 0-300 °C. EQPT takes a formatted data file (a data0 file) and writes an unformatted near-equivalent called a data1 file, which is actually the form read by EQ3NR and EQ6. EQ3NR is useful for analyzing groundwater chemistry data, calculating solubility limits, and determining whether certain reactions are in states of partial equilibrium or disequilibrium. It is also required to initialize an EQ6 calculation. EQ6 models the consequences of reacting an aqueous solution with a set of reactants which react irreversibly. It can also model fluid mixing and the consequences of changes in temperature. This code operates both in a pure reaction progress frame and in a time frame.
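
    A speciation calculation of the kind EQ3NR performs can be illustrated, at a vastly reduced scale, by solving the charge balance for a single monoprotic weak acid in pure water. The sketch below is a generic textbook example, not EQ3NR's algorithm or data.

        import numpy as np
        from scipy.optimize import brentq

        # Toy speciation problem: weak acid HA with total concentration CT (mol/L).
        Ka, Kw, CT = 10**-4.76, 1e-14, 1e-3   # acetic-acid-like constants, invented setup

        def charge_balance(h):
            oh = Kw / h
            a  = CT * Ka / (h + Ka)           # [A-] from the mass-action law
            return h - oh - a                 # electroneutrality residual

        h = brentq(charge_balance, 1e-14, 1.0)
        print(f"pH = {-np.log10(h):.2f}")     # about 3.9 for these inputs
        print(f"[HA] = {CT * h / (h + Ka):.2e} mol/L, [A-] = {CT * Ka / (h + Ka):.2e} mol/L")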

  4. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    Ashraf, Haseem; de Hoop, B; Shaker, S B

    2010-01-01

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms.

  5. Parallelization of an existing high energy physics event reconstruction software package

    Schiefer, R.; Francis, D.

    1996-01-01

    Software parallelization allows an efficient use of available computing power to increase the performance of applications. In a case study the authors have investigated the parallelization of high energy physics event reconstruction software in terms of costs (effort, computing resource requirements), benefits (performance increase) and the feasibility of a systematic parallelization approach. Guidelines facilitating a parallel implementation are proposed for future software development

  6. Quantitative comparison and evaluation of two commercially available, two-dimensional electrophoresis image analysis software packages, Z3 and Melanie.

    Raman, Babu; Cheung, Agnes; Marten, Mark R

    2002-07-01

    While a variety of software packages are available for analyzing two-dimensional electrophoresis (2-DE) gel images, no comparisons between these packages have been published, making it difficult for end users to determine which package would best meet their needs. The goal here was to develop a set of tests to quantitatively evaluate and then compare two software packages, Melanie 3.0 and Z3, in three of the fundamental steps involved in 2-DE image analysis: (i) spot detection, (ii) gel matching, and (iii) spot quantitation. To test spot detection capability, automatically detected protein spots were compared to manually counted, "real" protein spots. Spot matching efficiency was determined by comparing distorted (both geometrically and nongeometrically) gel images with undistorted original images, and quantitation tests were performed on artificial gels with spots of varying Gaussian volumes. In spot detection tests, Z3 performed better than Melanie 3.0 and required minimal user intervention to detect approximately 89% of the actual protein spots and relatively few extraneous spots. Results from gel matching tests depended on the type of image distortion used. For geometric distortions, Z3 performed better than Melanie 3.0, matching 99% of the spots, even for extreme distortions. For nongeometrical distortions, both Z3 and Melanie 3.0 required user intervention and performed comparably, matching 95% of the spots. In spot quantitation tests, both Z3 and Melanie 3.0 predicted spot volumes relatively well for spot ratios less than 1:6. For higher ratios, Melanie 3.0 did much better. In summary, results suggest Z3 requires less user intervention than Melanie 3.0, thus simplifying differential comparison of 2-DE gel images. Melanie 3.0, however, offers many more optional tools for image editing, spot detection, data reporting and statistical analysis than Z3. All image files used for these tests and updated information on the software are available on the internet

  7. The high performance cluster computing system for BES offline data analysis

    Sun Yongzhao; Xu Dong; Zhang Shaoqiang; Yang Ting

    2004-01-01

    A high performance cluster computing system (EPCfarm) used for BES offline data analysis is introduced. The setup and the characteristics of the hardware and software of EPCfarm are described. PBS, a queue management package, and the performance of EPCfarm are also presented. (authors)

  8. Development and Evaluation of an Open-Source Software Package “CGITA” for Quantifying Tumor Heterogeneity with Molecular Images

    Yu-Hua Dean Fang

    2014-01-01

    Background. The quantification of tumor heterogeneity with molecular images, by analyzing the local or global variation in the spatial arrangements of pixel intensity with texture analysis, possesses great clinical potential for treatment planning and prognosis. To address the lack of available software for computing tumor heterogeneity in the public domain, we developed a software package, namely the Chang-Gung Image Texture Analysis (CGITA) toolbox, and provide it to the research community as a free, open-source project. Methods. With a user-friendly graphical interface, CGITA provides users with an easy way to compute more than seventy heterogeneity indices. To test and demonstrate the usefulness of CGITA, we used a small cohort of eighteen locally advanced oral cavity (ORC) cancer patients treated with definitive radiotherapy. Results. In our case study of ORC data, we found that more than ten of the currently implemented heterogeneity indices outperformed SUVmean for outcome prediction in the ROC analysis with a higher area under the curve (AUC). Heterogeneity indices provide an area under the curve of up to 0.9, compared with 0.6 and 0.52 for SUVmean and TLG, respectively. Conclusions. CGITA is a free and open-source software package to quantify tumor heterogeneity from molecular images. CGITA is available free for academic use at http://code.google.com/p/cgita.

  9. Dose - a software package for the calculation of integrated exposure resulting from an accident in a nuclear power plant

    Doron, E.; Ohaion, H.; Asculai, E.

    1985-05-01

    A software package intended for the assessment of risks resulting from an accidental release of radioactive materials from a nuclear power plant is presented. The models, and the various programs based on them, are described. The work includes detailed operating instructions for the various programs, as well as instructions for the preparation of the necessary input data. Various options are described for additions and changes to the programs, with the aim of extending their usefulness to more general cases in terms of meteorology and pollution sources. Finally, a sample calculation is given that enables the user to test the proper functioning of the whole package, as well as his own proficiency in its use. (author)
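
    The dispersion calculation at the core of such a package can be sketched with the classic ground-reflected Gaussian plume formula; all numbers below, including the dose-rate coefficient, are invented for illustration, and the package's actual models may differ.

        import numpy as np

        def plume_conc(Q, u, y, z, H, sigma_y, sigma_z):
            """Ground-reflected Gaussian plume concentration (Bq/m^3) for release
            rate Q (Bq/s), wind speed u (m/s), effective release height H (m);
            sigma_y and sigma_z come from stability-class curves at the downwind
            distance of interest."""
            cy = np.exp(-y**2 / (2 * sigma_y**2))
            cz = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                  np.exp(-(z + H)**2 / (2 * sigma_z**2)))   # reflection at the ground
            return Q / (2 * np.pi * u * sigma_y * sigma_z) * cy * cz

        # Plume centreline at ground level, roughly 1 km downwind (invented sigmas).
        chi = plume_conc(Q=1e9, u=3.0, y=0.0, z=0.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
        dcc = 1.0e-13   # hypothetical dose-rate coefficient, (Sv/s)/(Bq/m^3)
        print(f"chi = {chi:.3e} Bq/m^3, integrated dose over 2 h = {chi * dcc * 7200:.3e} Sv")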

  10. SIMSAS - a window based software package for simulation and analysis of multiple small-angle scattering data

    Jayaswal, B.; Mazumder, S.

    1998-09-01

    Small-angle scattering data from strongly scattering systems, e.g. porous materials, cannot be analysed by invoking the single scattering approximation, as specimens needed to replicate the bulk matrix in its essential properties are too thick for the approximation to hold. The presence of multiple scattering is indicated by the breakdown of the functional invariance property of the observed scattering profile under variation of sample thickness and/or wavelength of the probing radiation. This article delineates how failing to account for multiple scattering affects the results of analysis, and how to correct the data for its effect. It presents an algorithm to extract the single scattering profile from small-angle scattering data affected by multiple scattering. The algorithm can process the scattering data and deduce the single scattering profile in absolute scale. A software package, SIMSAS, is introduced for executing this inversion step. This package is useful both for simulating and for analysing multiple small-angle scattering data. (author)

  11. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, which is jointly funded by the US NRC and EPRI toward formulating guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, User's Manual. Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, to form three Classes). A V&V guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  12. Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V&V guideline packages and procedures. Volume 5

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, to form three Classes). A V&V guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  13. Evaluation of a software package for automated quality assessment of contrast detail images - comparison with subjective visual assessment

    Pascoal, A; Lawinski, C P; Honey, I; Blake, P

    2005-01-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of image quality assessment, such as automated scoring, have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance in assessing absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and detector KERMA, which indicates the potential to use the CDRAD analyser software for assessment of relative IQ.

  14. Analysing the Zenith Tropospheric Delay Estimates in On-line Precise Point Positioning (PPP) Services and PPP Software Packages.

    Mendez Astudillo, Jorge; Lau, Lawrence; Tang, Yu-Ting; Moore, Terry

    2018-02-14

    As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. Estimated ZTD can be very useful for meteorological applications; an example is the estimation of water vapor content in the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also provides GNSS users and researchers insight into the dependence of PPP ZTD estimation on the processing algorithm. Observation data of eight whole days from a total of nine International GNSS Service (IGS) tracking stations spread over the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product of the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.

  15. Analysing the Zenith Tropospheric Delay Estimates in On-line Precise Point Positioning (PPP) Services and PPP Software Packages

    Jorge Mendez Astudillo

    2018-02-01

    As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. Estimated ZTD can be very useful for meteorological applications; an example is the estimation of water vapor content in the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also provides GNSS users and researchers insight into the dependence of PPP ZTD estimation on the processing algorithm. Observation data of eight whole days from a total of nine International GNSS Service (IGS) tracking stations spread over the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product of the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.

  16. Company's Unusual Plan to Package Commercial Software with Business Textbooks Produces a Measure of Success.

    Watkins, Beverly T.

    1992-01-01

    Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)

  17. Groundwater movement simulation by the software package PM5 for the Sviyaga river adjoining territory in the Republic of Tatarstan

    Kosterina, E. A.; Isagadzhieva, Z. Sh

    2018-01-01

    Data from the ecological-hydrogeological fieldwork in the Predvolzhye region of the Republic of Tatarstan were analyzed. A geofiltration model of the Buinsk region area near the village of Stary Studenets, in the territory of the Republic of Tatarstan, was constructed with the PM5 software package. The model can be developed into a basis for estimating the groundwater reserves of the territory, modeling the operation of water intake wells, designing the location of water intake wells and evaluating their operational capabilities, and constructing sanitary protection zones.

  18. RavenDB high performance

    Ritchie, Brian

    2013-01-01

    RavenDB High Performance is a comprehensive yet concise tutorial that developers can use. This book is for developers and software architects who are designing systems in order to achieve high performance right from the start. A basic understanding of RavenDB is recommended, but not required. While the book focuses on advanced topics, it does not assume that the reader has a great deal of prior knowledge of working with RavenDB.

  19. Evaluation of three state-of-the-art metabolite prediction software packages (Meteor, MetaSite, and StarDrop) through independent and synergistic use.

    T'jollyn, H; Boussery, K; Mortishire-Smith, R J; Coe, K; De Boeck, B; Van Bocxlaer, J F; Mannens, G

    2011-11-01

    The aim of this study was to evaluate three different metabolite prediction software packages (Meteor, MetaSite, and StarDrop) with respect to their ability to predict loci of metabolism and suggest relative proportions of metabolites. A chemically diverse test set of 22 compounds, for which in vivo human mass balance studies and metabolic schemes were available, was used as basis for the evaluation. Each software package was provided with structures of the parent compounds, and predicted metabolites were compared with experimentally determined human metabolites. The evaluation consisted of two parts. First, different settings within each software package were investigated and the software was evaluated using those settings determined to give the best prediction. Second, the three different packages were combined using the optimized settings to see whether a synergistic effect concerning the overall metabolism prediction could be established. The performance of the software was scored for both sensitivity and precision, taking into account the capabilities/limitations of the particular software. Varying results were obtained for the individual packages. Meteor showed a general tendency toward overprediction, and this led to a relatively low precision (∼35%) but high sensitivity (∼70%). MetaSite and StarDrop both exhibited a sensitivity and precision of ∼50%. By combining predictions obtained with the different packages, we found that increased precision can be obtained. We conclude that the state-of-the-art individual metabolite prediction software has many advantageous features but needs refinement to obtain acceptable prediction profiles. Synergistic use of different software packages could prove useful.

  20. The arison data acquisition and elaboration software package running on Hewlett-Packard minicomputer

    Diamantidis, Z.

    1987-01-01

    In this article, a data acquisition and elaboration system is described, consisting of a PCM data acquisition system, a spectrum analyser and their data elaboration package, intended for reactor safety and other general purposes. Measurements include temperature, noise fluctuations of temperature and other noise dynamics, pressure, etc. Acquisition of time series, their conversion into engineering units, and their statistical and frequency analysis are provided.

  1. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images.

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane

    2017-08-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and the length of the upper airway using the Amira® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys® (3diemme, Cantu, Italy) and OnDemand3D® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements were determined using intraclass correlation coefficients and Bland & Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to be used as the "gold standard". This phantom was subsequently scanned using a NewTom 5G® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira®, 3Diagnosys®, and OnDemand3D®, and compared these measurements with the gold standard. The intra- and inter-observer reliability of the measurements of the upper airway using the different software packages were excellent (intraclass correlation coefficient ≥0.75). There was excellent agreement between all three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway. The length measurements of the upper airway were the most accurate results in all software packages. All

  2. The last developments of the airGR R-package, an open source software for rainfall-runoff modelling

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Perrin, Charles; Andréassian, Vazken

    2017-04-01

    and usability of this tool. References: Coron, L., Thirel, G., Perrin, C., Delaigue, O., Andréassian, V., airGR: a suite of lumped hydrological models in an R-package, Environmental Modelling and Software, 2017, submitted. Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en. R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.

  3. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Wang Kai

    2011-05-01

    Background: Gene-gene interaction analysis in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) have hundreds of cores and have recently been used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis of binary traits. Findings: Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes in parallel (1) the interaction of SNPs within the fragment, and (2) the interaction between the SNPs of the current fragment and those of other fragments. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode. Conclusions: GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
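
    A crude stand-in for the parallel pairwise analysis described above is sketched below, using Python's multiprocessing in place of GENIE's GPU kernels and a simple chi-square statistic over the nine two-locus genotype classes (all data simulated).

        import itertools
        import numpy as np
        from multiprocessing import Pool
        from scipy.stats import chi2_contingency

        rng = np.random.default_rng(1)
        n_snps, n_subj = 20, 500
        geno  = rng.integers(0, 3, size=(n_snps, n_subj))   # 0/1/2 genotype codes
        pheno = rng.integers(0, 2, size=n_subj)             # binary trait

        def interaction_test(pair):
            """Chi-square on case/control counts over the 9 two-locus genotype
            classes - a toy interaction statistic, not GENIE's own test."""
            i, j = pair
            cell = geno[i] * 3 + geno[j]                    # combined genotype 0..8
            table = np.zeros((2, 9))
            for c in range(9):
                table[0, c] = np.sum((cell == c) & (pheno == 0))
                table[1, c] = np.sum((cell == c) & (pheno == 1))
            keep = table.sum(axis=0) > 0                    # drop empty classes
            _, p, _, _ = chi2_contingency(table[:, keep])
            return i, j, p

        if __name__ == "__main__":
            pairs = list(itertools.combinations(range(n_snps), 2))
            with Pool() as pool:                            # one worker per CPU core
                results = pool.map(interaction_test, pairs)
            print(min(results, key=lambda r: r[2]))         # most significant pair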

  4. The consequences of a new software package for the quantification of gated-SPECT myocardial perfusion studies

    Veen, Berlinda J. van der; Dibbets-Schneider, Petra; Stokkel, Marcel P.M.; Scholte, Arthur J.

    2010-01-01

    Semiquantitative analysis of myocardial perfusion scintigraphy (MPS) has reduced inter- and intraobserver variability, and enables researchers to compare parameters in the same patient over time, or between groups of patients. There are several software packages available that are designed to process MPS data and quantify parameters. In this study the performances of two systems, quantitative gated SPECT (QGS) and 4D-MSPECT, in the processing of clinical patient data and phantom data were compared. The clinical MPS data of 148 consecutive patients were analysed using QGS and 4D-MSPECT to determine the end-diastolic volume, end-systolic volume and left ventricular ejection fraction. Patients were divided into groups based on gender, body mass index, heart size, stressor type and defect type. The AGATE dynamic heart phantom was used to provide reference values for the left ventricular ejection fraction. Although the correlations were excellent (correlation coefficients 0.886 to 0.980) for all parameters, significant differences (p < 0.001) were found between the systems. Bland-Altman plots indicated that 4D-MSPECT provided overall higher values of all parameters than QGS. These differences between the systems were not significant in patients with a small heart (end-diastolic volume <70 ml). Other clinical factors had no direct influence on the relationship. Additionally, the phantom data indicated good linear responses of both systems. The discrepancies between these software packages were clinically relevant, and influenced by heart size. The possibility of such discrepancies should be taken into account when a new quantitative software system is introduced, or when multiple software systems are used in the same institution. (orig.)
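
    The quantities compared above follow directly from the ventricular volumes, and the agreement analysis is the standard Bland-Altman computation; a sketch with invented paired measurements:

        import numpy as np

        # Hypothetical paired volumes (ml) from two quantification packages.
        edv_a, esv_a = np.array([120., 95., 160., 60.]), np.array([50., 40., 80., 25.])
        edv_b, esv_b = np.array([131., 104., 172., 70.]), np.array([54., 45., 86., 30.])

        def lvef(edv, esv):
            """Left ventricular ejection fraction (%): (EDV - ESV) / EDV * 100."""
            return (edv - esv) / edv * 100.0

        diff = lvef(edv_b, esv_b) - lvef(edv_a, esv_a)
        bias = diff.mean()                 # systematic difference between systems
        loa  = 1.96 * diff.std(ddof=1)     # 95% limits of agreement
        print(f"bias = {bias:+.1f} EF points, limits of agreement = "
              f"[{bias - loa:.1f}, {bias + loa:.1f}]")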

  5. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected
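
    One of the BNP priors listed above, the Dirichlet process, can be illustrated by its truncated stick-breaking construction; the sketch below (invented parameters) draws mixture weights and atoms, whereas the package itself fits such models to data by MCMC sampling.

        import numpy as np

        def stick_breaking(alpha, n_atoms, rng):
            """Truncated stick-breaking weights of a Dirichlet process DP(alpha, G0)."""
            betas = rng.beta(1.0, alpha, size=n_atoms)
            remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
            return betas * remaining

        rng = np.random.default_rng(7)
        w = stick_breaking(alpha=2.0, n_atoms=50, rng=rng)
        atoms = rng.normal(0.0, 3.0, size=50)     # atom locations drawn from G0

        # Used as a mixing distribution over normal kernels, the draw defines an
        # infinite-mixture density (truncated here at 50 components).
        x = np.linspace(-8.0, 8.0, 5)
        density = sum(wk * np.exp(-(x - ak)**2 / 2) / np.sqrt(2 * np.pi)
                      for wk, ak in zip(w, atoms))
        print(np.round(w[:5], 3), np.round(density, 4))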

  6. A software package for predicting design-flood hydrographs in small and ungauged basins

    Rodolfo Piscopia; Andrea Petroselli; Salvatore Grimaldi

    2015-01-01

    In this study, software for estimating design hydrographs in small and ungauged basins is presented. The main aim is to propose a fast and user-friendly empirical tool that the practitioner can apply in hydrological studies characterised by a lack of observed data. The software implements a framework of the same name, the event-based approach for small and ungauged basins (EBA4SUB), which was recently developed and tested by the authors to estimate the design peak discharge using the same input in...

  7. High performance data transfer

    Cottrell, R.; Fang, C.; Hanushevsky, A.; Kreuger, W.; Yang, W.

    2017-10-01

    The exponentially increasing need for high-speed data transfer is driven by big data and cloud computing, together with the needs of data-intensive science, High Performance Computing (HPC), defense, the oil and gas industry, etc. We report on the Zettar ZX software. This has been developed since 2013 to meet these growing needs by providing high-performance data transfer and encryption in a scalable, balanced way that is easy to deploy and use, while minimizing power and space utilization. In collaboration with several commercial vendors, Proofs of Concept (PoC) consisting of clusters have been put together using off-the-shelf components to test the ZX scalability and its ability to balance services across multiple cores and links. The PoCs are based on SSD flash storage managed by a parallel file system. Each cluster occupies 4 rack units. Using the PoCs, we have achieved between clusters almost 200 Gbps memory-to-memory over two 100 Gbps links, and 70 Gbps parallel file to parallel file with encryption over a 5000-mile 100 Gbps link.

  8. [Development of analysis software package for the two kinds of Japanese fluoro-d-glucose-positron emission tomography guideline].

    Matsumoto, Keiichi; Endo, Keigo

    2013-06-01

    Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Windows™ is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy of numerous PETquact applications was examined. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from PETquact and from the report. PETquact is suited to the analyses required by the two kinds of Japanese guideline, and it performs well in performance measurement and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.

  9. PmagPy: Software Package for Paleomagnetic Data Analysis and Gateway to the Magnetics Information Consortium (MagIC) Database

    Jonestrask, L.; Tauxe, L.; Shaar, R.; Jarboe, N.; Minnett, R.; Koppers, A. A. P.

    2014-12-01

    There are many data types and methods of analysis in rock and paleomagnetic investigations. The MagIC database (http://earthref.org/MAGIC) was designed to accommodate the vast majority of data used in such investigations. Yet getting data from the laboratory into the database, and visualizing and re-analyzing data downloaded from the database, makes special demands on data formatting. There are several recently published programming packages that deal with single types of data: demagnetization experiments (e.g., Lurcock et al., 2012), paleointensity experiments (e.g., Leonhardt et al., 2004), and FORC diagrams (e.g., Harrison et al., 2008). However, there is a need for a unified set of open-source, cross-platform software that deals with the great variety of data types in a consistent way and facilitates importing data into the MagIC format, analyzing them and uploading them into the MagIC database. The PmagPy software package (http://earthref.org/PmagPy/cookbook/) comprises such a comprehensive set of tools. It facilitates conversion of many laboratory formats into the common MagIC format and allows interpretation of demagnetization and Thellier-type experimental data. With some 175 programs and over 250 functions, it can be used to create a wide variety of plots and allows manipulation of downloaded data sets as well as preparation of new contributions for uploading to the MagIC database.
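
    A staple computation in this field, and one any such toolkit must provide in some form, is the Fisher (1953) mean of a set of directions; the generic implementation below (not PmagPy's own API) applies the usual resultant-vector formulas for the mean declination/inclination, the precision parameter kappa and the 95% confidence angle alpha95.

        import numpy as np

        def fisher_mean(decs, incs):
            """Fisher (1953) statistics for directions given in degrees."""
            d, i = np.radians(decs), np.radians(incs)
            xyz = np.column_stack([np.cos(i) * np.cos(d),
                                   np.cos(i) * np.sin(d),
                                   np.sin(i)])
            r_vec = xyz.sum(axis=0)
            R, n = np.linalg.norm(r_vec), len(decs)
            mdec = np.degrees(np.arctan2(r_vec[1], r_vec[0])) % 360.0
            minc = np.degrees(np.arcsin(r_vec[2] / R))
            kappa = (n - 1) / (n - R)                      # precision parameter
            a95 = np.degrees(np.arccos(
                1.0 - (n - R) / R * ((1.0 / 0.05) ** (1.0 / (n - 1)) - 1.0)))
            return mdec, minc, kappa, a95

        # Invented site directions (declination, inclination in degrees).
        decs = np.array([357.0, 2.5, 5.0, 352.0, 0.5])
        incs = np.array([48.0, 52.0, 45.0, 50.0, 47.0])
        print(fisher_mean(decs, incs))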

  10. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline, from image segmentation, three-dimensional (3D) solid modeling and mesh generation to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  11. Identifying High Performance ERP Projects

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...
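
    The abstract breaks off here, but Data Envelopment Analysis itself is standard; below is a sketch of the input-oriented, variable-returns-to-scale (BCC) efficiency score as a linear program, with invented project data (one input, two outputs), since that model handles exactly the variable returns to scale and multivariate outputs the authors mention.

        import numpy as np
        from scipy.optimize import linprog

        def bcc_efficiency(X, Y, o):
            """Input-oriented BCC DEA efficiency of unit o.
            X: (m inputs x n units), Y: (s outputs x n units); returns theta in (0, 1]."""
            (m, n), s = X.shape, Y.shape[0]
            c = np.zeros(n + 1); c[0] = 1.0                  # minimise theta
            A_ub = np.zeros((m + s, n + 1)); b_ub = np.zeros(m + s)
            A_ub[:m, 0]  = -X[:, o]                          # X @ lam <= theta * x_o
            A_ub[:m, 1:] = X
            A_ub[m:, 1:] = -Y                                # Y @ lam >= y_o
            b_ub[m:] = -Y[:, o]
            A_eq = np.zeros((1, n + 1)); A_eq[0, 1:] = 1.0   # sum(lam) = 1 (VRS)
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                          bounds=[(0, None)] * (n + 1))
            return res.x[0]

        # Invented ERP-project data: input = effort; outputs = two size measures.
        X = np.array([[100.0, 120.0, 150.0, 90.0]])
        Y = np.array([[300.0, 320.0, 500.0, 200.0],
                      [ 40.0,  60.0,  55.0,  30.0]])
        for o in range(X.shape[1]):
            print(f"project {o}: efficiency = {bcc_efficiency(X, Y, o):.3f}")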

  12. Particle Data Management Software for 3D Particle Tracking Velocimetry and Related Applications – The Flowtracks Package

    Yosef Meller

    2016-06-01

    The Particle Tracking Velocimetry (PTV) community employs several formats of particle information, such as position and velocity as a function of time, i.e. trajectory data, as a result of diverging needs unmet by existing formats, and a number of different, mostly home-grown codes for handling the data. Flowtracks is a Python package that provides a single code base for accessing different formats as a database, i.e. storing data and programmatically manipulating them using format-agnostic data structures. Furthermore, it offers an HDF5-based format that is fast and extensible, obviating the need for other formats. The package may be obtained from https://github.com/OpenPTV/postptv and used as-is by many fluid-dynamics labs, or, with minor extensions adhering to a common interface, by researchers from other fields, such as biology and population tracking.
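
    An HDF5 trajectory store of the general kind described above can be sketched with h5py; the group/dataset layout below is hypothetical, not Flowtracks' actual schema.

        import h5py
        import numpy as np

        rng = np.random.default_rng(3)

        # Write: one group per trajectory, fixed-name datasets inside (invented layout).
        with h5py.File("trajectories.h5", "w") as f:
            for traj_id in range(3):
                n = int(rng.integers(10, 20))
                g = f.create_group(f"trajectory_{traj_id:06d}")
                g.create_dataset("time",     data=np.arange(n) / 500.0)     # s (500 fps)
                g.create_dataset("position", data=rng.normal(size=(n, 3)))  # m
                g.create_dataset("velocity", data=rng.normal(size=(n, 3)))  # m/s
                g.attrs["particle_id"] = traj_id

        # Read back format-agnostically: iterate trajectories as database records.
        with h5py.File("trajectories.h5", "r") as f:
            for name, g in f.items():
                speed = np.linalg.norm(g["velocity"][:], axis=1)
                print(name, "mean speed:", speed.mean())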

  13. An analysis of distribution transformer failure using the statistical package for the social sciences (SPSS) software

    María Gabriela Mago Ramos

    2012-05-01

    A methodology was developed for analysing faults in distribution transformers using the Statistical Package for the Social Sciences (SPSS). It consisted of organising and creating a database of failed equipment, incorporating these data into the processing programme and converting all the information into numerical variables to be processed, thereby obtaining descriptive statistics and enabling factor and discriminant analysis. The research was based on information provided by companies in areas served by Corpoelec (Valencia, Venezuela) and Codensa (Bogotá, Colombia).

  14. GERMINATOR: a software package for high-throughput scoring and curve fitting of Arabidopsis seed germination.

    Joosen, Ronny V L; Kodde, Jan; Willems, Leo A J; Ligterink, Wilco; van der Plas, Linus H W; Hilhorst, Henk W M

    2010-04-01

    Over the past few decades seed physiology research has contributed to many important scientific discoveries and has provided valuable tools for the production of high quality seeds. An important instrument for this type of research is the accurate quantification of germination; however, gathering cumulative germination data is a very laborious task that often prohibits the execution of large experiments. In this paper we present the germinator package: a simple, highly cost-efficient and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The germinator package contains three modules: (i) design of experimental setup with various options to replicate and randomize samples; (ii) automatic scoring of germination based on the color contrast between the protruding radicle and the seed coat on a single image; and (iii) curve fitting of cumulative germination data and the extraction, recap and visualization of the various germination parameters. The curve-fitting module enables analysis of general cumulative germination data and can be used for all plant species. We show that the automatic scoring system works for Arabidopsis thaliana and Brassica spp. seeds, and is likely to be applicable to other species as well. In this paper we show the accuracy, reproducibility and flexibility of the germinator package. We have successfully applied it to evaluate natural variation for salt tolerance in a large population of recombinant inbred lines and were able to identify several quantitative trait loci for salt tolerance. Germinator is a low-cost package that allows a single person to monitor several thousands of germination tests several times a day.
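
    The curve-fitting step can be illustrated with SciPy: fitting a Hill-type curve to cumulative germination counts and extracting summary parameters such as maximum germination and t50. The exact model Germinator fits may differ in detail; the data below are synthetic:

    ```python
    # Fit a four-parameter Hill-type function to cumulative germination
    # data (a sketch of the curve-fitting module; synthetic data).
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(t, y0, gmax, t50, b):
        """Cumulative germination (%) as a function of time (h)."""
        return y0 + (gmax - y0) * t**b / (t50**b + t**b)

    t = np.array([0, 12, 24, 36, 48, 60, 72, 96], dtype=float)
    g = np.array([0, 2, 15, 48, 75, 88, 93, 95], dtype=float)

    popt, _ = curve_fit(hill, t, g, p0=[0, 95, 40, 4], maxfev=10000)
    y0, gmax, t50, b = popt
    print(f"max germination {gmax:.1f}%, t50 {t50:.1f} h, slope {b:.2f}")
    ```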

  15. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets (13C- and/or 15N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  16. Area of ischemia assessed by physicians and software packages from myocardial perfusion scintigrams

    Edenbrandt, L.; Hoglund, P.; Frantz, S.

    2014-01-01

    Background: The European Society of Cardiology recommends that patients with > 10% area of ischemia should receive revascularization. We investigated inter-observer variability in the extent of ischemic defects reported by different physicians and by different software tools, and whether inter-observer variability decreased when physicians were provided with a software-suggested delineation. Methods: Eleven physicians specialized in nuclear medicine delineated the extent of the ischemic defects in myocardial perfusion scintigrams from 25 patients. After at least two weeks, they delineated the defects again, and were this time provided a suggestion of the defect delineation by EXINI Heart(TM) (EXINI). Summed difference scores and ischemic extent values were also obtained from four software programs. Results: The median extent values obtained from the 11 physicians varied between 8% and 34%, and between 9% and 16% for the software programs. For all 25 patients, mean extent obtained from EXINI was 17.0% (+/- standard deviation (SD) 14.6%). Mean extent for physicians was 22.6% (+/- 15.6%) for the first delineation ...

  17. Counting radon tracks in Makrofol detectors with the 'image reduction and analysis facility' (IRAF) software package

    Hernandez, F. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain)]. E-mail: fimerall@ull.es; Gonzalez-Manrique, S. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Karlsson, L. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Hernandez-Armas, J. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Aparicio, A. [Instituto de Astrofisica de Canarias, 38200 La Laguna, Tenerife (Spain); Departamento de Astrofisica, Universidad de La Laguna. Avenida. Astrofisico Francisco Sanchez s/n, 38071 La Laguna, Tenerife (Spain)

    2007-03-15

    Makrofol detectors are commonly used for long-term radon (222Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires counting the nuclear tracks produced by alpha particles in the detecting material. The 'image reduction and analysis facility' (IRAF) software package is commonly used in astronomical applications. It allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity between counting stars in a dark sky and counting tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. The proposed semi-automatic method and its performance, in comparison to ocular counting, are described in detail here. In addition, the calibration factor for this procedure, 2.97 ± 0.07 kBq m⁻³ h track⁻¹ cm², has been calculated from the results obtained by exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m⁻³ and, in two of them, above 400 Bq m⁻³. Further studies should be performed at those schools following the European Union recommendations about radon concentrations in
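
    The track-counting idea can be sketched with photutils' DAOStarFinder, a modern analogue of IRAF's daofind source detection, followed by the calibration factor reported above to convert track density into radon concentration. The image file, detection parameters, detector area and exposure time below are illustrative assumptions:

    ```python
    # Count blob-like tracks on a scanned detector image, then apply the
    # paper's calibration factor. Parameters and file name are illustrative.
    import numpy as np
    from astropy.io import fits
    from photutils.detection import DAOStarFinder

    image = fits.getdata("makrofol_scan.fits").astype(float)
    bkg, noise = np.median(image), np.std(image)

    finder = DAOStarFinder(fwhm=4.0, threshold=5.0 * noise)
    sources = finder(image - bkg)
    n_tracks = 0 if sources is None else len(sources)

    area_cm2 = 1.5          # scanned detector area (assumed)
    exposure_h = 90 * 24    # 90-day exposure (assumed)
    k = 2.97                # kBq m^-3 h per (track cm^-2), from the paper
    radon_kbq_m3 = k * (n_tracks / area_cm2) / exposure_h
    print(f"{n_tracks} tracks -> {1000 * radon_kbq_m3:.0f} Bq/m3")
    ```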

  18. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with
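
    The relationship iNA exploits can be checked by hand on the simplest case: for a linear birth-death process (production rate k, per-molecule degradation rate g) the Linear Noise Approximation predicts a stationary mean and variance both equal to k/g, which a Gillespie (SSA) run reproduces. A self-contained sketch with illustrative rates:

    ```python
    # Gillespie SSA for a birth-death process, compared with the LNA
    # prediction mean = variance = k/g (Poissonian stationary state).
    import numpy as np

    rng = np.random.default_rng(0)
    k, g = 50.0, 1.0                      # birth rate, death rate per molecule
    n, t = 0, 0.0
    grid = np.arange(100.0, 2000.0, 1.0)  # sample on a grid after a transient
    samples, i = [], 0
    while i < len(grid):
        total = k + g * n                 # total propensity
        dt = rng.exponential(1.0 / total)
        while i < len(grid) and t + dt > grid[i]:
            samples.append(n)             # state holds until the next jump
            i += 1
        t += dt
        n += 1 if rng.random() < k / total else -1

    samples = np.array(samples)
    print(f"SSA mean {samples.mean():.1f}, var {samples.var():.1f}")
    print(f"LNA mean {k/g:.1f}, var {k/g:.1f}")
    ```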

  19. A software package for predicting design-flood hydrographs in small and ungauged basins

    Rodolfo Piscopia

    2015-06-01

    In this study, software for estimating design hydrographs in small and ungauged basins is presented. The main aim is to propose a fast and user-friendly empirical tool that the practitioner can apply to hydrological studies characterised by a lack of observed data. The software implements a homonymous framework called the event-based approach for small and ungauged basins (EBA4SUB), which was recently developed and tested by the authors to estimate the design peak discharge using the same input information necessary to apply the rational formula. EBA4SUB is a classical hydrological event-based model in which each step (design hyetograph, net rainfall estimation, and rainfall-runoff transformation) is appropriately adapted for empirical applications without calibration. As a case study, the software is applied in a small watershed while varying the hyetograph shape, rainfall peak position, and return time. The results provide an overview of the software and confirm the secondary role of the design rainfall peak position.
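
    A generic event-based sketch of the pipeline such a tool automates (design hyetograph, net rainfall, rainfall-runoff transformation), here using the SCS Curve Number method and a small unit hydrograph; this is not EBA4SUB's exact formulation and all parameter values are illustrative:

    ```python
    # SCS-CN net rainfall followed by unit-hydrograph convolution
    # (generic event-based scheme; parameters are illustrative).
    import numpy as np

    cn = 75.0
    s = 25400.0 / cn - 254.0          # potential retention (mm)
    ia = 0.2 * s                      # initial abstraction (mm)

    rain = np.array([2.0, 8.0, 25.0, 15.0, 6.0, 2.0])  # hyetograph (mm/step)
    cum = np.cumsum(rain)
    runoff_cum = np.where(cum > ia, (cum - ia) ** 2 / (cum - ia + s), 0.0)
    net = np.diff(runoff_cum, prepend=0.0)             # net rainfall per step

    uh = np.array([0.1, 0.3, 0.3, 0.2, 0.1])           # unit hydrograph (1/step)
    q = np.convolve(net, uh)                           # direct-runoff hydrograph
    print(f"peak {q.max():.1f} mm/step at step {q.argmax()}")
    ```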

  20. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably.

    Ashraf, H.; Hoop, B.J. de; Shaker, S.B.; Dirksen, A.; Bach, K.S.; Hansen, H.; Prokop, M.; Pedersen, J.H.

    2010-01-01

    OBJECTIVE: We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. METHODS: In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were

  1. A software package for patient-specific dosimetry in the locoregional RIT of gliomas using 188Re labelled NIMOTUZUMAB

    Torres, L.A.; Coca, M.A.; Sanchez, Y.; Cornejo, N.; Catasus, C.; Denaro, M. de

    2008-01-01

    The locoregional treatment of high-grade gliomas using beta-emitter compounds allows delivering high radiation doses to the tumor bed and the adjacent brain tissues of patients suffering from these aggressive malignancies. The main goal of this work was to implement patient-specific dosimetry procedures using a voxel-based methodology in order to compute and analyze the three-dimensional dose distributions received by patients undergoing locoregional treatment of gliomas with the 188Re-labelled MAb NIMOTUZUMAB. A software package called TRIDOSE has been developed to perform image managing, volume registration, dose calculations and qualitative and quantitative analysis of the results, including dose-volume histograms and isodose curves. The dosimetric factors at voxel level for 188Re ('S' values) were estimated using two different methods: Monte Carlo simulation of energy transport and deposition, and integration of the dose kernel functions. A quality control module was also implemented in order to test the software using well-known 3D distributions of activities or counts. The TRIDOSE outputs were compared with other commercial software, showing relative differences lower than 1.10% for different sphere sizes. The established dosimetric procedures constitute a useful tool to compute the absorbed doses received by patients undergoing radioimmunotherapy of brain tumors with 188Re-NIMOTUZUMAB. (author)
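
    The voxel-level dose calculation described can be sketched as a 3D convolution of the cumulated-activity map with a voxel S-value kernel; the kernel below is a placeholder shape, not real 188Re S values:

    ```python
    # Absorbed dose as convolution of cumulated activity with an S-value
    # kernel (placeholder values, illustrative geometry).
    import numpy as np
    from scipy.signal import fftconvolve

    activity = np.zeros((64, 64, 64))       # cumulated activity, MBq*s per voxel
    activity[28:36, 28:36, 28:36] = 5.0     # uptake region

    zz, yy, xx = np.indices((9, 9, 9)) - 4  # kernel coordinates about the center
    r2 = xx**2 + yy**2 + zz**2
    kernel = 1e-3 * np.exp(-r2 / 4.0)       # placeholder S values, mGy per MBq*s

    dose = fftconvolve(activity, kernel, mode="same")   # absorbed dose, mGy
    print(f"max voxel dose {dose.max():.2f} mGy")
    ```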

  2. TENSOLVE: A software package for solving systems of nonlinear equations and nonlinear least squares problems using tensor methods

    Bouaricha, A. [Argonne National Lab., IL (United States). Mathematics and Computer Science Div.; Schnabel, R.B. [Colorado Univ., Boulder, CO (United States). Dept. of Computer Science

    1996-12-31

    This paper describes a modular software package for solving systems of nonlinear equations and nonlinear least squares problems using a new class of methods called tensor methods. It is intended for small to medium-sized problems, say with up to 100 equations and unknowns, in cases where it is reasonable to calculate the Jacobian matrix or approximate it by finite differences at each iteration. The software allows the user to select between a tensor method and a standard method based upon a linear model. The tensor method models F(x) by a quadratic model, where the second-order term is chosen so that the model is hardly more expensive to form, store, or solve than the standard linear model. Moreover, the software provides two different global strategies, a line search and a two-dimensional trust region approach. Test results indicate that, in general, tensor methods are significantly more efficient and robust than standard methods on small and medium-sized problems, in terms of iterations and function evaluations.
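
    For contrast, the "standard method based upon a linear model" is ordinary Newton iteration with a finite-difference Jacobian; a minimal sketch on an illustrative 2x2 system (the tensor method's cheap second-order term is not reproduced here):

    ```python
    # Newton's method with a finite-difference Jacobian: the linear-model
    # baseline that tensor methods extend. Test system is illustrative.
    import numpy as np

    def F(x):
        return np.array([x[0]**2 + x[1]**2 - 4.0, x[0] * x[1] - 1.0])

    def jacobian_fd(F, x, h=1e-7):
        fx, n = F(x), len(x)
        J = np.empty((n, n))
        for j in range(n):
            xp = x.copy()
            xp[j] += h
            J[:, j] = (F(xp) - fx) / h   # forward difference, column j
        return J

    x = np.array([2.0, 0.0])
    for _ in range(25):
        step = np.linalg.solve(jacobian_fd(F, x), -F(x))
        x += step
        if np.linalg.norm(step) < 1e-10:
            break
    print(x, F(x))
    ```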

  3. DRUGDOG 3:0: U.S. Navy Random Urinalysis software package

    Wilson, Dale E.

    1994-01-01

    Approved for public release; distribution is unlimited. Although the United States Navy has had a mandatory Random Urinalysis Program in effect for many years, there has never been a formal, standardized methodology to implement the process. OPNAV INSTRUCTION 5350.4 (series) provides guidance on what must be accomplished, but not how to accomplish it. Automation and standardization of the program through software implementation can lend confidence to personnel who undergo urinalysis testing ...

  4. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
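
    Agreement of this kind is commonly summarized with Bland-Altman statistics, as in the study: the bias is the mean paired difference and the 95% limits of agreement are bias ± 1.96 SD. A sketch with synthetic paired measurements:

    ```python
    # Bland-Altman bias and limits of agreement for two software packages
    # measuring the same perfusion parameter; values are synthetic.
    import numpy as np

    a = np.array([31.2, 28.4, 35.0, 40.1, 25.3, 30.8, 33.5])  # software A
    b = np.array([29.0, 30.1, 33.2, 43.0, 24.0, 32.5, 31.9])  # software B

    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    print(f"bias {bias:.2f}, limits of agreement "
          f"[{bias - 1.96*sd:.2f}, {bias + 1.96*sd:.2f}]")
    ```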

  5. The IPNS rietveld analysis software package for TOF [time-of-flight] powder diffraction data: Recent developments

    Rotella, F.J.; Richardson, J.W. Jr.

    1987-01-01

    A system of FORTRAN programs for the analysis of time-of-flight (TOF) neutron powder diffraction data via the Rietveld method at IPNS has been modified recently, making it possible to analyze data that exhibit diffraction maxima broadened due to anisotropic strain and that can be modeled by individual atomic anharmonic thermal vibrations. The observation of noncrystalline scattering in data from some powder samples has led to the development of software to fit such scattering by a function related to a radial distribution function through Fourier-filtering techniques. The "user friendliness" of the IPNS Rietveld package has been enhanced by the development of "RIETVELD", a menu-based VAX/VMS command language routine for interactive file manipulation and program execution

  6. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Chao-Chun Chen

    2013-12-01

    Hadoop MapReduce is the programming model for designing auto-scalable distributed computing applications. It provides developers an effective environment to attain automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimum modification of existing systems, we design a framework in this thesis, called MC-Framework: Multi-user-based Cloudizing-Application Framework. It provides a simple interface for users to fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this thesis focuses on multiuser workloads, but the default Hadoop scheduling scheme, i.e., FIFO, would increase delay under multiuser scenarios. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share the jobs to machines in the MapReduce-based private cloud. Then, we prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyse the proposed MC-Framework. The results of our experiments indicate that the proposed framework enormously improved the time performance compared with the original package.
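
    The general pattern of driving an unmodified standalone package from Hadoop is easiest to see with Hadoop Streaming: a mapper reads one input record per line from stdin, shells out to the legacy tool, and emits tab-separated key/value pairs. The binary name and record format below are hypothetical:

    ```python
    # Hadoop Streaming mapper that wraps a standalone executable
    # (hypothetical binary and record format).
    import subprocess
    import sys

    def main():
        for line in sys.stdin:
            record = line.strip()
            if not record:
                continue
            # Run the unmodified standalone tool on this record.
            out = subprocess.run(
                ["./legacy_vm_tool", record],        # hypothetical binary
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            print(f"{record}\t{out}")                # key \t value

    if __name__ == "__main__":
        main()
    ```

    A script like this would be submitted with the hadoop-streaming jar (passing it as the -mapper), leaving the legacy package itself untouched.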

  7. “DETECTION ARTIFACTS” SOFTWARE PACKAGE: FUNCTIONAL CAPABILITIES AND PROSPECTS OF USE (ON THE EXAMPLE OF GEOARCHEOLOGICAL RESEARCH)

    Ye. P. Krupochkin

    2017-01-01

    Mathematical and scientific methods are highly significant in modern geoarcheological study. They contribute to the development of new computer technologies and their implementation in geoarcheological research, in particular the decoding and photogrammetric processing of space images. The article focuses on the “Detection Artifacts” software package designed for thematic aerospace image decoding, aimed at automating the search for various archeological sites, both natural and artificially created. The main attention is drawn to the decoding of archeological sites using methods of morphological analysis and indicative decoding. Its work is based on two groups of image-processing methods: (1) image enhancement, carried out with the help of spatial frequency filtration, and (2) morphometric analysis. The methods of spatial frequency filtration can be used to solve two problems: information noise minimization and edge enhancement. To achieve the best results using these methods it is necessary to have all the information relevant to the objects being searched for. Searching for various archeological sites is not only a photogrammetric task; in fact, the problem can be solved within photogrammetry with the application of aerospace and computer methods, a point the authors stress in order to avoid terminological ambiguity and confusion when describing the essence of the methods and processes. It should be noted that work with the images must be executed in a strict sequence. First and foremost, photogrammetric processing (atmospheric correction, geometric adjustment, conversion and geo-targeting) should be implemented, and only after that can one proceed to decoding the information. When creating the software package, a modular structure was applied, which favorably affected the tasks being solved and corresponded to the conception of search for archaeological objects
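
    The first group of methods, spatial frequency filtration, can be sketched as an FFT band-pass that suppresses low-frequency background and very high-frequency noise, enhancing the edges of potential features; the cut-off radii below are illustrative:

    ```python
    # FFT band-pass filtering for edge enhancement (illustrative radii).
    import numpy as np

    img = np.random.rand(256, 256)          # stand-in for an aerial image
    f = np.fft.fftshift(np.fft.fft2(img))

    yy, xx = np.indices(img.shape)
    cy, cx = np.array(img.shape) // 2
    r = np.hypot(yy - cy, xx - cx)
    band = (r > 5) & (r < 60)               # keep mid frequencies only

    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(f * band)))
    print(filtered.min(), filtered.max())
    ```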

  8. MedXViewer: an extensible web-enabled software package for medical imaging

    Looney, P. T.; Young, K. C.; Mackenzie, Alistair; Halling-Brown, Mark D.

    2014-03-01

    MedXViewer (Medical eXtensible Viewer) is an application designed to allow workstation-independent, PACS-less viewing of, and interaction with, anonymised medical images (e.g. observer studies). The application was initially implemented for use in digital mammography and tomosynthesis, but the flexible software design allows it to be easily extended to other imaging modalities. Regions of interest can be identified by a user, and any associated information about a mark, an image or a study can be added. The questions and settings can be easily configured depending on the needs of the research, allowing both ROC and FROC studies to be performed. The extensible nature of the design allows other functionality and hanging protocols to be made available for each study. Panning, windowing, zooming and moving through slices are all available, while modality-specific features can be easily enabled, e.g. quadrant zooming in mammographic studies. MedXViewer can integrate with a web-based image database, allowing results and images to be stored centrally. The software and images can be downloaded remotely from this centralised data-store. Alternatively, the software can run without a network connection, in which case the images and results can be encrypted and stored locally on a machine or external drive. Due to the advanced workstation-style functionality, the simple deployment on heterogeneous systems over the internet without a requirement for administrative access, and the ability to utilise a centralised database, MedXViewer has been used for running remote paper-less observer studies and is capable of providing a training infrastructure and co-ordinating remote collaborative viewing sessions (e.g. cancer reviews, interesting cases).

  9. PDBStat: a universal restraint converter and restraint analysis software package for protein NMR

    Tejero, Roberto; Snyder, David; Mao, Binchen; Aramini, James M.; Montelione, Gaetano T.

    2013-01-01

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data.

  10. PDBStat: a universal restraint converter and restraint analysis software package for protein NMR

    Tejero, Roberto [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States); Snyder, David [William Paterson University, Department of Chemistry (United States); Mao, Binchen; Aramini, James M.; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States)

    2013-08-15

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data.

  11. Informed-Proteomics: open-source software package for top-down proteomics

    Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher; Zhou, Mowei; Mendoza, Joshua; Fujimoto, Grant M.; Gibbons, Bryson C.; Shaw, Jared B.; Shen, Yufeng; Shukla, Anil K.; Moore, Ronald J.; Liu, Tao; Petyuk, Vladislav A.; Tolić, Nikola; Paša-Tolić, Ljiljana; Smith, Richard D.; Payne, Samuel H.; Kim, Sangtae

    2017-08-07

    Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open-source software suite for top-down proteomics analysis consisting of an LC-MS feature-finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.

  12. SpcAudace: Spectroscopic processing and analysis package of Audela software

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines perform all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: PDF and PNG plots or annotated time-series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions: from line-profile characteristics to equivalent width and periodograms. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open-source software.

  13. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    Philipp Thomas

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network

  14. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen’s system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA’s performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with

  15. Constraint Network Analysis (CNA): a Python software package for efficiently linking biomacromolecular structure, flexibility, (thermo-)stability, and function.

    Pfleger, Christopher; Rathi, Prakash Chandra; Klein, Doris L; Radestock, Sebastian; Gohlke, Holger

    2013-04-22

    For deriving maximal advantage from information on biomacromolecular flexibility and rigidity, results from rigidity analyses must be linked to biologically relevant characteristics of a structure. Here, we describe the Python-based software package Constraint Network Analysis (CNA) developed for this task. CNA functions as a front- and backend to the graph-based rigidity analysis software FIRST. CNA goes beyond the mere identification of flexible and rigid regions in a biomacromolecule in that it (I) provides a refined modeling of thermal unfolding simulations that also considers the temperature-dependence of hydrophobic tethers, (II) allows performing rigidity analyses on ensembles of network topologies, either generated from structural ensembles or by using the concept of fuzzy noncovalent constraints, and (III) computes a set of global and local indices for quantifying biomacromolecular stability. This leads to more robust results from rigidity analyses and extends the application domain of rigidity analyses in that phase transition points ("melting points") and unfolding nuclei ("structural weak spots") are determined automatically. Furthermore, CNA robustly handles small-molecule ligands in general. Such advancements are important for applying rigidity analysis to data-driven protein engineering and for estimating the influence of ligand molecules on biomacromolecular stability. CNA maintains the efficiency of FIRST such that the analysis of a single protein structure takes a few seconds for systems of several hundred residues on a single core. These features make CNA an interesting tool for linking biomacromolecular structure, flexibility, (thermo-)stability, and function. CNA is available from http://cpclab.uni-duesseldorf.de/software for nonprofit organizations.

  16. Verification and validation of the SAPHIRE Version 4.0 PRA software package

    Bolander, T.W.; Calley, M.B.; Capps, E.L.

    1994-02-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE). SAPHIRE is a set of four computer programs that the Nuclear Regulatory Commission (NRC) developed to perform probabilistic risk assessments (PRAs). These programs allow an analyst to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs included in this set are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models and Results Database (MAR-D), and the Fault Tree/Event Tree/Piping and Instrumentation Diagram (FEP) graphical editor. The V&V steps included: a V&V plan to describe the process and criteria by which the V&V would be performed; a software requirements documentation review to determine the correctness, completeness, and traceability of the requirements; a user survey to determine the usefulness of the user documentation; identification and testing of vital and non-vital features; and documentation of the test results

  17. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

    The purpose of this project is to develop a low-cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volume and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade

  18. Suitability of semi-automated tumor response assessment of liver metastases using a dedicated software package

    Kalkmann, Janine; Ladd, S.C.; Greiff, A. de; Forsting, M.; Stattaus, J.

    2010-01-01

    Purpose: To evaluate the suitability of semi-automated compared to manual tumor response assessment (TRA) of liver metastases. Materials and methods: In total, 32 patients with colorectal cancer and liver metastases were followed with an average of 2.8 contrast-enhanced CT scans. Two observers (O1, O2) measured the longest diameter (LD) of 269 liver metastases manually and semi-automatically using software installed as a thin client on a PACS workstation (LMS-Liver, MEDIAN Technologies). LD and TRA ("progressive", "stable", "partial remission") were performed according to RECIST (Response Evaluation Criteria in Solid Tumors) and analyzed for between-method, interobserver and intraobserver variability. The time needed for evaluation was compared for both methods. Results: All measurements correlated excellently (r ≥ 0.96). Intraobserver (semi-automated), interobserver (manual) and between-method differences (by O1) in LD of 1.4 ± 2.6 mm, 1.9 ± 1.9 mm and 2.1 ± 2.0 mm, respectively, were not significant. Interobserver (semi-automated) and between-method (by O2) differences in LD of 3.0 ± 3.0 mm and 2.6 ± 2.0 mm, respectively, reflected significant variability (p < 0.01). The interobserver agreement in manual and semi-automated TRA was 91.4%. The intraobserver agreement in semi-automated TRA was 84.5%. Between both methods a TRA agreement of 86.2% was obtained. Semi-automated evaluation (2.7 min) took slightly more time than manual evaluation (2.3 min). Conclusion: Semi-automated and manual evaluation of liver metastases yield comparable results in response assessment and require comparable effort. (orig.)
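
    The RECIST categorization used here reduces to thresholds on the relative change in the longest diameter; a minimal sketch with the standard RECIST cut-offs (a decrease of at least 30% for partial remission, an increase of at least 20% for progression; the additional absolute-increase rule of RECIST 1.1 is omitted):

    ```python
    # Classify tumor response from baseline and follow-up longest diameters
    # using RECIST-style percent-change thresholds (simplified).
    def recist_response(baseline_ld_mm: float, followup_ld_mm: float) -> str:
        change = (followup_ld_mm - baseline_ld_mm) / baseline_ld_mm * 100.0
        if change <= -30.0:
            return "partial remission"
        if change >= 20.0:
            return "progressive"
        return "stable"

    print(recist_response(24.0, 15.0))  # -37.5% -> partial remission
    print(recist_response(24.0, 30.0))  # +25%   -> progressive
    ```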

  19. High performance homes

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    Can prefabrication contribute to the development of high performance homes? To answer this question, this chapter defines high performance in more broadly inclusive terms, acknowledging the technical, architectural, social and economic conditions under which energy consumption and production occur. Consideration of all these factors is a precondition for a truly integrated practice and, as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  20. Software

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  1. Operation manual for EDXRDDA - a software package for Bragg peak analysis of energy dispersive powder X-ray diffraction data

    Jayaswal, Balhans; Vijaykumar, V.; Momin, S.N.; Sikka, S.K.

    1992-01-01

    EDXRDDA is a software package for the analysis of raw data from energy dispersive X-ray diffraction of powder samples. It resolves the spectra into individual peaks by a constrained non-linear least squares method (Hughes and Sexton, 1988). The profile function adopted is the Gaussian/Lorentzian product, with the mixing ratio refinable in the program. The program is implemented on an IBM PC and is highly interactive, with extensive plotting facilities. This report is a user's guide for running the program. In the first step, after inputting the spectra, the full spectrum is plotted on the screen. The user then chooses a portion of it for peak resolution. Initial guesses for the peak intensity and peak position are input with the help of a cursor or a mouse. Up to twenty peaks can be fitted at a time in an interval of 500 channels. For overlapping peaks, various constraints can be applied. Bragg peaks and fluorescence peaks with different half widths can be handled simultaneously. On execution, the program produces a look-up table which contains the refined values of the peak position, half width, peak intensity, integrated intensity, and their error estimates for each peak. The program is very general and can also be used for curve fitting of data from many other experiments. (author). 2 refs., 7 figs., 2 appendices
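
    The refined profile, a Gaussian/Lorentzian product, can be fitted to a channel window by non-linear least squares; a sketch with SciPy on synthetic counts (the program's exact parameterization and constraint handling are not reproduced):

    ```python
    # Fit a Gaussian-Lorentzian product peak to one channel window.
    import numpy as np
    from scipy.optimize import curve_fit

    def gl_product(ch, amp, c, wg, wl):
        gauss = np.exp(-0.5 * ((ch - c) / wg) ** 2)
        lorentz = 1.0 / (1.0 + ((ch - c) / wl) ** 2)
        return amp * gauss * lorentz

    ch = np.arange(200, 260, dtype=float)
    true = gl_product(ch, 900.0, 230.0, 6.0, 9.0)
    counts = np.random.default_rng(1).poisson(true + 20.0).astype(float)

    popt, pcov = curve_fit(gl_product, ch, counts - 20.0,
                           p0=[800.0, 228.0, 5.0, 8.0])
    err = np.sqrt(np.diag(pcov))
    print(f"position {popt[1]:.2f} ± {err[1]:.2f} channels")
    ```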

  2. SISYPHUS: A high performance seismic inversion factory

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In recent years, massively parallel high-performance computers have become the standard instruments for solving forward and inverse problems in seismology. The respective software packages dedicated to forward and inverse waveform modelling, specially designed for such computers (SPECFEM3D, SES3D), have become mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset the performance benefits provided by even the most powerful modern supercomputers. Furthermore, the typical system architecture of modern supercomputing platforms is oriented towards maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for modern massively parallel high-performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database represents a hierarchical structure with

  3. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome area.

  4. High Performance Marine Vessels

    Yun, Liang

    2012-01-01

    High Performance Marine Vessels (HPMVs) range from fast ferries to the latest high-speed Navy craft, including competition power boats and hydroplanes, hydrofoils, hovercraft, catamarans and other multi-hull craft. High Performance Marine Vessels covers the main concepts of HPMVs and discusses historical background, design features, services that have been successful and not so successful, and some sample data of the range of HPMVs to date. Included is a comparison of all HPMV craft and the differences between them, and descriptions of performance (hydrodynamics and aerodynamics). Readers will find a comprehensive overview of the design, development and building of HPMVs. In summary, this book: focuses on technology at the aero-marine interface; covers the full range of high performance marine vessel concepts; explains the historical development of various HPMVs; and discusses ferries, racing and pleasure craft, as well as utility and military missions. High Performance Marine Vessels is an ideal book for student...

  5. High Performance Macromolecular Material

    Forest, M

    2002-01-01

    .... In essence, most commercial high-performance polymers are processed through fiber spinning, following Nature and spider silk, which is still pound-for-pound the toughest liquid crystalline polymer...

  6. Automatic Energy Schemes for High Performance Applications

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers significantly affect their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), power consumption may be controlled in software. Additionally, the network interconnect, such as Infiniband, may be exploited to maximize energy savings, while the application performance loss and frequency-switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy-saving strategies on a per-call basis. Next, it targets point-to-point communications, grouping them into phases and applying frequency scaling to them to save energy by exploiting architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling in addition to DVFS to maximize energy savings. Experimental results are presented for NAS parallel benchmark problems as well as for realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.

  7. Accuracy evaluation of fusion of CT, MR, and SPECT images using commercially available software packages (SRS PLATO and IFS)

    Mongioj, Valeria; Brusa, Anna; Loi, Gianfranco; Pignoli, Emanuele; Gramaglia, Alberto; Scorsetti, Marta; Bombardieri, Emilio; Marchesini, Renato

    1999-01-01

    Purpose: A problem for clinicians is to mentally integrate information from multiple diagnostic sources, such as computed tomography (CT), magnetic resonance (MR), and single photon emission computed tomography (SPECT), whose images give anatomic and metabolic information. Methods and Materials: To combine the information from these different imaging procedures and to overlay corresponding slices, we used commercially available software packages (SRS PLATO and IFS). The algorithms utilize a fiducial-based coordinate system (or frame) with 3 N-shaped markers, which allows coordinate transformation of a clinical examination data set (9 spots for each transaxial section) to a stereotactic coordinate system. The N-shaped markers were filled with fluids visible in each modality (gadolinium for MR, calcium chloride for CT, and 99mTc for SPECT). The frame is relocatable in the different acquisition modalities by means of a head holder to which a face mask is fixed so as to immobilize the patient. Position errors due to the algorithms were obtained by evaluating the stereotactic coordinates of five sources detectable in each modality. Results: SPECT and MR position errors due to the algorithms were evaluated with respect to CT: Δx was ≤ 0.9 mm for MR and ≤ 1.4 mm for SPECT; Δy was ≤ 1 mm and ≤ 3 mm for MR and SPECT, respectively. Maximal differences in distance between estimated and actual fiducial centers (geometric mismatch) were on the order of the pixel size (0.8 mm for CT, 1.4 mm for MR, and 1.8 mm for SPECT). In an attempt to distinguish necrosis from residual disease, the image fusion protocol was studied in 35 primary or metastatic brain tumor patients. Conclusions: The image fusion technique has a good degree of accuracy as well as the potential to improve the specificity of tissue identification and the precision of the subsequent treatment planning
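
    The fiducial-based coordinate transformation at the heart of such fusion can be estimated from matched marker centers with the Kabsch (SVD) method; a sketch with synthetic coordinates, reporting the residual fiducial registration error:

    ```python
    # Rigid registration (rotation + translation) between two modalities
    # from matched fiducial centers; coordinates are synthetic.
    import numpy as np

    ct = np.array([[0, 0, 0], [50, 0, 0], [0, 60, 0],
                   [0, 0, 40], [30, 30, 20]], dtype=float)  # CT frame (mm)
    rng = np.random.default_rng(2)
    theta = np.deg2rad(7.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
    mr = ct @ R_true.T + np.array([5.0, -3.0, 2.0]) \
         + rng.normal(0, 0.3, ct.shape)                     # MR frame (mm)

    # Kabsch: center both point sets, SVD of the covariance, recover R, t.
    pc, qc = ct - ct.mean(0), mr - mr.mean(0)
    U, _, Vt = np.linalg.svd(pc.T @ qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = mr.mean(0) - ct.mean(0) @ R.T

    residual = np.linalg.norm(ct @ R.T + t - mr, axis=1)
    print(f"mean fiducial registration error {residual.mean():.2f} mm")
    ```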

  8. An evaluation of the psychometric properties of the Purdue Pharmacist Directive Guidance Scale using SPSS and R software packages.

    Marr-Lyon, Lisa R; Gupchup, Gireesh V; Anderson, Joe R

    2012-01-01

    The Purdue Pharmacist Directive Guidance (PPDG) Scale was developed to assess patients' perceptions of the level of pharmacist-provided (1) instruction and (2) feedback and goal-setting, two aspects of pharmaceutical care. Calculations of its psychometric properties in SPSS and R were similar, but distinct differences were apparent. Using the SPSS and R software packages, researchers aimed to examine the construct validity of the PPDG using a higher-order factoring procedure; in tandem, McDonald's omega and Cronbach's alpha were calculated as reliability analyses. Ninety-nine patients with either type I or type II diabetes, aged 18 years or older, able to read and write English, and able to provide written informed consent participated in the study. Data were collected in 8 community pharmacies in New Mexico. Using R, (1) a principal axis factor analysis with promax (oblique) rotation was conducted, (2) a Schmid-Leiman transformation was obtained, and (3) McDonald's omega and Cronbach's alpha were computed. Using SPSS, subscale findings were validated by conducting a principal axis factor analysis with promax rotation; strictly parallel and Cronbach's alpha reliabilities were calculated. McDonald's omega and Cronbach's alpha were robust, with coefficients greater than 0.90; principal axis factor analysis with promax rotation revealed construct similarities, with an overall general factor emerging from R. Subjecting the PPDG to further rigorous psychometric testing revealed stronger quantitative support for the overall general factor of directive guidance and the subscales of instruction and feedback and goal-setting. Copyright © 2012 Elsevier Inc. All rights reserved.
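
    Cronbach's alpha, one of the reliability coefficients computed in both packages, follows directly from its definition; a sketch with synthetic Likert-type item scores (rows are respondents, columns are items):

    ```python
    # Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(total)).
    import numpy as np

    items = np.array([[4, 5, 4, 3],
                      [3, 3, 4, 3],
                      [5, 5, 5, 4],
                      [2, 3, 2, 2],
                      [4, 4, 5, 4],
                      [3, 4, 3, 3]], dtype=float)

    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1.0 - item_var / total_var)
    print(f"Cronbach's alpha = {alpha:.3f}")
    ```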

  9. The Modularized Software Package ASKI - Full Waveform Inversion Based on Waveform Sensitivity Kernels Utilizing External Seismic Wave Propagation Codes

    Schumacher, F.; Friederich, W.

    2015-12-01

    We present the modularized software package ASKI, a flexible and extendable toolbox for seismic full waveform inversion (FWI) as well as sensitivity or resolution analysis operating on the sensitivity matrix. It utilizes established wave propagation codes for solving the forward problem and offers an alternative to the monolithic, inflexible and hard-to-modify codes that have typically been written for solving inverse problems. It is available under the GPL at www.rub.de/aski. The Gauss-Newton FWI method for 3D-heterogeneous elastic earth models is based on waveform sensitivity kernels and can be applied to inverse problems at various spatial scales in both Cartesian and spherical geometries. The kernels are derived in the frequency domain from Born scattering theory as the Fréchet derivatives of linearized full waveform data functionals, quantifying the influence of elastic earth model parameters on the particular waveform data values. As an important innovation, we keep two independent spatial descriptions of the earth model: one for solving the forward problem and one representing the inverted model updates. Thereby we account for the independent needs of spatial model resolution of the forward and inverse problems, respectively. Due to pre-integration of the kernels over the (in general much coarser) inversion grid, storage requirements for the sensitivity kernels are dramatically reduced. ASKI can be flexibly extended to other forward codes by providing it with specific interface routines that contain knowledge about forward-code-specific file formats and auxiliary information provided by the new forward code. In order to sustain flexibility, the ASKI tools must communicate via file output/input, thus large storage capacities need to be accessible in a convenient way. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full
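
    The pre-integration step mentioned above can be illustrated with a toy example: kernel values living on a fine forward-modeling grid are volume-weighted and summed into the coarser inversion-grid cells containing them. The numpy sketch below is a schematic stand-in, not ASKI's implementation; the grids, volumes and kernel values are invented.

        import numpy as np

        n_fine, n_coarse = 1000, 50
        rng = np.random.default_rng(1)
        kernel = rng.normal(size=n_fine)        # kernel density per fine cell
        volume = np.full(n_fine, 1.0e6)         # fine-cell volumes (m^3)

        # Map each fine cell to the coarse inversion cell that contains it
        coarse_index = np.linspace(0, n_coarse, n_fine, endpoint=False).astype(int)

        # Pre-integrate: volume-weighted sum of kernel values per inversion cell
        pre_integrated = np.zeros(n_coarse)
        np.add.at(pre_integrated, coarse_index, kernel * volume)
        print(pre_integrated.shape)             # one entry per inversion cell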

  10. CT and MR perfusion can discriminate severe cerebral hypoperfusion from perfusion absence: evaluation of different commercial software packages by using digital phantoms

    Uwano, Ikuko; Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Advanced Medical Research Center, Morioka (Japan); Christensen, Soren [University of Melbourne, Royal Melbourne Hospital, Departments of Neurology and Radiology, Victoria (Australia); Oestergaard, Leif [Aarhus University Hospital, Department of Neuroradiology, Center for Functionally Integrative Neuroscience, DK, Aarhus C (Denmark); Ogasawara, Kuniaki; Ogawa, Akira [Iwate Medical University, Department of Neurosurgery, Morioka (Japan)

    2012-05-15

    Computed tomography perfusion (CTP) and magnetic resonance perfusion (MRP) are expected to be usable for ancillary tests of brain death by detecting the complete absence of cerebral perfusion; however, the detection limit of hypoperfusion has not been determined. Hence, we examined whether commercial software can visualize very low cerebral blood flow (CBF) and cerebral blood volume (CBV) by creating and using digital phantoms. Digital phantoms simulating 0-4% of normal CBF (60 mL/100 g/min) and CBV (4 mL/100 g) were analyzed by ten software packages of CT and MRI manufacturers. Region-of-interest measurements were performed to determine whether there was a significant difference between areas of 0% and areas of 1-4% of normal flow. The CTP software detected hypoperfusion down to 2-3% in CBF and 2% in CBV, while the MRP software detected hypoperfusion down to 1-3% in CBF and 1-4% in CBV, although the lower limits varied among software packages. CTP and MRP can detect the difference between profound hypoperfusion of <5% and that of 0% in digital phantoms, suggesting their potential efficacy for assessing brain death. (orig.)

  11. A comparison of six software packages for evaluation of solid lung nodules using semi-automated volumetry: What is the minimum increase in size to detect growth in repeated CT examinations

    Hoop, Bartjan de; Gietema, Hester; Prokop, Mathias; Ginneken, Bram van; Zanen, Pieter; Groenewegen, Gerard

    2009-01-01

    We compared the interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and back on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In case of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five of the six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to those of each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages. (orig.)
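
    The variability measure used here, the upper limit of the 95% agreement interval from a Bland-Altman analysis of paired volume measurements, can be computed as in the generic Python sketch below; the simulated volumes are invented and stand in for the paired zero-growth scans.

        import numpy as np

        rng = np.random.default_rng(2)
        v1 = rng.uniform(50, 2000, size=214)         # volumes at scan 1 (mm^3)
        v2 = v1 * rng.normal(1.0, 0.08, size=214)    # repeat scan, ~8% noise

        # Percentage difference relative to the mean of each pair
        pair_mean = (v1 + v2) / 2
        pct_diff = 100 * (v2 - v1) / pair_mean

        bias = pct_diff.mean()
        sd = pct_diff.std(ddof=1)
        upper_limit = bias + 1.96 * sd               # upper 95% limit of agreement
        print(f"bias = {bias:.1f}%, upper limit = {upper_limit:.1f}%")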

  12. High performance conductometry

    Saha, B.

    2000-01-01

    Inexpensive but high performance systems have emerged progressively for basic and applied measurements in physical and analytical chemistry on the one hand, and for on-line monitoring and leak detection in plants and facilities on the other. Salient features of the developments will be presented with specific examples.

  13. High performance systems

    Vigil, M.B. [comp.]

    1995-03-01

    This document provides a written compilation of the presentations and viewgraphs from the 1994 Conference on High Speed Computing, "High Performance Systems," held at Gleneden Beach, Oregon, on April 18 through 21, 1994.

  14. Danish High Performance Concretes

    Nielsen, M. P.; Christoffersen, J.; Frederiksen, J.

    1994-01-01

    In this paper the main results obtained in the research program High Performance Concretes in the 90's are presented. This program was financed by the Danish government and was carried out in cooperation between The Technical University of Denmark, several private companies, and Aalborg University ... concretes, workability, ductility, and confinement problems ...

  15. High performance homes

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    ... Consideration of all these factors is a precondition for a truly integrated practice and, as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  16. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    Ilander, T; Kansanaho, A; Toivonen, H

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes it easy to obtain independent positioning data. The present task of the Finnish Support Programme was launched to create software to merge information about sampling and positioning. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit utilizing maps that can be purchased or produced by the user. In addition, the system can easily be extended to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.).

  18. High-Performance Networking

    CERN. Geneva

    2003-01-01

    The series will start with a historical introduction about what people saw as high performance message communication in their time and how that developed into what is known today as standard computer network communication. It will be followed by a far more technical part that uses the High Performance Computer Network standards of the 1990s, with 1 Gbit/s systems, as an introduction to an in-depth explanation of the three new 10 Gbit/s network and interconnect technology standards that already exist or are emerging. Where necessary for a good understanding, some sidesteps will be included to explain important protocols, as well as some necessary details of the Wide Area Network (WAN) standards concerned, including some basics of wavelength multiplexing (DWDM). Some remarks will be made concerning the rapidly expanding applications of networked storage.

  19. High Performance Concrete

    Traian Oneţ

    2009-01-01

    The paper presents the latest studies and research carried out in Cluj-Napoca related to high performance concrete, high strength concrete and self-compacting concrete. The purpose of this paper is to highlight the advantages and drawbacks of using a particular concrete type. Two concrete recipes are presented, namely one for the concrete used in rigid road pavements and another for self-compacting concrete.

  20. High performance polymeric foams

    Gargiulo, M.; Sorrentino, L.; Iannace, S.

    2008-01-01

    The aim of this work was to investigate the foamability of high-performance polymers (polyethersulfone, polyphenylsulfone, polyetherimide and polyethylene naphthalate). Two different methods were used to prepare the foam samples: high-temperature expansion and a two-stage batch process. The effects of processing parameters (saturation time and pressure, foaming temperature) on the densities and microcellular structures of these foams were analyzed using scanning electron microscopy.

  1. PC-CIMACT. A near real time materials accountancy software package for use on an IBM or compatible PC

    Williams, D.E.; Gale, R.

    1990-03-01

    This report describes the 'PC-CIMACT' Near Real Time Materials Accountancy computer package. It has been derived from 'CIMACT', which is in daily use at the UKAEA's Dounreay Nuclear Power Establishment. The scope of the package is presented, together with the statistical analyses it encompasses. Several of the analyses are illustrated by the treatment of data from a simulated reprocessing campaign. A user guide providing detailed instructions is also included. (author)

  2. The Next Generation in Subsidence and Aquifer-System Compaction Modeling within the MODFLOW Software Family: A New Package for MODFLOW-2005 and MODFLOW-OWHM

    Boyce, S. E.; Leake, S. A.; Hanson, R. T.; Galloway, D. L.

    2015-12-01

    The Subsidence and Aquifer-System Compaction Packages, SUB and SUB-WT, for MODFLOW are two currently supported subsidence packages within the MODFLOW family of software. The SUB package allows the calculation of instantaneous and delayed releases of water from distributed interbeds (relatively more compressible fine-grained sediments) within a saturated aquifer system or discrete confining beds. The SUB-WT package does not include delayed releases, but does perform a more rigorous calculation of vertical stresses that can vary the effective stress that causes compaction. This calculation of instantaneous compaction can include the effect of water-table fluctuations for unconfined aquifers on effective stress, and can optionally adjust the elastic and inelastic storage properties based on the changes in effective stress. The next generation of subsidence modeling in MODFLOW is under development, and will merge and enhance the capabilities of the SUB and SUB-WT Packages for MODFLOW-2005 and MODFLOW-OWHM. This new version will also provide some additional features such as stress dependent vertical hydraulic conductivity of interbeds, time-varying geostatic loads, and additional attributes related to aquifer-system compaction and subsidence that will broaden the class of problems that can be simulated. The new version will include a redesigned source code, a new user friendly input file structure, more output options, and new subsidence solution options. This presentation will discuss progress in developing the new package and the new features being implemented and their potential applications.

  3. Cost optimization of load carrying thin-walled precast high performance concrete sandwich panels

    Hodicky, Kamil; Hansen, Sanne; Hulin, Thomas

    2015-01-01

    The paper describes a procedure to find a structurally and thermally efficient design of load-carrying thin-walled precast High Performance Concrete Sandwich Panels (HPCSP) with an optimal economical solution. A systematic optimization approach is based on the selection of material performances and HPCSP geometrical parameters, as well as on the material cost function in the HPCSP design. Cost functions are presented for High Performance Concrete (HPC), the insulation layer and reinforcement, and include labour-related costs. The present study reports the economic data corresponding to specific manufacturing ... The solution of the optimization problem is performed in the software package Matlab® with the SQPlab package and integrates the processes of HPCSP design, quantity take-off and cost estimation. The proposed optimization process results in complex HPCSP design proposals that achieve minimum cost of HPCSP.
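
    The kind of constrained cost minimization described (sequential quadratic programming over panel geometry against material cost functions) can be sketched with scipy's SLSQP solver; all coefficients, the simple 1D heat-transfer constraint, and the bounds below are invented for illustration and do not come from the paper.

        import numpy as np
        from scipy.optimize import minimize

        # Design variables: x = [HPC thickness (m), insulation thickness (m)]
        COST_HPC, COST_INS = 400.0, 60.0   # hypothetical costs per m^3
        K_HPC, K_INS = 2.0, 0.04           # hypothetical conductivities (W/m K)
        U_MAX = 0.15                       # required maximum U-value (W/m^2 K)

        def cost(x):
            return COST_HPC * x[0] + COST_INS * x[1]     # cost per m^2 of panel

        def u_value(x):
            return 1.0 / (x[0] / K_HPC + x[1] / K_INS)   # series thermal resistance

        res = minimize(cost, x0=[0.10, 0.20], method="SLSQP",
                       bounds=[(0.03, 0.20), (0.05, 0.40)],
                       constraints=[{"type": "ineq",
                                     "fun": lambda x: U_MAX - u_value(x)}])
        print(res.x, cost(res.x))          # cheapest panel meeting the U-value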

  4. High-performance computing — an overview

    Marksteiner, Peter

    1996-08-01

    An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.

  5. Clojure high performance programming

    Kumar, Shantanu

    2013-01-01

    This is a short, practical guide that will teach you everything you need to know to start writing high performance Clojure code.This book is ideal for intermediate Clojure developers who are looking to get a good grip on how to achieve optimum performance. You should already have some experience with Clojure and it would help if you already know a little bit of Java. Knowledge of performance analysis and engineering is not required. For hands-on practice, you should have access to Clojure REPL with Leiningen.

  6. PAINeT: An object-oriented software package for simulations of flow-field, transport coefficients and flux terms in non-equilibrium gas mixture flows

    Istomin, V. A.

    2018-05-01

    The software package Planet Atmosphere Investigator of Non-equilibrium Thermodynamics (PAINeT) has been developed for studying the non-equilibrium effects associated with electronic excitation, chemical reactions and ionization. These studies are necessary for modeling processes in shock tubes, in high-enthalpy flows, in nozzles or jet engines, in combustion and explosion processes, and in modern plasma-chemical and laser technologies. The advantages and possibilities of the package implementation are stated. Within the framework of the package implementation, based on kinetic theory approximations (one-temperature and state-to-state approaches), calculations are carried out, and the limits of applicability of a simplified description of shock-heated air flows, and of any other mixtures chosen by the user, are given. Using kinetic theory algorithms, a numerical calculation of the heat fluxes and relaxation terms can be performed, which is necessary for further comparison of engineering simulation with experimental data. The influence of state-to-state distributions over electronic energy levels on the coefficients of thermal conductivity, diffusion, heat fluxes and diffusion velocities of the components of various gas mixtures behind shock waves is studied. Using the software package, the accuracy of different approximations of the kinetic theory of gases is estimated. As an example, a state-resolved atomic ionized mixture of N/N+/O/O+/e- is considered. It is shown that state-resolved diffusion coefficients of neutral and ionized species vary from level to level. Comparing results of engineering applications with those given by PAINeT, recommendations for adequate model selection are proposed.

  7. A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet

    Mazzia, F.; Cash, J.R.; Soetaert, K.

    2012-01-01

    In this paper we present the R package deTestSet that includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition it includes 6 new codes to solve initial value problems.
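
    To give a flavour of what such stiff test problems look like, the classic Robertson chemical kinetics problem (a standard benchmark of the kind collected in stiff test sets) is solved below with scipy's BDF integrator in Python; the R package itself offers many more problems and solvers.

        import numpy as np
        from scipy.integrate import solve_ivp

        def robertson(t, y):
            """Robertson kinetics: a classic stiff ODE test problem."""
            y1, y2, y3 = y
            return [-0.04 * y1 + 1e4 * y2 * y3,
                    0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2**2,
                    3e7 * y2**2]

        sol = solve_ivp(robertson, (0.0, 1e5), [1.0, 0.0, 0.0],
                        method="BDF", rtol=1e-6, atol=1e-10)
        print(sol.y[:, -1])   # species concentrations at t = 1e5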

  8. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user-interface software package developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.

  9. pyLIMA: An Open-source Package for Microlensing Modeling. I. Presentation of the Software and Analysis of Single-lens Models

    Bachelet, E.; Norbury, M.; Bozza, V.; Street, R.

    2017-11-01

    Microlensing is a unique tool, capable of detecting the “cold” planets between ˜1 and 10 au from their host stars and even unbound “free-floating” planets. This regime has been poorly sampled to date owing to the limitations of alternative planet-finding methods, but a watershed in discoveries is anticipated in the near future thanks to the planned microlensing surveys of WFIRST-AFTA and Euclid's Extended Mission. Of the many challenges inherent in these missions, the modeling of microlensing events will be of primary importance, yet it is often time-consuming, complex, and perceived as a daunting barrier to participation in the field. The large scale of future survey data products will require thorough but efficient modeling software, but, unlike other areas of exoplanet research, microlensing currently lacks a publicly available, well-documented package to conduct this type of analysis. We present version 1.0 of the python Lightcurve Identification and Microlensing Analysis (pyLIMA). This software is written in Python and uses existing packages as much as possible to make it widely accessible. In this paper, we describe the overall architecture of the software and the core modules for modeling single-lens events. To verify the performance of this software, we use it to model both real data sets from events published in the literature and generated test data produced using pyLIMA's simulation module. The results demonstrate that pyLIMA is an efficient tool for microlensing modeling. We will expand pyLIMA to consider more complex phenomena in the following papers.
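
    The single-lens models treated in this first paper follow the standard Paczynski point-source point-lens magnification curve, which is easy to evaluate directly; the Python sketch below uses numpy rather than pyLIMA's own API, and the event parameters (t0, u0, tE) are arbitrary illustration values.

        import numpy as np

        def pspl_magnification(t, t0, u0, tE):
            """Paczynski point-source point-lens magnification.

            t0: time of peak; u0: impact parameter (Einstein radii);
            tE: Einstein-ring crossing time (days).
            """
            u = np.hypot(u0, (t - t0) / tE)   # lens-source separation
            return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

        t = np.linspace(-50, 50, 201)          # days around the peak
        mag = pspl_magnification(t, t0=0.0, u0=0.1, tE=20.0)
        print(f"peak magnification = {mag.max():.1f}")   # ~10 for u0 = 0.1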

  10. Validated High Performance Liquid Chromatography Method for ...

    Purpose: To develop a simple, rapid and sensitive high performance liquid chromatography (HPLC) method for the determination of cefadroxil monohydrate in human plasma. Methods: A Shimadzu HPLC with LC solution software was used with a Waters Spherisorb C18 (5 μm, 150 mm × 4.5 mm) column. The mobile phase ...

  11. High performance computing on vector systems

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  12. Some design constraints required for the use of generic software in embedded systems: Packages which manage abstract dynamic structures without the need for garbage collection

    Johnson, Charles S.

    1986-01-01

    The embedded systems running real-time applications, for which Ada was designed, require their own mechanisms for the management of dynamically allocated storage. There is a need for packages which manage their own internal structures to control their deallocation as well, due to the performance implications of garbage collection by the KAPSE. This places a requirement upon the design of generic packages which manage generically structured private types built up from application-defined input types. These kinds of generic packages should figure greatly in the development of lower-level software such as operating systems, schedulers, controllers, and device drivers, and will manage structures such as queues, stacks, linked lists, files, and binary and multary (hierarchical) trees. Such structures must be controlled to prevent inadvertent de-designation of dynamic elements, which is implicit in the assignment operation. A study was made of the use of limited private types in solving the problems of controlling the accumulation of anonymous, detached objects in running systems, and of the use of deallocator procedures for run-down of application-defined input types during deallocation operations.

  13. METEOR v1.0 - Design and structure of the software package; METEOR v1.0 - Estructura y modulos informaticos

    Palomo, E.

    1994-07-01

    This report describes the structure and the separate modules of the software package METEOR for the statistical analysis of meteorological data series. It contains a systematic description of the subroutines of METEOR and also of the required format for input and output files. The original version of METEOR was developed by Dr. Elena Palomo, CIEMAT-IER, GIMASE. It is built by linking programs and routines written in FORTRAN 77, and it adds the graphical capabilities of GNUPLOT. The shape of this toolbox was designed following criteria of modularity, flexibility and agility. All the input, output and analysis options are structured in three main menus: i) the first is aimed at evaluating the quality of the data set; ii) the second at pre-processing the data; and iii) the third at the statistical analyses and at creating the graphical outputs. Currently the documentation of METEOR comprises three documents written in Spanish: 1) METEOR v1.0: User's guide; 2) METEOR v1.0: A usage example; 3) METEOR v1.0: Design and structure of the software package. (Author)

  14. High performance sapphire windows

    Bates, Stephen C.; Liou, Larry

    1993-02-01

    High-quality, wide-aperture optical access is usually required for the advanced laser diagnostics that can now make a wide variety of non-intrusive measurements of combustion processes. Specially processed and mounted sapphire windows are proposed to provide this optical access to extreme environment. Through surface treatments and proper thermal stress design, single crystal sapphire can be a mechanically equivalent replacement for high strength steel. A prototype sapphire window and mounting system have been developed in a successful NASA SBIR Phase 1 project. A large and reliable increase in sapphire design strength (as much as 10x) has been achieved, and the initial specifications necessary for these gains have been defined. Failure testing of small windows has conclusively demonstrated the increased sapphire strength, indicating that a nearly flawless surface polish is the primary cause of strengthening, while an unusual mounting arrangement also significantly contributes to a larger effective strength. Phase 2 work will complete specification and demonstration of these windows, and will fabricate a set for use at NASA. The enhanced capabilities of these high performance sapphire windows will lead to many diagnostic capabilities not previously possible, as well as new applications for sapphire.

  15. A software package to construct polynomial sets over Z2 for determining the output of quantum computations

    Gerdt, Vladimir P.; Severyanov, Vasily M.

    2006-01-01

    A C package is presented that allows a user, for an input quantum circuit, to generate a set of multivariate polynomials over the finite field Z2 whose total number of solutions in Z2 determines the output of the quantum computation defined by the circuit. The generated polynomial system can further be converted to the canonical Gröbner basis form, which provides a universal algorithmic tool for counting the number of common roots of the polynomials.
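
    The link between root counting over Z2 and circuit output can be made concrete with a toy brute-force counter, feasible only for small systems (the Gröbner-basis route mentioned above scales better). The polynomial system below is an arbitrary example, not one generated by the package.

        from itertools import product

        def count_gf2_solutions(polys, n_vars):
            """Count common roots over Z2 of polynomials given as callables."""
            return sum(
                all(p(*bits) % 2 == 0 for p in polys)
                for bits in product((0, 1), repeat=n_vars)
            )

        # Toy system over Z2: x*y + z = 0 and x + y = 0
        polys = [lambda x, y, z: x * y + z,
                 lambda x, y, z: x + y]
        print(count_gf2_solutions(polys, 3))   # 2 roots: (0,0,0) and (1,1,1)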

  16. Multi-Language Programming Environments for High Performance Java Computing

    Vladimir Getov

    1999-01-01

    Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community's reliance on established scientific packages. As a consequence, programmers of high-performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java-to-C Interface (JCI) tool, which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed-language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool complements other ongoing projects such as IBM's High-Performance Compiler for Java (HPCJ) and IceT's metacomputing environment.

  17. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results

  18. High performance parallel I/O

    Prabhat

    2014-01-01

    Gain critical insight into the parallel I/O ecosystem. Parallel I/O is an integral component of modern high performance computing (HPC), especially in storing and processing very large datasets to facilitate scientific discovery. Revealing the state of the art in this field, High Performance Parallel I/O draws on insights from leading practitioners, researchers, software architects, developers, and scientists who shed light on the parallel I/O ecosystem. The first part of the book explains how large-scale HPC facilities scope, configure, and operate systems, with an emphasis on choices of I/O hardware.

  19. Problems in Analyzing Time Series with Gaps and Their Solution with the WinABD Software Package

    Desherevskii, A. V.; Zhuravlev, V. I.; Nikolsky, A. N.; Sidorin, A. Ya.

    2017-12-01

    Technologies for the analysis of time series with gaps are considered. Some algorithms of signal extraction (purification) and evaluation of its characteristics, such as rhythmic components, are discussed for series with gaps. Examples are given for the analysis of data obtained during long-term observations at the Garm geophysical test site and in other regions. The technical solutions used in the WinABD software are considered to most efficiently arrange the operation of relevant algorithms in the presence of observational defects.
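
    As a simple illustration of gap-tolerant extraction of a rhythmic component, the numpy sketch below estimates a daily rhythm from an hourly series with missing values by averaging each hour of day over observed samples only; it is a generic illustration, not WinABD's algorithm.

        import numpy as np

        rng = np.random.default_rng(3)
        n_hours = 24 * 30                                # 30 days of hourly data
        t = np.arange(n_hours)
        series = np.sin(2 * np.pi * t / 24) + 0.3 * rng.normal(size=n_hours)
        series[rng.random(n_hours) < 0.2] = np.nan       # 20% of samples missing

        # Average each hour-of-day over observed values only (gaps ignored)
        daily_rhythm = np.nanmean(series.reshape(-1, 24), axis=0)
        print(np.round(daily_rhythm, 2))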

  20. Software Reviews.

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  1. Application of a statistical software package for analysis of large patient dose data sets obtained from RIS

    Fazakerley, J.; Charnock, P.; Wilde, R.; Jones, R.; Ward, M.

    2010-01-01

    For the purposes of patient dose audit, clinical audit and radiology workload analysis, data from Radiology Information Systems (RIS) at many hospitals are collected in a database, and the analysis is automated using a statistical package and Visual Basic coding. The database is a Structured Query Language database, which can be queried using an off-the-shelf statistical package, Statistica. Macros were created to automatically convert the data from different hospitals to a consistent format ready for analysis. These macros can also be used to automate further analysis, such as detailing mean kV, mAs and entrance surface dose per room and per gender. Standard deviation and standard error of the mean are also generated. Graphs can be generated to illustrate the trends in doses across variables such as room and gender. Collectively, this information can be used to generate a report. A process that once could take up to 1 d to complete now takes around 1 h. A major benefit in providing the service to hospital trusts is that less resource is now required to report on RIS data, making the possibility of continuous dose audit more likely. Time that was spent on sorting through data can now be spent on improving the analysis to provide benefit to the customer. Using data sets from RIS is a good way to perform dose audits, as the huge number of records available provides the basis for very accurate analysis. Using macros written in Statistica Visual Basic has helped to sort and consistently analyse these data. Being able to analyse by exposure factors has provided a more detailed report to the customer. (authors)
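
    The per-room, per-gender summaries described (mean kV, mAs and entrance surface dose with standard deviation and standard error) map naturally onto a groupby aggregation. The pandas sketch below uses invented column names and simulated values; it stands in for, and is not, the authors' Statistica macros.

        import numpy as np
        import pandas as pd

        # Hypothetical RIS extract: one row per exposure
        rng = np.random.default_rng(4)
        n = 500
        df = pd.DataFrame({
            "room": rng.choice(["R1", "R2", "R3"], size=n),
            "gender": rng.choice(["F", "M"], size=n),
            "kV": rng.normal(75, 5, size=n),
            "mAs": rng.normal(20, 4, size=n),
            "esd_mGy": rng.lognormal(-1.0, 0.4, size=n),
        })

        def sem(x):
            return x.std(ddof=1) / np.sqrt(len(x))   # standard error of the mean

        report = (df.groupby(["room", "gender"])[["kV", "mAs", "esd_mGy"]]
                    .agg(["mean", "std", sem]))
        print(report.round(2))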

  2. CIRCE2/DEKGEN2: A software package for facilitated optical analysis of 3-D distributed solar energy concentrators. Theory and user manual

    Romero, V.J.

    1994-03-01

    CIRCE2 is a computer code for modeling the optical performance of three-dimensional dish-type solar energy concentrators. Statistical methods are used to evaluate the directional distribution of reflected rays from any given point on the concentrator. Given concentrator and receiver geometries, sunshape (angular distribution of incident rays from the sun), and concentrator imperfections such as surface roughness and random deviation in slope, the code predicts the flux distribution and total power incident upon the target. Great freedom exists in the variety of concentrator and receiver configurations that can be modeled. Additionally, provisions for shading and receiver aperturing are included. DEKGEN2 is a preprocessor designed to facilitate input of geometry, error distributions, and sun models. This manual describes the optical model, user inputs, code outputs, and operation of the software package. A user tutorial is included in which several collectors are built and analyzed in step-by-step examples.
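
    The statistical treatment CIRCE2 applies (perturbing reflected rays according to surface slope errors and a sunshape) can be caricatured in a few lines. The Monte Carlo sketch below samples Gaussian slope errors for a flat mirror and reports the angular spread of the reflected beam; the numbers are invented, and the real code handles full 3-D dish geometry, shading, and aperturing.

        import numpy as np

        rng = np.random.default_rng(5)
        n_rays = 100_000
        slope_err = 2.5e-3        # hypothetical RMS surface slope error (rad)
        sun_width = 2.3e-3        # simple Gaussian sunshape width (rad)

        # A slope error tilts the local normal; in the small-angle limit the
        # reflected ray is deviated by twice the slope error.
        slope = rng.normal(0.0, slope_err, size=n_rays)
        sun = rng.normal(0.0, sun_width, size=n_rays)
        deviation = 2.0 * slope + sun             # total angular deviation (rad)

        print(f"RMS beam spread = {1e3 * deviation.std():.2f} mrad")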

  3. Simulation of the target-oriented driving of an autonomous vehicle in a labyrinthic environment by means of the KISMET software package

    Knueppel, H.; Kuehnapfel, U.; Smidt, D.

    1991-10-01

    By using the special capabilities of the KISMET software package and hardware for geometric operations and graphical presentation, an algorithm for the collision-free, target-oriented driving of an autonomous vehicle was developed, implemented and linked to KISMET. The algorithm employs a simple global route planner, which creates the global path, neglecting the finite vehicle dimensions, as input to the sensor-based local route planner. At each time step, the local planner transforms the sensor pattern received from a number of ultrasonic sensors into a movement pattern. The target-oriented global information influences the local operations. Some examples and a video demonstrate that the target is reached collision-free and close to the shortest path, even in a labyrinthic environment. (orig.) [de]

  4. DFTBaby: A software package for non-adiabatic molecular dynamics simulations based on long-range corrected tight-binding TD-DFT(B)

    Humeniuk, Alexander; Mitrić, Roland

    2017-12-01

    A software package, called DFTBaby, is published, which provides the electronic structure needed for running non-adiabatic molecular dynamics simulations at the level of tight-binding DFT. A long-range correction is incorporated to avoid spurious charge transfer states. Excited state energies, their analytic gradients and scalar non-adiabatic couplings are computed using tight-binding TD-DFT. These quantities are fed into a molecular dynamics code, which integrates Newton's equations of motion for the nuclei together with the electronic Schrödinger equation. Non-adiabatic effects are included by surface hopping. As an example, the program is applied to the optimization of excited states and non-adiabatic dynamics of polyfluorene. The python and Fortran source code is available at http://www.dftbaby.chemie.uni-wuerzburg.de.
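
    On the classical side of such trajectory-based dynamics, nuclear propagation is typically a velocity Verlet integration of Newton's equations with excited-state gradients supplying the forces. The Python sketch below shows a generic velocity Verlet loop with a hypothetical harmonic force standing in for a tight-binding TD-DFT gradient; the surface-hopping step is omitted.

        import numpy as np

        def velocity_verlet(pos, vel, force_fn, mass, dt, n_steps):
            """Integrate Newton's equations with the velocity Verlet scheme."""
            f = force_fn(pos)
            for _ in range(n_steps):
                pos = pos + vel * dt + 0.5 * (f / mass) * dt**2
                f_new = force_fn(pos)
                vel = vel + 0.5 * (f + f_new) / mass * dt
                f = f_new
            return pos, vel

        # Hypothetical harmonic surface standing in for an excited state
        force = lambda x: -1.0 * x
        pos, vel = velocity_verlet(np.array([1.0]), np.array([0.0]),
                                   force, mass=1.0, dt=0.01, n_steps=1000)
        print(pos, vel)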

  5. A user's guide to the GoldSim/BLT-MS integrated software package:a low-level radioactive waste disposal performance assessment model

    Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.

    2007-01-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years' experience in the assessment of radioactive waste disposal and at the time of this publication is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by the United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference user's guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low-level radioactive waste disposal sites.

  6. PONDEROSA-C/S: client–server based software package for automated protein 3D structure determination

    Lee, Woonghee; Stark, Jaime L.; Markley, John L.

    2014-01-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727–1728. doi:10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser effect (NOE) data.

  7. Using a commercial mathematics software package for on-line analysis at the BNL Accelerator Test Facility

    Malone, R.; Wang, X.J.

    1999-01-01

    By writing both a custom Windows NT™ dynamic link library and generic companion server software, the intrinsic functions of MathSoft Mathcad™ have been extended with new capabilities which permit direct access to the control system databases of the Brookhaven National Laboratory Accelerator Test Facility. Under this scheme, a Mathcad worksheet executing on a personal computer becomes a client which can both import and export data to a control system server via a network stream socket connection. The result is an alternative, mathematically oriented view of controlling the accelerator interactively.

  8. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  9. Effective dose and organ doses estimation taking tube current modulation into account with a commercial software package

    Lopez-Rendon, X.; Bosmans, H.; Zanca, F.; Oyen, R.

    2015-01-01

    To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)

  11. Architecting Web Sites for High Performance

    Arun Iyengar

    2002-01-01

    Web site applications are some of the most challenging high-performance applications currently being developed and deployed. The challenges emerge from the specific combination of high variability in workload characteristics and of high performance demands regarding the service level, scalability, availability, and costs. In recent years, a large body of research has addressed the Web site application domain, and a host of innovative software and hardware solutions have been proposed and deployed. This paper is an overview of recent solutions concerning the architectures and the software infrastructures used in building Web site applications. The presentation emphasizes three of the main functions in a complex Web site: the processing of client requests, the control of service levels, and the interaction with remote network caches.

  12. High Performance Computing Operations Review Report

    Cupps, Kimberly C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-19

    The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.

  13. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  14. Playa: High-Performance Programmable Linear Algebra

    Victoria E. Howle

    2012-01-01

    This paper introduces Playa, a high-level user interface layer for composing algorithms for complex multiphysics problems out of objects from other Trilinos packages. Among other features, Playa provides very high-performance overloaded operators implemented through an expression template mechanism. In this paper, we give an overview of the central Playa objects from a user's perspective, show application to a sequence of increasingly complex solver algorithms, provide timing results for Playa's overloaded operators and other functions, and briefly survey some of the implementation issues involved.

  15. High-performance mass storage system for workstations

    Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.

    1993-01-01

    media, and the tapes are used as backup media. The storage system is managed by the IEEE mass storage reference model-based UniTree software package. UniTree software will keep track of all files in the system, will automatically migrate the lesser-used files to archive media, and will stage the files when needed by the system. The user can access the files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys will significantly boost the system I/O performance and reduce the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., real-time data acquisition with a signal and image processing requirement, long-term data archiving and distribution, and image analysis and enhancement).

  16. Enabling High-Performance Computing as a Service

    AbdelBaky, Moustafa; Parashar, Manish; Kim, Hyunjoo; Jordan, Kirk E.; Sachdeva, Vipin; Sexton, James; Jamjoom, Hani; Shae, Zon-Yin; Pencheva, Gergina; Tavakoli, Reza; Wheeler, Mary F.

    2012-01-01

    With the right software infrastructure, clouds can provide scientists with as a service access to high-performance computing resources. An award-winning prototype framework transforms the Blue Gene/P system into an elastic cloud to run a

  17. Effectiveness and efficiency of training in digital healthcare packages: training doctors to use digital medical record keeping software.

    Benwell, Nicola; Hird, Kathryn; Thomas, Nicholas; Furness, Erin; Fear, Mark; Sweetman, Greg

    2017-10-01

    Objective Fiona Stanley Hospital (FSH) is the first hospital in Western Australia to implement a digital medical record (BOSSnet, Core Medical Solutions, Australia). Formal training in the use of the digital medical record is provided to all staff as part of the induction program. The aim of the present study was to evaluate whether the current training program facilitates efficient and accurate use of the digital medical record in clinical practice. Methods Participants were selected from the cohort of junior doctors employed at FSH in 2015. An e-Learning package of clinically relevant tasks from the digital medical record was created and, along with a questionnaire, completed by participants on two separate occasions. The time taken to complete all tasks and the number of incorrect mouse clicks used to complete each task were recorded and used as measures of efficiency and accuracy respectively. Results Most participants used BOSSnet more than 10 times per day in their clinical roles and self-rated their baseline overall computer proficiency level as high. There was a significant increase in the self-rating of proficiency levels in successive tests. In addition, a significant improvement in both efficiency and accuracy for all participants was measured between the two tests. Interestingly, both groups ended up with similar accuracy on the second trial, despite the second group of participants starting with significantly poorer accuracy. Conclusions Overall, the greatest improvements in task performance followed daily ward-based experience using BOSSnet rather than formalised training. The greatest benefits of training were noted when training was delivered in close proximity to the onset of employment. What is known about the topic? Formalised training in the use of information and communications technology (ICT) is widespread in the health service. However, there is limited evidence to support the modes of learning typically used. Formalised training is often

  18. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0)

    Daveler, S.A.; Wolery, T.J.

    1992-01-01

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25 degrees C only to 0-300 degrees C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.
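
    The first transformation described, replacing a temperature grid of thermodynamic data with interpolating-polynomial coefficients, looks roughly like the numpy sketch below; the grid and log K values are invented (loosely resembling the water ionization constant), and EQPT's actual fitting conventions may differ.

        import numpy as np

        # Hypothetical equilibrium constants on a temperature grid (deg C)
        temps = np.array([0.0, 25.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
        log_k = np.array([14.94, 14.00, 13.02, 12.26, 11.64, 11.28, 11.17, 11.30])

        # Fit an interpolating polynomial; its coefficients replace the grid
        coeffs = np.polyfit(temps, log_k, deg=4)

        # Evaluating the polynomial reproduces values between grid points
        print(np.polyval(coeffs, 37.5))   # interpolated log K at 37.5 deg C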

  20. TF-finder: A software package for identifying transcription factors involved in biological processes using microarray data and existing knowledge base

    Cui Xiaoqi

    2010-08-01

    Background: Identification of transcription factors (TFs) involved in a biological process is the first step towards a better understanding of the underlying regulatory mechanisms. However, due to the involvement of a large number of genes and complicated interactions in a gene regulatory network (GRN), identification of the TFs involved in a biological process remains very challenging. In reality, the recognition of TFs for a given biological process can be further complicated by the fact that most eukaryotic genomes encode thousands of TFs, which are organized in gene families of various sizes and in many cases with poor sequence conservation except for small conserved domains. This poses a significant challenge for identification of the exact TFs involved or ranking the importance of a set of TFs to a process of interest. Therefore, new methods for recognizing novel TFs are desperately needed. Although a plethora of methods have been developed to infer regulatory genes using microarray data, it is still rare to find methods that use an existing knowledge base, in particular the validated genes known to be involved in a process, to bait/guide discovery of novel TFs. Such methods can replace the sometimes-arbitrary process of selecting candidate genes for experimental validation and significantly advance our knowledge and understanding of the regulation of a process. Results: We developed an automated software package called TF-finder for recognizing TFs involved in a biological process using microarray data and an existing knowledge base. TF-finder contains two components, adaptive sparse canonical correlation analysis (ASCCA) and enrichment test, for TF recognition. ASCCA uses positive target genes to bait TFs from gene expression data, while the enrichment test examines the presence of positive TFs in the outcomes from ASCCA. Using microarray data from salt and water stress experiments, we showed TF-finder is very efficient in recognizing
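
    The "baiting" step can be sketched with ordinary canonical correlation analysis. The toy example below uses scikit-learn's CCA as a stand-in for TF-finder's ASCCA (which additionally enforces adaptive sparsity); the matrix shapes and random data are invented.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
tf_expr = rng.normal(size=(20, 50))      # 20 arrays x 50 candidate TFs
target_expr = rng.normal(size=(20, 10))  # same arrays x 10 positive target genes

# Correlate candidate-TF expression with the positive targets used as bait.
cca = CCA(n_components=1)
cca.fit(tf_expr, target_expr)

# Rank TFs by the magnitude of their loading on the first canonical variate.
scores = np.abs(cca.x_weights_[:, 0])
print("top candidate TF indices:", np.argsort(scores)[::-1][:5])
```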

  1. Optical Thermal Characterization Enables High-Performance Electronics Applications

    2016-02-01

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.

  2. FDSTools: A software package for analysis of massively parallel sequencing data with the ability to recognise and correct STR stutter and other PCR or sequencing noise.

    Hoogenboom, Jerry; van der Gaag, Kristiaan J; de Leeuw, Rick H; Sijen, Titia; de Knijff, Peter; Laros, Jeroen F J

    2017-03-01

    Massively parallel sequencing (MPS) is on the verge of broad-scale application in forensic research and casework. The improved capability to analyse evidentiary traces representing unbalanced mixtures is often mentioned as one of the major advantages of this technique. However, most of the available software packages that analyse forensic short tandem repeat (STR) sequencing data are not well suited for high-throughput analysis of such mixed traces. The largest challenge is the presence of stutter artefacts in STR amplifications, which are not readily discerned from minor contributions. FDSTools is an open-source software solution developed for this purpose. The level of stutter formation is influenced by various aspects of the sequence, such as the length of the longest uninterrupted stretch occurring in an STR. When MPS is used, STRs are evaluated as sequence variants that each have particular stutter characteristics which can be precisely determined. FDSTools uses a database of reference samples to determine stutter and other systemic PCR or sequencing artefacts for each individual allele. In addition, stutter models are created for each repeating element in order to predict stutter artefacts for alleles that are not included in the reference set. This information is subsequently used to recognise and compensate for the noise in a sequence profile. The result is a better representation of the true composition of a sample. Using Promega Powerseq™ Auto System data from 450 reference samples and 31 two-person mixtures, we show that the FDSTools correction module decreases stutter ratios above 20% to below 3%. Consequently, much lower levels of contributions in the mixed traces are detected. FDSTools contains modules to visualise the data in an interactive format, allowing users to filter data with their own preferred thresholds.
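
    The per-allele stutter modelling can be sketched in a few lines. The example below fits a linear model of minus-one stutter ratio against the longest uninterrupted stretch (LUS) and uses it to correct a toy profile; all numbers are invented, and FDSTools' actual models are fitted per repeating element from its reference database.

```python
import numpy as np

# Toy training data: LUS length (repeat units) vs observed minus-one stutter ratio.
lus = np.array([8.0, 10.0, 12.0, 14.0, 16.0])
stutter_ratio = np.array([0.02, 0.05, 0.09, 0.13, 0.18])

slope, intercept = np.polyfit(lus, stutter_ratio, deg=1)

def predict_stutter(lus_value):
    """Predicted stutter ratio for an allele absent from the reference set."""
    return max(0.0, slope * lus_value + intercept)

# Correct an observed profile: remove the parent allele's predicted stutter
# contribution from the read count at the minus-one position.
parent_reads, observed_minus1 = 1000, 140
corrected = observed_minus1 - parent_reads * predict_stutter(13.0)
print(round(corrected, 1))
```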

  3. High Performance Electronics on Flexible Silicon

    Sevilla, Galo T.

    2016-09-01

    Over the last few years, flexible electronic systems have gained increased attention from researchers around the world because of their potential to create new applications such as flexible displays, flexible energy harvesters, artificial skin, and health monitoring systems that cannot be integrated with conventional wafer based complementary metal oxide semiconductor processes. Most of the current efforts to create flexible high performance devices are based on the use of organic semiconductors. However, inherent material limitations make them unsuitable for big data processing and high speed communications. The objective of my doctoral dissertation is to develop integration processes that allow the transformation of rigid high performance electronics into flexible ones while maintaining their performance and cost. In this work, two different techniques to transform inorganic complementary metal-oxide-semiconductor electronics into flexible ones have been developed using industry compatible processes. Furthermore, these techniques were used to realize flexible discrete devices and circuits which include metal-oxide-semiconductor field-effect-transistors, the first demonstration of flexible Fin-field-effect-transistors, and metal-oxide-semiconductors-based circuits. Finally, this thesis presents a new technique to package, integrate, and interconnect flexible high performance electronics using low cost additive manufacturing techniques such as 3D printing and inkjet printing. This thesis contains in-depth studies on electrical, mechanical, and thermal properties of the fabricated devices.

  4. Summary Describing Integration of ERM Methodology into Supervisory Control Framework with Software Package Documentation; Advanced Reactor Technology Milestone: M4AT-16PN2301052

    Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hirt, Evelyn H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dib, Gerges [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Veeramany, Arun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bonebrake, Christopher A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Roy, Surajit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-20

    This project involved the development of enhanced risk monitors (ERMs) for active components in Advanced Reactor (AdvRx) designs by integrating real-time information about equipment condition with risk monitors. Health monitoring techniques in combination with predictive estimates of component failure based on condition and risk monitors can serve to indicate the risk posed by continued operation in the presence of detected degradation. This combination of predictive health monitoring based on equipment condition assessment and risk monitors can also enable optimization of maintenance scheduling with respect to the economics of plant operation. This report summarizes PNNL’s multi-year project on the development and evaluation of an ERM concept for active components while highlighting FY2016 accomplishments. Specifically, this report provides a status summary of the integration and demonstration of the prototypic ERM framework with the plant supervisory control algorithms being developed at Oak Ridge National Laboratory (ORNL), and describes additional case studies conducted to assess sensitivity of the technology to different quantities. Supporting documentation on the software package to be provided to ORNL is incorporated in this report.

  5. Software package developments around TAPS multidetector: on-line management of GANIL data; mesons neutral identification with the help of neural networks

    Lefevre, F.

    1993-02-01

    The photon multidetector system TAPS, a European collaboration, was installed for the second series of experiments at GANIL in the fall of 1992. It was used in conjunction with a multidetector for charged particles and the high resolution spectrometer SPEG. This experimental set-up is described. A dedicated software package, written in the PAW environment, for the online control and analysis of data has been developed and is described in detail. One aspect of the TAPS experimental program involves the detection of neutral mesons via two-photon decay. The identification by this decay channel is not trivial due to the so-called combinatorial background: the generation of photon pairs not associated with a meson decay. A method based on a neural network has been developed in order to aid in the extraction of the meson signal. The method is based on that of Hopfield and has been modified to incorporate the self-connection of cells. Our network is thus well suited to solve optimization problems where the initial state of the system represents the data constituting the problem. The performance of the network is presented using simulations, and it is demonstrated that the signal-to-noise ratio can be improved given constraints on the solid angle of the detector and the correct identification of the photons.

  6. FLIMX: A Software Package to Determine and Analyze the Fluorescence Lifetime in Time-Resolved Fluorescence Data from the Human Eye.

    Matthias Klemm

    Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique for measuring the in vivo autofluorescence intensity decays generated by endogenous fluorophores in the ocular fundus. Here, we present a software package called FLIM eXplorer (FLIMX) for analyzing FLIO data. Specifically, we introduce a new adaptive binning approach as an optimal tradeoff between the spatial resolution and the number of photons required per pixel. We also expand existing decay models (multi-exponential, stretched exponential, spectral global analysis, incomplete decay) to account for the layered structure of the eye and present a method to correct for the influence of the crystalline lens fluorescence on the retina fluorescence. Subsequently, the Holm-Bonferroni method is applied to FLIO measurements to allow for group comparisons between patients and controls on the basis of fluorescence lifetime parameters. The performance of the new approaches was evaluated in five experiments. Specifically, we evaluated static and adaptive binning in a diabetes mellitus patient, we compared the different decay models in a healthy volunteer and performed a group comparison between diabetes patients and controls. An overview of the visualization capabilities and a comparison of static and adaptive binning is shown for a patient with macular hole. FLIMX's applicability to fluorescence lifetime imaging microscopy is shown in the ganglion cell layer of a porcine retina sample, obtained by a laser scanning microscope using two-photon excitation.

  7. FLIMX: A Software Package to Determine and Analyze the Fluorescence Lifetime in Time-Resolved Fluorescence Data from the Human Eye.

    Klemm, Matthias; Schweitzer, Dietrich; Peters, Sven; Sauer, Lydia; Hammer, Martin; Haueisen, Jens

    2015-01-01

    Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique for measuring the in vivo autofluorescence intensity decays generated by endogenous fluorophores in the ocular fundus. Here, we present a software package called FLIM eXplorer (FLIMX) for analyzing FLIO data. Specifically, we introduce a new adaptive binning approach as an optimal tradeoff between the spatial resolution and the number of photons required per pixel. We also expand existing decay models (multi-exponential, stretched exponential, spectral global analysis, incomplete decay) to account for the layered structure of the eye and present a method to correct for the influence of the crystalline lens fluorescence on the retina fluorescence. Subsequently, the Holm-Bonferroni method is applied to FLIO measurements to allow for group comparisons between patients and controls on the basis of fluorescence lifetime parameters. The performance of the new approaches was evaluated in five experiments. Specifically, we evaluated static and adaptive binning in a diabetes mellitus patient, we compared the different decay models in a healthy volunteer and performed a group comparison between diabetes patients and controls. An overview of the visualization capabilities and a comparison of static and adaptive binning is shown for a patient with macular hole. FLIMX's applicability to fluorescence lifetime imaging microscopy is shown in the ganglion cell layer of a porcine retina sample, obtained by a laser scanning microscope using two-photon excitation.
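
    The decay models at the heart of such analyses reduce to nonlinear least-squares fits. The sketch below fits a bi-exponential decay to a synthetic, noisy trace with SciPy; it ignores the instrument response and the incomplete-decay and layered-eye corrections that FLIMX adds, and all parameter values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, tau1, a2, tau2):
    """Two-component fluorescence decay model."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.linspace(0.0, 12.5, 256)  # ns, roughly one TCSPC period
clean = biexp(t, 0.7, 0.5, 0.3, 3.0)
noisy = np.random.default_rng(1).poisson(clean * 500) / 500.0

popt, _ = curve_fit(biexp, t, noisy, p0=(0.5, 1.0, 0.5, 2.0))
print("fitted lifetimes (ns):", round(popt[1], 2), round(popt[3], 2))
```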

  8. Monitoring SLAC High Performance UNIX Computing Systems

    Lettsome, Annette K.

    2005-01-01

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process taken in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.

  9. Nonparametric Statistics Test Software Package.

    1983-09-01

    [Program listing garbled in extraction; only fragments of the description are recoverable.] The package includes a front-end program, Lochinvar, which validates the user's entries. Its purpose is to write the two types of files needed by the program Crunch: the data file and the option file. These files hold the data and communicate the choice of test and test parameters to Crunch. After a data file is written, Lochinvar prompts the writing of the option file.

  10. Feature selection toolbox software package

    Pudil, Pavel; Novovičová, Jana; Somol, Petr

    2002-01-01

    Vol. 23, No. 4 (2002), p. 487-492 ISSN 0167-8655 R&D Projects: GA ČR GA402/01/0981 Institutional research plan: CEZ:AV0Z1075907 Keywords : pattern recognition * feature selection * floating search algorithms Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.409, year: 2002

  11. INL High Performance Building Strategy

    Jennifer D. Morton

    2010-02-01

    High performance buildings, also known as sustainable buildings and green buildings, are resource efficient structures that minimize the impact on the environment by using less energy and water, reduce solid waste and pollutants, and limit the depletion of natural resources while also providing a thermally and visually comfortable working environment that increases productivity for building occupants. As Idaho National Laboratory (INL) becomes the nation’s premier nuclear energy research laboratory, the physical infrastructure will be established to help accomplish this mission. This infrastructure, particularly the buildings, should incorporate high performance sustainable design features in order to be environmentally responsible and reflect an image of progressiveness and innovation to the public and prospective employees. Additionally, INL is a large consumer of energy that contributes to both carbon emissions and resource inefficiency. In the current climate of rising energy prices and political pressure for carbon reduction, this guide will help new construction project teams to design facilities that are sustainable and reduce energy costs, thereby reducing carbon emissions. With these concerns in mind, the recommendations described in the INL High Performance Building Strategy (previously called the INL Green Building Strategy) are intended to form the INL foundation for high performance building standards. This revised strategy incorporates the latest federal and DOE orders (Executive Order [EO] 13514, “Federal Leadership in Environmental, Energy, and Economic Performance” [2009], EO 13423, “Strengthening Federal Environmental, Energy, and Transportation Management” [2007], and DOE Order 430.2B, “Departmental Energy, Renewable Energy, and Transportation Management” [2008]), the latest guidelines, trends, and observations in high performance building construction, and the latest changes to the Leadership in Energy and Environmental Design

  12. Decal electronics for printed high performance cmos electronic systems

    Hussain, Muhammad Mustafa

    2017-11-23

    High performance complementary metal oxide semiconductor (CMOS) electronics are critical for any full-fledged electronic system. However, state-of-the-art CMOS electronics are rigid and bulky making them unusable for flexible electronic applications. While there exist bulk material reduction methods to flex them, such thinned CMOS electronics are fragile and vulnerable to handling for high throughput manufacturing. Here, we show a fusion of a CMOS technology compatible fabrication process for flexible CMOS electronics, with inkjet and conductive cellulose based interconnects, followed by additive manufacturing (i.e. 3D printing based packaging) and finally roll-to-roll printing of packaged decal electronics (thin film transistors based circuit components and sensors) focusing on printed high performance flexible electronic systems. This work provides the most pragmatic route for packaged flexible electronic systems for wide ranging applications.

  13. High Performance Bulk Thermoelectric Materials

    Ren, Zhifeng [Boston College, Chestnut Hill, MA (United States)

    2013-03-31

    Over 13-plus years, we have carried out research on the electron pairing symmetry of superconductors; on the growth and field emission properties of carbon nanotubes and semiconducting nanowires; on high performance thermoelectric materials; and on other interesting materials. As a result of this research, we have published 104 papers and have educated six undergraduate students, twenty graduate students, nine postdocs, nine visitors, and one technician.

  14. High-Performance Operating Systems

    Sharp, Robin

    1999-01-01

    Notes prepared for the DTU course 49421 "High Performance Operating Systems". The notes deal with quantitative and qualitative techniques for use in the design and evaluation of operating systems in computer systems for which performance is an important parameter, such as real-time applications, communication systems and multimedia systems.

  15. Procedures to analyse γ-ray spectra obtained from the ORTEC or nuclear data ND-680 system by ORTEC's analysis software packages incorporated into a separate IBM-PC computer

    Zhang Xiu Zhen.

    1990-01-01

    A detailed description is presented for processing γ-spectra produced by means of Ortec or Nuclear Data spectrometry systems on an off-line IBM-PC. The ORTEC analysis software packages were transferred to and implemented on the PC A/T, and the different spectra were recorded on discs and subsequently brought into the format required by the program for the calculation of photopeak areas. (author)

  16. OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging

    Rubel, Oliver; Greiner, Annette; Cholia, Shreyas; Louie, Katherine; Bethel, E. Wes; Northen, Trent R.; Bowen, Benjamin P.

    2013-10-02

    Mass spectrometry imaging (MSI) enables researchers to directly probe endogenous molecules within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements; these optimizations are critical enablers of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely to facilitate high-performance data analysis and enable implementation of Web based data sharing, visualization, and analysis.
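
    The layout trade-off is the interesting part: a chunk shape with small spatial tiles but long m/z runs serves both "one pixel, full spectrum" and "one m/z channel, full image" reads reasonably well. The h5py sketch below illustrates the idea; the dataset shape, chunk shape, and compression settings are invented and not OpenMSI's actual choices.

```python
import numpy as np
import h5py

nx, ny, n_mz = 100, 100, 10000

with h5py.File("msi_demo.h5", "w") as f:
    dset = f.create_dataset(
        "msidata", shape=(nx, ny, n_mz), dtype="f4",
        chunks=(4, 4, 2048),   # small spatial tiles, long m/z runs
        compression="gzip")
    dset[0, 0, :] = np.random.rand(n_mz).astype("f4")

with h5py.File("msi_demo.h5", "r") as f:
    spectrum = f["msidata"][0, 0, :]   # one pixel's full spectrum
    image = f["msidata"][:, :, 5000]   # one m/z slice across all pixels
```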

  17. Neo4j high performance

    Raj, Sonal

    2015-01-01

    If you are a professional or enthusiast who has a basic understanding of graphs or has basic knowledge of Neo4j operations, this is the book for you. Although it is targeted at an advanced user base, this book can be used by beginners as it touches upon the basics. So, if you are passionate about taming complex data with the help of graphs and building high performance applications, you will be able to get valuable insights from this book.

  18. Software Applications on the Peregrine System | High-Performance Computing

    [Software catalog table flattened in extraction; the recoverable rows are listed below.]
    - General Algebraic Modeling System (GAMS): statistics and analysis; high-level modeling system for mathematical programming
    - Gurobi Optimizer: statistics and analysis; solver for mathematical programming
    - LAMMPS: chemistry and [row truncated]
    - [fragment of another row] ...reactivities, and vibrational, electronic and NMR spectra
    - R Statistical Computing Environment: statistics and [row truncated]

  19. Peregrine Software Toolchains | High-Performance Computing | NREL

    The Portland Group (PGI) C/C++ and Fortran compilers (partially supported): the PGI Accelerator compilers include NVIDIA GPU support via the directive-based OpenACC 2.5 programming model, as well as full support for NVIDIA CUDA C.

  20. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-01-01

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  1. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory expensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.
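
    The multislice loop itself is compact: transmit the wave through a slice's projected potential, then propagate to the next slice in Fourier space. The sketch below shows one such step with NumPy; the grid, wavelength, slice thickness, interaction parameter, and random potential are placeholder values, and it omits the frozen-lattice sampling, probe scanning, and parallelisation that STEMsalabim provides.

```python
import numpy as np

n, dx = 256, 0.1       # grid points and sampling (Angstrom per pixel)
wavelength = 0.0197    # Angstrom, roughly 300 kV electrons
dz = 2.0               # slice thickness (Angstrom)
sigma = 0.00065        # interaction parameter (placeholder value)

psi = np.ones((n, n), dtype=complex)      # incident plane wave
potential = np.random.rand(n, n)          # projected potential of one slice

k = np.fft.fftfreq(n, d=dx)
k2 = k[:, None] ** 2 + k[None, :] ** 2
propagator = np.exp(-1j * np.pi * wavelength * dz * k2)

psi *= np.exp(1j * sigma * potential)               # transmit through the slice
psi = np.fft.ifft2(np.fft.fft2(psi) * propagator)   # propagate to the next slice
```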

  2. Reviews, Software.

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  3. Software Reviews.

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  4. Software Review.

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  5. Software Reviews.

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  6. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
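
    The kernel BEAGLE accelerates is the partial-likelihood update of Felsenstein's pruning algorithm, evaluated for every site and state at once. The NumPy sketch below shows that update for one internal node with toy 4-state transition matrices; real applications call the BEAGLE C API through a client such as a phylogenetics package, and all data here are invented.

```python
import numpy as np

n_sites, n_states = 1000, 4
rng = np.random.default_rng(5)
left_partials = rng.random((n_sites, n_states))    # from the left child
right_partials = rng.random((n_sites, n_states))   # from the right child

# Toy transition probability matrices P(t) for the two child branches.
p_left = np.full((n_states, n_states), 0.05) + 0.8 * np.eye(n_states)
p_right = p_left.copy()

# Parent partial likelihoods, all sites at once.
parent = (left_partials @ p_left.T) * (right_partials @ p_right.T)

# Root the likelihood with uniform state frequencies.
log_likelihood = np.log(parent.mean(axis=1)).sum()
print(round(float(log_likelihood), 2))
```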

  7. High-performance bidiagonal reduction using tile algorithms on homogeneous multicore architectures

    Ltaief, Hatem

    2013-04-01

    This article presents a new high-performance bidiagonal reduction (BRD) for homogeneous multicore architectures. This article is an extension of the high-performance tridiagonal reduction implemented by the same authors [Luszczek et al., IPDPS 2011] to the BRD case. The BRD is the first step toward computing the singular value decomposition of a matrix, which is one of the most important algorithms in numerical linear algebra due to its broad impact in computational science. The high performance of the BRD described in this article comes from the combination of four important features: (1) tile algorithms with tile data layout, which provide an efficient data representation in main memory; (2) a two-stage reduction approach that allows to cast most of the computation during the first stage (reduction to band form) into calls to Level 3 BLAS and reduces the memory traffic during the second stage (reduction from band to bidiagonal form) by using high-performance kernels optimized for cache reuse; (3) a data dependence translation layer that maps the general algorithm with column-major data layout into the tile data layout; and (4) a dynamic runtime system that efficiently schedules the newly implemented kernels across the processing units and ensures that the data dependencies are not violated. A detailed analysis is provided to understand the critical impact of the tile size on the total execution time, which also corresponds to the matrix bandwidth size after the reduction of the first stage. The performance results show a significant improvement over currently established alternatives. The new high-performance BRD achieves up to a 30-fold speedup on a 16-core Intel Xeon machine with a 12000×12000 matrix size against the state-of-the-art open source and commercial numerical software packages, namely LAPACK, compiled with optimized and multithreaded BLAS from MKL as well as Intel MKL version 10.2. © 2013 ACM.

  8. Photons, photosynthesis, and high-performance computing: challenges, progress, and promise of modeling metabolism in green algae

    Chang, C H; Graf, P; Alber, D M; Kim, K; Murray, G; Posewitz, M; Seibert, M

    2008-01-01

    The complexity associated with biological metabolism considered at a kinetic level presents a challenge to quantitative modeling. In particular, the relatively sparse knowledge of parameters for enzymes with known kinetic responses is problematic. The possible space of these parameters is of high dimension, and sampling of such a space typifies a combinatorial explosion of possible dynamic states. However, with sufficient quantitative transcriptomics, proteomics, and metabolomics data at hand, these challenges could be met by high-performance software with sampling, fitting, and optimization capabilities. With this in mind, we present the High-Performance Systems Biology Toolkit, HiPer SBTK, an evolving software package to simulate, fit, and optimize metabolite concentrations and fluxes within the space of rate and binding parameters associated with detailed enzyme kinetic models. We present our chosen modeling paradigm for the formulation of metabolic pathway models, the means to address the challenge of representing such models in a precise and persistent fashion using the standardized Systems Biology Markup Language, and our second-generation model of H2-associated Chlamydomonas metabolism. Processing of such models for hierarchically parallelized simulation and optimization, job specification by the user through a GUI interface, software capabilities and initial scaling data, and the mapping of the computation to biological questions is also discussed. Moreover, we present near-term future software and model development goals.
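
    The modeling paradigm amounts to integrating enzyme-kinetic rate laws over a sampled parameter space. The sketch below integrates a single hypothetical Michaelis-Menten reaction for two parameter points with SciPy; HiPer SBTK handles full SBML pathway models and parallelizes the sampling, fitting, and optimization.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, vmax, km):
    """One Michaelis-Menten reaction: substrate s -> product p."""
    s, p = y
    rate = vmax * s / (km + s)
    return [-rate, rate]

# Two points from a hypothetical (vmax, km) parameter sweep.
for vmax, km in [(1.0, 0.5), (2.0, 0.1)]:
    sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], args=(vmax, km))
    print(f"vmax={vmax}, km={km}: final product {sol.y[1, -1]:.3f}")
```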

  9. Contemporary high performance computing from petascale toward exascale

    Vetter, Jeffrey S

    2013-01-01

    Contemporary High Performance Computing: From Petascale toward Exascale focuses on the ecosystems surrounding the world's leading centers for high performance computing (HPC). It covers many of the important factors involved in each ecosystem: computer architectures, software, applications, facilities, and sponsors. The first part of the book examines significant trends in HPC systems, including computer architectures, applications, performance, and software. It discusses the growth from terascale to petascale computing and the influence of the TOP500 and Green500 lists. The second part of the

  10. An Object-Oriented Serial DSMC Simulation Package

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure compiled by C++ language, are implemented in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.

  11. ATLAS Software Installation on Supercomputers

    Undrus, Alexander; The ATLAS collaboration

    2018-01-01

    PowerPC and high performance computers (HPC) are important resources for computing in the ATLAS experiment. The future LHC data processing will require more resources than Grid computing, currently using approximately 100,000 cores at well over 100 sites, can provide. Supercomputers are extremely powerful, as they join together the resources of hundreds of thousands of CPUs. However, their architectures have different instruction sets. ATLAS binary software distributions for x86 chipsets do not fit these architectures, as emulation of these chipsets results in huge performance loss. This presentation describes the methodology of ATLAS software installation from source code on supercomputers. The installation procedure includes downloading the ATLAS code base as well as the source of about 50 external packages, such as ROOT and Geant4, followed by compilation, and rigorous unit and integration testing. The presentation reports the application of this procedure at Titan HPC and Summit PowerPC at Oak Ridge Computin...

  12. High performance parallel computers for science

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1989-01-01

    This paper reports that Fermilab's Advanced Computer Program (ACP) has been developing cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 Mflops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter and intra crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction

  13. Functional High Performance Financial IT

    Berthold, Jost; Filinski, Andrzej; Henglein, Fritz

    2011-01-01

    The world of finance faces the computational performance challenge of massively expanding data volumes, extreme response time requirements, and compute-intensive complex (risk) analyses. Simultaneously, new international regulatory rules require considerably more transparency and external auditability of financial institutions, including their software systems. To top it off, increased product variety and customisation necessitates shorter software development cycles and higher development productivity. In this paper, we report on HIPERFIT, a recently established strategic research center at the University of Copenhagen that attacks this triple challenge of increased performance, transparency and productivity in the financial sector by a novel integration of financial mathematics, domain-specific language technology, parallel functional programming, and emerging massively parallel hardware.

  14. High-performance computing in seismology

    NONE

    1996-09-01

    The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

  15. High Performance Proactive Digital Forensics

    Alharbi, Soltan; Traore, Issa; Moa, Belaid; Weber-Jahnke, Jens

    2012-01-01

    With the increase in the number of digital crimes and in their sophistication, High Performance Computing (HPC) is becoming a must in Digital Forensics (DF). According to the FBI annual report, the size of data processed during the 2010 fiscal year reached 3,086 TB (compared to 2,334 TB in 2009), and the number of agencies that requested Regional Computer Forensics Laboratory assistance increased from 689 in 2009 to 722 in 2010. Since most investigation tools are both I/O and CPU bound, the next-generation DF tools are required to be distributed and offer HPC capabilities. The need for HPC is even more evident in investigating crimes on clouds or when proactive DF analysis and on-site investigation, requiring semi-real time processing, are performed. Although overcoming the performance challenge is a major goal in DF, as far as we know, there is almost no research on HPC-DF except for a few papers. As such, in this work, we extend our work on the need for a proactive system and present a high performance automated proactive digital forensic system. The most expensive phase of the system, namely proactive analysis and detection, uses a parallel extension of the iterative z algorithm. It also implements new parallel information-based outlier detection algorithms to proactively and forensically handle suspicious activities. To analyse a large number of targets and events and continuously do so (to capture the dynamics of the system), we rely on a multi-resolution approach to explore the digital forensic space. Data set from the Honeynet Forensic Challenge in 2001 is used to evaluate the system from DF and HPC perspectives.
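
    As a rough illustration of z-score-based screening, the sketch below iteratively flags events whose z-score, computed from the not-yet-flagged points, exceeds a threshold; this generic serial version is only a stand-in for the paper's parallel "iterative z" and information-based outlier detection algorithms, and the data are synthetic.

```python
import numpy as np

def iterative_z_outliers(x, threshold=3.0):
    """Iteratively flag points more than `threshold` standard deviations
    from the mean of the remaining (unflagged) points, until none are added."""
    x = np.asarray(x, dtype=float)
    flagged = np.zeros(x.size, dtype=bool)
    while True:
        kept = x[~flagged]
        z = np.abs(x - kept.mean()) / kept.std()
        new = (z > threshold) & ~flagged
        if not new.any():
            return flagged
        flagged |= new

events = np.concatenate([np.random.default_rng(2).normal(0, 1, 500), [9.0, 12.0]])
print("suspicious event indices:", np.where(iterative_z_outliers(events))[0])
```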

  16. High performance computing in linear control

    Datta, B.N.

    1993-01-01

    Remarkable progress has been made in both theory and applications of all important areas of control. The theory is rich and very sophisticated. Some beautiful applications of control theory are presently being made in aerospace, biomedical engineering, industrial engineering, robotics, economics, power systems, etc. Unfortunately, the same assessment of progress does not hold in general for computations in control theory. Control Theory is lagging behind other areas of science and engineering in this respect. Nowadays there is a revolution going on in the world of high performance scientific computing. Many powerful computers with vector and parallel processing have been built and have been available in recent years. These supercomputers offer very high speed in computations. Highly efficient software, based on powerful algorithms, has been developed to use on these advanced computers, and has also contributed to increased performance. While workers in many areas of science and engineering have taken great advantage of these hardware and software developments, control scientists and engineers, unfortunately, have not been able to take much advantage of these developments

  17. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Moon, Hongsik

    changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.

  18. Enabling High-Performance Computing as a Service

    AbdelBaky, Moustafa

    2012-10-01

    With the right software infrastructure, clouds can provide scientists with as-a-service access to high-performance computing resources. An award-winning prototype framework transforms the Blue Gene/P system into an elastic cloud to run a representative HPC application. © 2012 IEEE.

  19. Development of high performance cladding

    Kiuchi, Kiyoshi

    2003-01-01

    The development of superior next-generation light water reactors is called for on general grounds, such as improved safety and economics, reduced radioactive waste, and effective utilization of plutonium, by 2030, when conventional reactor plants are due for replacement. At the Japan Atomic Energy Research Institute, work is under way to improve stainless steel cladding for conventional high burn-up reactors to more than 100 GWd/t, to develop manufacturing technology for the reduced-moderation light water reactor (RMWR) with a breeding ratio beyond 1.0, and to study water-materials interactions in the supercritical-pressure water cooled reactor. Stable austenitic stainless steel has been selected for the fuel element cladding of the advanced boiling water reactor (ABWR). Austenitic stainless steel is superior in its anti-irradiation properties, corrosion resistance and mechanical strength. A hard neutron spectrum, with energies above 0.1 MeV, occurs in the core of the RMWR, as in the liquid metal fast breeder reactor (LMFBR). High performance cladding for RMWR fuel elements must therefore also provide anti-irradiation properties, corrosion resistance and mechanical strength. Slow strain rate tests (SSRT) of SUS 304 and SUS 316 are carried out to study stress corrosion cracking (SCC). Irradiation tests in an LMFBR are intended to provide data on irradiation damage to the cladding materials. (M. Suetake)

  20. High performance fuel technology development

    Koon, Yang Hyun; Kim, Keon Sik; Park, Jeong Yong; Yang, Yong Sik; In, Wang Kee; Kim, Hyung Kyu [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    • Development of High Plasticity and Annular Pellet: development of strong candidates for ultra-high burn-up fuel pellets as a PCI remedy; development of fabrication technology for annular fuel pellets.
    • Development of High Performance Cladding Materials: irradiation testing of HANA claddings in the Halden research reactor and evaluation of their in-pile performance; development of the final candidates for the next-generation cladding materials; development of manufacturing technology for the dual-cooled fuel cladding tubes.
    • Irradiated Fuel Performance Evaluation Technology Development: development of a performance analysis code system for the dual-cooled fuel; development of fuel performance-proving technology.
    • Feasibility Studies on Dual-Cooled Annular Fuel Core: analysis of the properties of a reactor core with dual-cooled fuel; feasibility evaluation of the dual-cooled fuel core.
    • Development of Design Technology for Dual-Cooled Fuel Structure: definition of technical issues and invention of concepts for the dual-cooled fuel structure; basic design and development of main structural components for dual-cooled fuel; basic design of a dual-cooled fuel rod.

  1. Packaging fluency

    Mocanu, Ana; Chrysochou, Polymeros; Bogomolova, Svetlana

    2011-01-01

    Research on packaging stresses the need for packaging design to read easily, presuming fast and accurate processing of product-related information. In this paper we define this property of packaging as “packaging fluency”. Based on the existing marketing and cognitive psychology literature on pac...

  2. Microelectronic packaging

    Datta, M; Schultze, J Walter

    2004-01-01

    Microelectronic Packaging analyzes the massive impact of electrochemical technologies on various levels of microelectronic packaging. Traditionally, interconnections within a chip were considered outside the realm of packaging technologies, but this book emphasizes the importance of chip wiring as a key aspect of microelectronic packaging, and focuses on electrochemical processing as an enabler of advanced chip metallization.Divided into five parts, the book begins by outlining the basics of electrochemical processing, defining the microelectronic packaging hierarchy, and emphasizing the impac

  3. APFELgrid: a high performance tool for parton density determinations

    Bertone, Valerio; Hartland, Nathan P.

    We present a new software package designed to reduce the computational burden of hadron collider measurements in Parton Distribution Function (PDF) fits. The APFELgrid package converts interpolated weight tables provided by APPLgrid files into a more efficient format for PDF fitting by the combination with PDF and $\alpha_s$ evolution factors provided by APFEL. This combination significantly reduces the number of operations required to perform the calculation of hadronic observables in PDF fits and simplifies the structure of the calculation into a readily optimised scalar product. We demonstrate that our technique can lead to a substantial speed improvement when compared to existing methods without any reduction in numerical accuracy.
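
    The end point of the conversion is that each observable bin becomes a plain contraction of a precomputed table with the PDFs at the fitting scale. The NumPy sketch below shows that hot loop with invented dimensions and random data; real tables carry the flavour, x-grid, and bin indices produced from APPLgrid files by APFELgrid.

```python
import numpy as np

n_bins, n_flav, n_x = 10, 13, 50
rng = np.random.default_rng(7)
fk_table = rng.random((n_bins, n_flav, n_x))  # weights precombined with evolution
pdf_grid = rng.random((n_flav, n_x))          # PDFs at the fitting scale

# One scalar product per observable bin: the inner loop of a PDF fit.
predictions = np.einsum('bfx,fx->b', fk_table, pdf_grid)
print(predictions.round(2))
```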

  4. Does HDR Pre-Processing Improve the Accuracy of 3D Models Obtained by Means of two Conventional SfM-MVS Software Packages? The Case of the Corral del Veleta Rock Glacier

    Álvaro Gómez-Gutiérrez

    2015-08-01

    The accuracy of different workflows using Structure-from-Motion and Multi-View-Stereo techniques (SfM-MVS) is tested. Twelve point clouds of the Corral del Veleta rock glacier, in Spain, were produced with two different software packages (123D Catch and Agisoft Photoscan), using Low Dynamic Range images and High Dynamic Range compositions (HDR) for three different years (2011, 2012 and 2014). The accuracy of the resulting point clouds was assessed using benchmark models acquired every year with a Terrestrial Laser Scanner. Three parameters were used to estimate the accuracy of each point cloud: the RMSE, the Cloud-to-Cloud distance (C2C) and the Multiscale-Model-to-Model comparison (M3C2). The M3C2 mean error ranged from 0.084 m (standard deviation of 0.403 m) to 1.451 m (standard deviation of 1.625 m). Agisoft Photoscan outperformed 123D Catch, producing more accurate and denser point clouds in 11 out of 12 cases; this is the first comparison between the two software packages available in the literature. No significant improvement was observed using HDR pre-processing. To our knowledge, this is the first time that the geometrical accuracy of 3D models obtained using LDR and HDR compositions has been compared. These findings may be of interest for researchers who wish to estimate geomorphic changes using SfM-MVS approaches.
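
    Of the three accuracy measures, the Cloud-to-Cloud (C2C) distance is the simplest to reproduce: for each evaluated point, take the distance to its nearest neighbour in the benchmark cloud. The sketch below does this with a k-d tree on random toy clouds; M3C2 additionally uses surface normals and local averaging, which this omits.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
reference = rng.random((5000, 3))   # e.g. the TLS benchmark cloud
evaluated = rng.random((4000, 3))   # e.g. the SfM-MVS point cloud

# Nearest-neighbour distance from every evaluated point to the reference cloud.
dist, _ = cKDTree(reference).query(evaluated)
print(f"C2C mean: {dist.mean():.4f}  RMSE: {np.sqrt((dist ** 2).mean()):.4f}")
```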

  5. High performance light water reactor

    Squarer, D.; Schulenberg, T.; Struwe, D.; Oka, Y.; Bittermann, D.; Aksan, N.; Maraczy, C.; Kyrki-Rajamaeki, R.; Souyri, A.; Dumaz, P.

    2003-01-01

    The objective of the high performance light water reactor (HPLWR) project is to assess the merit and economic feasibility of a high efficiency LWR operating in the thermodynamically supercritical regime. An efficiency of approximately 44% is expected. To accomplish this objective, a highly qualified team of European research institutes and industrial partners together with the University of Tokyo is assessing the major issues pertaining to a new reactor concept, under the co-sponsorship of the European Commission. The assessment has emphasized the recent advancement achieved in this area by Japan. Additionally, it accounts for advanced European reactor design requirements, recent improvements, practical design aspects, availability of plant components and the availability of high temperature materials. The final objective of this project is to reach a conclusion on the potential of the HPLWR to help sustain the nuclear option, by supplying competitively priced electricity, as well as to continue the nuclear competence in LWR technology. The following is a brief summary of the main project achievements:
    - A state-of-the-art review of supercritical water-cooled reactors has been performed for the HPLWR project.
    - Extensive studies have been performed in the last 10 years by the University of Tokyo. Therefore, a 'reference design', developed by the University of Tokyo, was selected in order to assess the available technological tools (i.e. computer codes, analyses, advanced materials, water chemistry, etc.). Design data and results of the analysis were supplied by the University of Tokyo. A benchmark problem, based on the 'reference design', was defined for neutronics calculations and several partners of the HPLWR project carried out independent analyses. The results of these analyses, which in addition helped to 'calibrate' the codes, have guided the assessment of the core and the design of an improved HPLWR fuel assembly. Preliminary selection was made for the HPLWR scale

  6. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone, windows application coded and "compiled" in Matlab (does not require a Matlab license). g-PRIME supports many data acquisition interfaces from the PC sound card to expensive high throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
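
    The threshold-and-window event detection at the core of such tools is simple to sketch. The example below finds upward threshold crossings in a synthetic trace and keeps only events whose peak amplitude falls inside a window; the sampling rate, thresholds, and spike positions are invented, and g-PRIME exposes these choices through its GUI along with many more measures.

```python
import numpy as np

fs = 10000.0                                   # sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
trace = np.random.default_rng(4).normal(0.0, 0.05, t.size)
trace[[1200, 4800, 7700]] = 1.0                # three artificial spikes

threshold, lo, hi = 0.5, 0.8, 1.2              # trigger level and amplitude window

# Indices where the signal crosses the trigger level upward.
crossings = np.flatnonzero((trace[:-1] < threshold) & (trace[1:] >= threshold)) + 1
events = [i for i in crossings if lo <= trace[i] <= hi]

print("event times (s):", t[events], "rate (Hz):", len(events) / t[-1])
```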

  7. NCI's Transdisciplinary High Performance Scientific Data Platform

    Evans, Ben; Antony, Joseph; Bastrakova, Irina; Car, Nicholas; Cox, Simon; Druken, Kelsey; Evans, Bradley; Fraser, Ryan; Ip, Alex; Kemp, Carina; King, Edward; Minchin, Stuart; Larraondo, Pablo; Pugh, Tim; Richards, Clare; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2016-04-01

    The Australian National Computational Infrastructure (NCI) manages Earth Systems data collections sourced from several domains and organisations onto a single High Performance Data (HPD) Node to further Australia's national priority research and innovation agenda. The NCI HPD Node has rapidly established its value, currently managing over 10 PBytes of datasets from collections that span a wide range of disciplines including climate, weather, environment, geoscience, geophysics, water resources and social sciences. Importantly, in order to facilitate broad user uptake, maximise reuse and enable transdisciplinary access through software and standardised interfaces, the datasets, associated information systems and processes have been incorporated into the design and operation of a unified platform that NCI has called the National Environmental Research Data Interoperability Platform (NERDIP). The key goal of the NERDIP is to regularise data access so that it is easily discoverable, interoperable for different domains and enabled for high performance methods. It adopts and implements international standards and data conventions, and promotes scientific integrity within a high performance computing and data analysis environment. NCI has established a rich and flexible computing environment for access to this data, through the NCI supercomputer; a private cloud that supports both domain focused virtual laboratories and in-common interactive analysis interfaces; as well as remotely through scalable data services. Data collections of this importance must be managed with careful consideration of both their current use and the needs of the end-communities, as well as their future potential use, such as transitioning to more advanced software and improved methods. It is therefore critical that the data platform is both well-managed and trusted for stable production use (including transparency and reproducibility), agile enough to incorporate new technological advances and

  8. Indoor Air Quality in High Performance Schools

    High performance schools are facilities that improve the learning environment while saving energy, resources, and money. The key is understanding the lifetime value of high performance schools and effectively managing priorities, time, and budget.

  9. MEMS packaging

    Hsu , Tai-Ran

    2004-01-01

    MEMS Packaging discusses the prevalent practices and enabling techniques in the assembly, packaging and testing of microelectromechanical systems (MEMS). The entire spectrum of assembly, packaging and testing of MEMS and microsystems is covered, from essential enabling technologies to applications in the key industries of life sciences, telecommunications and aerospace engineering. Other topics included are bonding and sealing of microcomponents, process flow of MEMS and microsystems packaging, automated microassembly, and testing and design for testing.

  10. Carpet Aids Learning in High Performance Schools

    Hurd, Frank

    2009-01-01

    The Healthy and High Performance Schools Act of 2002 has set specific federal guidelines for school design, and developed a federal/state partnership program to assist local districts in their school planning. According to the Collaborative for High Performance Schools (CHPS), high-performance schools are, among other things, healthy, comfortable,…

  11. Computer Center: Software Review.

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color-graphics capabilities. Describes the documentation, presentation, and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  12. Gammasphere software development

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, the group has established a new public ftp archive to distribute software, software development tools, and information.

  13. High Performance Computing - Power Application Programming Interface Specification.

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
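
    The proposed API itself targets C-level integration across the software stack; as a purely illustrative mock (none of the names below come from the actual Power API specification), a portable measurement interface might pair a per-component power sensor with an energy integrator:

      import random
      import threading
      import time

      class PowerCounter:
          """Hypothetical per-component power sensor; read_watts() stands in
          for a real hardware counter read."""
          def __init__(self, component):
              self.component = component

          def read_watts(self):
              return 100.0 + random.uniform(-5.0, 5.0)  # mock reading

      def energy_of(workload, counter, sample_period=0.05):
          """Estimate energy (joules) used while `workload` runs by sampling
          power at a fixed period and integrating (rectangle rule)."""
          samples, stop = [], threading.Event()

          def sampler():
              while not stop.is_set():
                  samples.append(counter.read_watts())
                  time.sleep(sample_period)

          thread = threading.Thread(target=sampler)
          start = time.perf_counter()
          thread.start()
          workload()
          stop.set()
          thread.join()
          elapsed = time.perf_counter() - start
          return sum(samples) / max(len(samples), 1) * elapsed

      counter = PowerCounter("cpu_socket_0")
      busy = lambda: sum(i * i for i in range(10 ** 6))
      print(f"~{energy_of(busy, counter):.1f} J")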

  14. High-Performance Tiled WMS and KML Web Server

    Plesea, Lucian

    2007-01-01

    This software is an Apache 2.0 module implementing a high-performance map server to support interactive map viewers and virtual planet client software. It can be used in applications that require access to very-high-resolution geolocated images, such as GIS, virtual planet applications, and flight simulators. It serves Web Map Service (WMS) requests that comply with a given request grid from an existing tile dataset. It also generates the KML super-overlay configuration files required to access the WMS image tiles.
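
    The core lookup such a tile server performs is mapping a WMS-style bounding-box request onto indices in a fixed tile grid. A sketch under assumed grid conventions (a global geodetic grid of 2**(level+1) x 2**level tiles; the real module's grid definition may differ):

      def tile_index(lon, lat, level):
          """Return (col, row) of the tile containing (lon, lat), where the
          world is 2**(level+1) x 2**level tiles at a given zoom level."""
          size = 360.0 / (2 ** (level + 1))  # tile width/height in degrees
          col = int((lon + 180.0) // size)
          row = int((90.0 - lat) // size)    # rows count down from north
          return col, row

      def tiles_for_bbox(min_lon, min_lat, max_lon, max_lat, level):
          """All tile indices intersecting the requested bounding box."""
          c0, r1 = tile_index(min_lon, min_lat, level)  # lower-left corner
          c1, r0 = tile_index(max_lon, max_lat, level)  # upper-right corner
          return [(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]

      # a 10-degree-square request at level 3 (tiles are 22.5 degrees wide)
      print(tiles_for_bbox(0, 40, 10, 50, 3))  # [(8, 1), (8, 2)]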

  15. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    Dr.A.R.Pon Periyasamy; Mrs A.Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products: performance should be excellent, with no defects. Software quality metrics are the subset of software metrics that focus on the quality aspects of the product, process, and project. A software defect prediction model helps in the early detection of defects and contributes to t...
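
    As a minimal illustration of the kind of classification such models perform, the sketch below trains a decision tree on synthetic per-module metrics; the metrics, the defect-labelling rule, and all names are invented for illustration:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import classification_report

      rng = np.random.default_rng(0)
      n = 500
      # invented per-module metrics: lines of code, complexity, recent churn
      X = np.column_stack([
          rng.integers(20, 2000, n),  # LOC
          rng.integers(1, 50, n),     # cyclomatic complexity
          rng.integers(0, 30, n),     # recent changes
      ])
      # synthetic rule: big, complex, frequently-changed modules are defective
      y = ((X[:, 0] > 800) & (X[:, 1] > 20) | (X[:, 2] > 25)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                random_state=0)
      clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
      print(classification_report(y_te, clf.predict(X_te)))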

  16. The fundamentals behind solving for unknown molecular structures using computer-assisted structure elucidation: a free software package at the undergraduate and graduate levels.

    Moser, Arvin; Pautler, Brent G

    2016-05-15

    The successful elucidation of an unknown compound's molecular structure often requires an analyst with profound knowledge and experience of advanced spectroscopic techniques, such as Nuclear Magnetic Resonance (NMR) spectroscopy and mass spectrometry. The implementation of Computer-Assisted Structure Elucidation (CASE) software in solving for unknown structures, such as isolated natural products and/or reaction impurities, can serve as both an elucidation and a teaching tool. As such, introducing CASE software with 112 exercises to train students, in conjunction with the traditional pen-and-paper approach, will strengthen their overall understanding of solving unknowns and let them explore various structural end points to determine the validity of the results quickly. Copyright © 2016 John Wiley & Sons, Ltd.

  17. The software developing method for multichannel computer-aided system for physical experiments control, realized by resources of national instruments LabVIEW instrumental package

    Gorskaya, E.A.; Samojlov, V.N.

    1999-01-01

    This work describes a method of developing computer-aided control systems in the LabVIEW integrated environment. Using object-oriented design of complex systems, a hypothetical model of methods for developing the software of a computer-aided system for physical experiments control was constructed. Within the framework of that model, architectural solutions and implementations of the suggested method are described. (author)

  18. Development of standardized approaches to reporting of minimal residual disease data using a reporting software package designed within the European LeukemiaNet

    Ostergaard, M; Nyvold, Charlotte Guldborg; Jovanovic, J V

    2011-01-01

    Quantitative PCR (qPCR) for detection of fusion transcripts and overexpressed genes is a promising tool for following minimal residual disease (MRD) in patients with hematological malignancies. Its widespread clinical use has to some extent been hampered by differences in data analysis and presentation that complicate multicenter clinical trials. To address these issues, we designed a highly flexible MRD-reporting software program, in which data from various qPCR platforms can be imported, processed, and presented in a uniform manner to generate intuitively understandable reports. The software was tested in a two-step quality control (QC) study; the first step involved eight centers, whose previous experience with the software ranged from none to extensive. The participants received cDNA from consecutive samples from a BCR-ABL+ chronic myeloid leukemia (CML) patient and an acute myeloid leukemia...

  19. Enabling high performance computational science through combinatorial algorithms

    Boman, Erik G; Bozdag, Doruk; Catalyurek, Umit V; Devine, Karen D; Gebremedhin, Assefaw H; Hovland, Paul D; Pothen, Alex; Strout, Michelle Mills

    2007-01-01

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation

  20. Enabling high performance computational science through combinatorial algorithms

    Boman, Erik G [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Bozdag, Doruk [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Catalyurek, Umit V [Biomedical Informatics, and Electrical and Computer Engineering, Ohio State University (United States); Devine, Karen D [Discrete Algorithms and Math Department, Sandia National Laboratories (United States); Gebremedhin, Assefaw H [Computer Science and Center for Computational Science, Old Dominion University (United States); Hovland, Paul D [Mathematics and Computer Science Division, Argonne National Laboratory (United States); Pothen, Alex [Computer Science and Center for Computational Science, Old Dominion University (United States); Strout, Michelle Mills [Computer Science, Colorado State University (United States)

    2007-07-15

    The Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute is developing algorithms and software for combinatorial problems that play an enabling role in scientific and engineering computations. Discrete algorithms will be increasingly critical for achieving high performance for irregular problems on petascale architectures. This paper describes recent contributions by researchers at the CSCAPES Institute in the areas of load balancing, parallel graph coloring, performance improvement, and parallel automatic differentiation.

  1. cgDNA: a software package for the prediction of sequence-dependent coarse-grain free energies of B-form DNA.

    Petkevičiūtė, D; Pasi, M; Gonzalez, O; Maddocks, J H

    2014-11-10

    cgDNA is a package for the prediction of sequence-dependent configuration-space free energies for B-form DNA at the coarse-grain level of rigid bases. For a fragment of any given length and sequence, cgDNA calculates the configuration of the associated free energy minimizer, i.e. the relative positions and orientations of each base, along with a stiffness matrix, which together govern differences in free energies. The model predicts non-local (i.e. beyond base-pair step) sequence dependence of the free energy minimizer. Configurations can be input or output in either the Curves+ definition of the usual helical DNA structural variables, or as a PDB file of coordinates of base atoms. We illustrate the cgDNA package by comparing predictions of free energy minimizers from (a) the cgDNA model, (b) time-averaged atomistic molecular dynamics (or MD) simulations, and (c) NMR or X-ray experimental observation, for (i) the Dickerson-Drew dodecamer and (ii) three oligomers containing A-tracts. The cgDNA predictions are rather close to those of the MD simulations, but many orders of magnitude faster to compute. Both the cgDNA and MD predictions are in reasonable agreement with the available experimental data. Our conclusion is that cgDNA can serve as a highly efficient tool for studying structural variations in B-form DNA over a wide range of sequences. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
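
    Concretely, a rigid-base model of this kind assigns a fragment a free-energy minimizer w_hat and a stiffness matrix K, so that free-energy differences near the minimum take the quadratic form E(w) = 0.5 * (w - w_hat)^T K (w - w_hat). A numpy sketch of evaluating that form (toy numbers; the function name is ours, not cgDNA's actual API):

      import numpy as np

      def free_energy_difference(w, w_hat, K):
          """0.5 * (w - w_hat)^T K (w - w_hat), relative to the minimizer."""
          d = np.asarray(w) - np.asarray(w_hat)
          return 0.5 * d @ K @ d

      # toy 3-coordinate example with a symmetric positive-definite stiffness
      K = np.array([[4.0, 1.0, 0.0],
                    [1.0, 3.0, 0.5],
                    [0.0, 0.5, 2.0]])
      w_hat = np.array([0.1, -0.2, 0.3])
      print(free_energy_difference(w_hat, w_hat, K))        # 0.0 at the minimizer
      print(free_energy_difference(w_hat + 0.1, w_hat, K))  # > 0 away from it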

  2. Software for Managing Personal Files.

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  3. High performance coated board inspection system based on commercial components

    Barjaktarovic, M; Radunovic, J

    2007-01-01

    This paper presents a vision system for defect (fault) detection on coated board, developed using three industrial FireWire cameras and a PC. The application for image processing and system control was realized with the LabVIEW software package. The defect-detection software is based on a variation of an image segmentation algorithm: the standard steps of image segmentation are modified to match the characteristics of the defects. Software optimization was accomplished using the SIMD (Single Instruction, Multiple Data) technology available in Intel Pentium 4 processors, which provided real-time inspection capability. The system provides benefits such as improvements in the production process, higher quality of the delivered coated board, and reduced waste. This was proven during more than a year of successful operation of the system.
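
    A minimal version of threshold-based segmentation for dark defects on a bright board, of the kind such inspection software builds on, can be sketched as follows (real systems add calibration, defect-specific criteria, and SIMD-level optimization; scipy's connected-component labelling is used here for brevity):

      import numpy as np
      from scipy import ndimage

      def find_defects(gray, dark_thresh=0.5, min_area=5):
          """Segment dark blobs on a bright coated-board image and return
          (area, centroid) for each blob larger than `min_area` pixels."""
          mask = gray < dark_thresh        # defects are darker than the board
          labels, n = ndimage.label(mask)  # connected-component labelling
          defects = []
          for i in range(1, n + 1):
              area = int((labels == i).sum())
              if area >= min_area:
                  defects.append((area, ndimage.center_of_mass(labels == i)))
          return defects

      # synthetic bright board with one dark square defect
      img = np.ones((100, 100))
      img[40:50, 60:70] = 0.1
      print(find_defects(img))  # one blob, area 100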

  4. High performance carbon nanocomposites for ultracapacitors

    Lu, Wen

    2012-10-02

    The present invention relates to composite electrodes for electrochemical devices, particularly to carbon nanotube composite electrodes for high performance electrochemical devices, such as ultracapacitors.

  5. Packaging microservices

    Montesi, Fabrizio; Thrane, Dan Sebastian

    2017-01-01

    We describe a first proposal for a new packaging system for microservices based on the Jolie programming language, called the Jolie Package Manager (JPM). Its main features revolve around service interfaces, which make explicit the functionalities that a service provides and depends on...

  6. Top scientific research center deploys Zambeel Aztera (TM) network storage system in high performance environment

    2002-01-01

    " The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory has implemented a Zambeel Aztera storage system and software to accelerate the productivity of scientists running high performance scientific simulations and computations" (1 page).

  7. Rapid Prototyping of High Performance Signal Processing Applications

    Sane, Nimish

    Advances in embedded systems for digital signal processing (DSP) are enabling many scientific projects and commercial applications. At the same time, these applications are key to driving advances in many important kinds of computing platforms. In this region of high performance DSP, rapid prototyping is critical for faster time-to-market (e.g., in the wireless communications industry) or time-to-science (e.g., in radio astronomy). DSP system architectures have evolved from being based on application specific integrated circuits (ASICs) to incorporate reconfigurable off-the-shelf field programmable gate arrays (FPGAs), the latest multiprocessors such as graphics processing units (GPUs), or heterogeneous combinations of such devices. We, thus, have a vast design space to explore based on performance trade-offs, and expanded by the multitude of possibilities for target platforms. In order to allow systematic design space exploration, and develop scalable and portable prototypes, model based design tools are increasingly used in design and implementation of embedded systems. These tools allow scalable high-level representations, model based semantics for analysis and optimization, and portable implementations that can be verified at higher levels of abstractions and targeted toward multiple platforms for implementation. The designer can experiment using such tools at an early stage in the design cycle, and employ the latest hardware at later stages. In this thesis, we have focused on dataflow-based approaches for rapid DSP system prototyping. This thesis contributes to various aspects of dataflow-based design flows and tools as follows: 1. We have introduced the concept of topological patterns, which exploits commonly found repetitive patterns in DSP algorithms to allow scalable, concise, and parameterizable representations of large scale dataflow graphs in high-level languages. We have shown how an underlying design tool can systematically exploit a high
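
    As a toy illustration of the topological-pattern idea, the sketch below generates an N-stage pipeline graph from a single parameter instead of an explicit node-by-node listing; the graph representation is ours, not that of any particular dataflow design tool:

      def chain_pattern(n_stages, actor_prefix="filter"):
          """Return (actors, edges) for an N-stage pipeline:
          src -> filter0 -> ... -> filterN-1 -> sink."""
          actors = (["src"]
                    + [f"{actor_prefix}{i}" for i in range(n_stages)]
                    + ["sink"])
          edges = [(actors[i], actors[i + 1]) for i in range(len(actors) - 1)]
          return actors, edges

      # the same concise specification scales from 3 stages to 300
      actors, edges = chain_pattern(3)
      print(edges)  # [('src', 'filter0'), ('filter0', 'filter1'), ...]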

  8. Behavioral Model of High Performance Camera for NIF Optics Inspection

    Hackel, B M

    2007-01-01

    The purpose of this project was to develop software that will model the behavior of the high performance Spectral Instruments 1000 series Charge-Coupled Device (CCD) camera located in the Final Optics Damage Inspection (FODI) system on the National Ignition Facility. NIF's target chamber will be mounted with 48 Final Optics Assemblies (FOAs) to convert the laser light from infrared to ultraviolet and focus it precisely on the target. Following a NIF shot, the optical components of each FOA must be carefully inspected for damage by the FODI to ensure proper laser performance during subsequent experiments. Rapid image capture and complex image processing (to locate damage sites) will reduce shot turnaround time, thus increasing the total number of experiments NIF can conduct during its 30-year lifetime. Development of these rapid processes necessitates extensive offline software automation -- especially after the device has been deployed in the facility. Without access to the unique real device or an exact behavioral model, offline software testing is difficult. Furthermore, a software-based behavioral model allows many instances to run concurrently, which lets multiple developers test their software at the same time. Thus it is beneficial to construct separate software that exactly mimics the behavior and response of the real SI-1000 camera.

  9. Delivering high performance BWR fuel reliably

    Schardt, J.F.

    1998-01-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  10. Selection of software for mechanical engineering undergraduates

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S.

    2016-01-01

    A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  11. Selection of software for mechanical engineering undergraduates

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S., E-mail: ablicblau@swin.edu.au [Swinburne University of Technology, Faculty of Science Engineering and Technology, PO Box 218 Hawthorn, Victoria, Australia, 3122 (Australia)

    2016-07-12

    A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  12. MVPACK: a package for the computer-aided design of multivariable control systems

    Mensah, S.

    1984-01-01

    The design and analysis of high-performance controllers for large complex plants require a collection of powerful, interactive computer software. MVPACK, an open-ended package for the computer-aided design of control systems, has been developed in the Reactor Control Branch of the Chalk River Nuclear Laboratories. The package is fully interactive, and includes a comprehensive state-of-the-art mathematical library to support development of complex multivariable control algorithms. Coded in RATFOR, MVPACK operates with a flexible data structure which makes efficient use of minicomputer resources and provides a standard framework for program generation. A help mechanism enhances the simplicity of package utilization. This report provides the technical description of the package. It reviews the specifications used in the design and implementation of the package. The database structure, the supporting libraries, and the design and analysis modules of MVPACK are described. The report includes several application examples to illustrate the capability of the package. Experience with MVPACK shows that the package provides a synergistic environment for control and regulation system design, and that it is a unique tool in the training of control system engineers.

  13. Software for microcircuit systems

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and of the new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software to assemble the microprogram, and also some amount of support software to test the microprograms and the microprogrammed circuit itself. 15 figures, 2 tables

  14. An approach to the analysis of SDSS spectroscopic outliers based on self-organizing maps. Designing the outlier analysis software package for the next Gaia survey

    Fustes, D.; Manteiga, M.; Dafonte, C.; Arcay, B.; Ulla, A.; Smith, K.; Borrachero, R.; Sordo, R.

    2013-11-01

    Aims: A new method for the segmentation and further analysis of the outliers resulting from the classification of astronomical objects in large databases is discussed. The method is being used in the framework of the Gaia satellite Data Processing and Analysis Consortium (DPAC) activities to prepare automated software tools that will be used to derive basic astrophysical information to be included in the final Gaia archive. Methods: Our algorithm has been tested by means of simulated Gaia spectrophotometry, which is based on SDSS observations and theoretical spectral libraries covering a wide sample of astronomical objects. Self-organizing map networks are used to organize the information in clusters of objects that are as homogeneous as possible according to their spectral energy distributions, and to project them onto a 2D grid where the data structure can be visualized. Results: We demonstrate the usefulness of the method by analyzing the spectra that were rejected by the SDSS spectroscopic classification pipeline and thus classified as "UNKNOWN". First, our method can help distinguish between astrophysical objects and instrumental artifacts. Additionally, the application of our algorithm to SDSS objects of unknown nature has allowed us to identify classes of objects with similar astrophysical natures. The method also allows for the potential discovery of hundreds of new objects, such as white dwarfs and quasars. The proposed method is therefore very promising for data exploration and knowledge discovery in very large astronomical databases, such as the archive from the upcoming Gaia mission.
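
    A compact version of the self-organizing-map training such a method relies on can be written in a few lines of numpy; the learning-rate and neighbourhood schedules below are illustrative, not those used in the Gaia/DPAC pipeline:

      import numpy as np

      def train_som(data, rows=8, cols=8, epochs=20, lr0=0.5, sigma0=3.0,
                    seed=0):
          rng = np.random.default_rng(seed)
          weights = rng.normal(size=(rows, cols, data.shape[1]))
          grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                      indexing="ij"), axis=-1)  # unit coords
          n_steps = epochs * len(data)
          for step in range(n_steps):
              x = data[rng.integers(len(data))]
              frac = step / n_steps  # decay schedules (illustrative)
              lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
              # best-matching unit: nearest weight vector to the sample
              bmu = np.unravel_index(
                  np.argmin(((weights - x) ** 2).sum(-1)), (rows, cols))
              # Gaussian neighbourhood around the BMU on the 2D grid
              d2 = ((grid - np.array(bmu)) ** 2).sum(-1)
              h = np.exp(-d2 / (2 * sigma ** 2))[..., None]
              weights += lr * h * (x - weights)  # pull units toward sample
          return weights

      # two well-separated clusters map to distinct regions of the grid
      data = np.vstack([np.random.randn(200, 3) + 5,
                        np.random.randn(200, 3) - 5])
      print(train_som(data).shape)  # (8, 8, 3)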

  15. High-Performance Java Codes for Computational Fluid Dynamics

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The computational science community is reluctant to write large-scale computationally -intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  16. The Use of Utility Accounting Software at Miami University.

    Wenner, Paul

    1999-01-01

    Describes how Miami University successfully developed an accounting software package that tracked and recorded their utility usage, including examples of its graphics and reporting components. Background information examining the decision to pursue an energy management software package is included. (GR)

  17. Development of a Nevada Statewide Database for Safety Analyst Software

    2017-02-02

    Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...

  18. High-performance ceramics. Fabrication, structure, properties

    Petzow, G.; Tobolski, J.; Telle, R.

    1996-01-01

    The program "Ceramic High-Performance Materials" pursued the objective of understanding the chain of cause and effect in the development of high-performance ceramics. This chain of problems begins with the chemical reactions for the production of powders; comprises the characterization, processing, shaping and compacting of powders, structural optimization, heat treatment, production and finishing; and leads to issues of materials testing and of design appropriate to the material. The program has resulted in contributions to the understanding of fundamental interrelationships in materials science, which are summarized in the present volume, broken down into eight special aspects. (orig./RHM)

  19. High Performance Grinding and Advanced Cutting Tools

    Jackson, Mark J

    2013-01-01

    High Performance Grinding and Advanced Cutting Tools discusses the fundamentals and advances in high performance grinding processes, and provides a complete overview of newly-developing areas in the field. Topics covered are grinding tool formulation and structure, grinding wheel design and conditioning and applications using high performance grinding wheels. Also included are heat treatment strategies for grinding tools, using grinding tools for high speed applications, laser-based and diamond dressing techniques, high-efficiency deep grinding, VIPER grinding, and new grinding wheels.

  20. Strategy Guideline: High Performance Residential Lighting

    Holton, J.

    2012-02-01

    The Strategy Guideline: High Performance Residential Lighting has been developed to provide a tool for the understanding and application of high performance lighting in the home. The high performance lighting strategies featured in this guide are drawn from recent advances in commercial lighting for application to typical spaces found in residential buildings. This guide offers strategies to greatly reduce lighting energy use through the application of high quality fluorescent and light emitting diode (LED) technologies. It is important to note that these strategies not only save energy in the home but also serve to satisfy the homeowner's expectations for high quality lighting.

  1. High-performance liquid chromatography coupled with tandem mass spectrometry technology in the analysis of Chinese Medicine Formulas: A bibliometric analysis (1997-2015).

    He, Xi-Ran; Li, Chun-Guang; Zhu, Xiao-Shu; Li, Yuan-Qing; Jarouche, Mariam; Bensoussan, Alan; Li, Ping-Ping

    2017-01-01

    There is a recognized challenge in analyzing traditional Chinese medicine formulas because of their complex chemical compositions. The application of modern analytical techniques such as high-performance liquid chromatography coupled with tandem mass spectrometry has significantly improved the characterization of compounds from traditional Chinese medicine formulas. This study conducts a bibliometric analysis to establish the overall trend of high-performance liquid chromatography coupled with tandem mass spectrometry approaches in the analysis of traditional Chinese medicine formulas, its significance, and possible underlying interactions between individual herbs in these formulas. Electronic databases were searched systematically, and the identified studies were collected and analyzed using Microsoft Access 2010, GraphPad 5.0 and the Ucinet software package. 338 publications between 1997 and 2015 were identified and analyzed in terms of annual growth and accumulated publications, top journals, forms of traditional Chinese medicine preparations, highly studied formulas and single herbs, as well as social network analysis of single herbs. There is a significantly increasing trend in the use of high-performance liquid chromatography coupled with tandem mass spectrometry techniques for the analysis of commonly used forms of traditional Chinese medicine formulas over the last 3 years. Stringent quality control is of great significance for the modernization and globalization of traditional Chinese medicine, and this bibliometric analysis provides the first comprehensive summary of this field. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Calculation Software versus Illustration Software for Teaching Statistics

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software-based papers...

  3. Case Study of Using High Performance Commercial Processors in Space

    Ferguson, Roscoe C.; Olivas, Zulema

    2009-01-01

    The purpose of the Space Shuttle Cockpit Avionics Upgrade project (1999-2004) was to reduce crew workload and improve situational awareness. The upgrade was to augment the Shuttle avionics system with new hardware and software. A major success of this project was the validation of the hardware architecture and software design. This was significant because the project incorporated new technology and approaches for the development of human-rated space software. An early version of this system was tested at the Johnson Space Center for one month by teams of astronauts. The results were positive, but NASA eventually cancelled the project towards the end of the development cycle. The goal of reducing crew workload and improving situational awareness resulted in the need for high performance Central Processing Units (CPUs). The CPU family selected was the PowerPC, a reduced instruction set computer (RISC) line known for its high performance. However, the requirement for radiation tolerance forced a re-evaluation of the selected member of the PowerPC line. Radiation testing revealed that the originally selected processor (PowerPC 7400) was too soft to meet mission objectives, and an effort was established to perform trade studies and performance testing to determine a feasible candidate. At that time, the PowerPC RAD750 was radiation tolerant but did not meet the performance needs of the project. Thus, the final solution was to select the PowerPC 7455. This processor did not have a radiation-tolerant version, but it had some ability to detect failures. Its cache tags, however, did not provide parity, so the project incorporated a software strategy to detect radiation-induced failures: dual paths for software generating commands to the legacy Space Shuttle avionics, to prevent failures due to the softness of the upgraded avionics.

  4. High performance liquid chromatographic determination of ...

    STORAGESEVER

    2010-02-08

    ... high performance liquid chromatography (HPLC) grade ... applications. These are important requirements if the reagent is to be applicable to on-line pre- or post-column derivatisation in a possible automation of the analytical ...

  5. Analog circuit design designing high performance amplifiers

    Feucht, Dennis

    2010-01-01

    The third volume Designing High Performance Amplifiers applies the concepts from the first two volumes. It is an advanced treatment of amplifier design/analysis emphasizing both wideband and precision amplification.

  6. Strategies and Experiences Using High Performance Fortran

    Shires, Dale

    2001-01-01

    ... High Performance Fortran (HPF) is a relatively new addition to the Fortran dialect. It is an attempt to provide an efficient, high-level Fortran parallel programming language for the latest generation of ... been debatable ...

  7. High-performance computing using FPGAs

    Benkrid, Khaled

    2013-01-01

    This book is concerned with the emerging field of High Performance Reconfigurable Computing (HPRC), which aims to harness the high performance and relatively low power of reconfigurable hardware, in the form of Field Programmable Gate Arrays (FPGAs), in High Performance Computing (HPC) applications. It presents the latest developments in this field from the applications, architecture, and tools and methodologies points of view. We hope that this work will form a reference for existing researchers in the field, and entice new researchers and developers to join the HPRC community. The book includes: Thirteen application chapters which present the most important application areas tackled by high performance reconfigurable computers, namely: financial computing, bioinformatics and computational biology, data search and processing, stencil computation (e.g. computational fluid dynamics and seismic modeling), cryptanalysis, astronomical N-body simulation, and circuit simulation. Seven architecture chapters which...

  8. Embedded High Performance Scalable Computing Systems

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  9. Gradient High Performance Liquid Chromatography Method ...

    Purpose: To develop a gradient high performance liquid chromatography (HPLC) method for the simultaneous determination of phenylephrine (PHE) and ibuprofen (IBU) in solid ... nimesulide, phenylephrine hydrochloride, chlorpheniramine maleate and caffeine anhydrous in pharmaceutical dosage form. Acta Pol.

  10. The portability of the "Electronics Workbench" simulation software to China

    Collis, Betty; Zhi-Cheng, Dong

    1993-01-01

    This article discusses the portability of the Canadian-made simulation software package "Electronics Workbench" (EWB) to China. As part of a larger project investigating the portability of various educational software packages, the EWB package was used in electronics instruction in China and

  11. Hadoop-GIS: A High Performance Spatial Data Warehousing System over MapReduce.

    Aji, Ablimit; Wang, Fusheng; Vo, Hoang; Lee, Rubao; Liu, Qiaoling; Zhang, Xiaodong; Saltz, Joel

    2013-08-01

    Support of high performance queries on large volumes of spatial data becomes increasingly important in many application domains, including geospatial problems in numerous fields, location based services, and emerging scientific applications that are increasingly data- and compute-intensive. The emergence of massive scale spatial data is due to the proliferation of cost effective and ubiquitous positioning technologies, development of high resolution imaging technologies, and contribution from a large number of community users. There are two major challenges for managing and querying massive spatial data to support spatial queries: the explosion of spatial data, and the high computational complexity of spatial queries. In this paper, we present Hadoop-GIS - a scalable and high performance spatial data warehousing system for running large scale spatial queries on Hadoop. Hadoop-GIS supports multiple types of spatial queries on MapReduce through spatial partitioning, the customizable spatial query engine RESQUE, implicit parallel spatial query execution on MapReduce, and effective methods for amending query results through handling boundary objects. Hadoop-GIS utilizes global partition indexing and customizable on-demand local spatial indexing to achieve efficient query processing. Hadoop-GIS is integrated into Hive to support declarative spatial queries with an integrated architecture. Our experiments have demonstrated the high efficiency of Hadoop-GIS on query response and high scalability to run on commodity clusters. Our comparative experiments have shown that the performance of Hadoop-GIS is on par with parallel SDBMS and outperforms SDBMS for compute-intensive queries. Hadoop-GIS is available as a set of libraries for processing spatial queries, and as an integrated software package in Hive.
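
    The central idea, grid-based spatial partitioning with replication of boundary objects, can be sketched compactly; the tile size and record format below are assumptions, and the real system pairs this with spatial indexing and MapReduce execution:

      def partition(objects, tile=10.0):
          """Assign each bounding box (id, xmin, ymin, xmax, ymax) to every
          grid tile it intersects; objects spanning tiles are replicated, and
          the query layer later de-duplicates them (the boundary-object
          fix-up described above)."""
          tiles = {}
          for oid, xmin, ymin, xmax, ymax in objects:
              for tx in range(int(xmin // tile), int(xmax // tile) + 1):
                  for ty in range(int(ymin // tile), int(ymax // tile) + 1):
                      tiles.setdefault((tx, ty), []).append(oid)
          return tiles

      objs = [
          ("a", 1, 1, 4, 4),    # fully inside tile (0, 0)
          ("b", 8, 8, 12, 12),  # straddles four tiles -> replicated 4x
      ]
      for key, members in sorted(partition(objs).items()):
          print(key, members)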

  12. High performance computing in Windows Azure cloud

    Ambruš, Dejan

    2013-01-01

    High performance, security, availability, scalability, flexibility and lower costs of maintenance have essentially contributed to the growing popularity of cloud computing in all spheres of life, especially in business. In fact cloud computing offers even more than this. With usage of virtual computing clusters a runtime environment for high performance computing can be efficiently implemented also in a cloud. There are many advantages but also some disadvantages of cloud computing, some ...

  13. Carbon nanomaterials for high-performance supercapacitors

    Tao Chen; Liming Dai

    2013-01-01

    Owing to their high energy density and power density, supercapacitors exhibit great potential as high-performance energy sources for advanced technologies. Recently, carbon nanomaterials (especially, carbon nanotubes and graphene) have been widely investigated as effective electrodes in supercapacitors due to their high specific surface area, excellent electrical and mechanical properties. This article summarizes the recent progresses on the development of high-performance supercapacitors bas...

  14. Delivering high performance BWR fuel reliably

    Schardt, J.F. [GE Nuclear Energy, Wilmington, NC (United States)

    1998-07-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel, which can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  15. HPTA: High-Performance Text Analytics

    Vandierendonck, Hans; Murphy, Karen; Arif, Mahwish; Nikolopoulos, Dimitrios S.

    2017-01-01

    One of the main targets of data analytics is unstructured data, which primarily involves textual data. High-performance processing of textual data is non-trivial. We present the HPTA library for high-performance text analytics. The library helps programmers to map textual data to a dense numeric representation, which can be handled more efficiently. HPTA encapsulates three performance optimizations: (i) efficient memory management for textual data, (ii) parallel computation on associative dat...
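
    The first step HPTA addresses, mapping text to a numeric representation, can be illustrated with a plain vocabulary index and term-frequency matrix (HPTA's actual data structures and memory optimizations are not reproduced here):

      import numpy as np
      from collections import Counter

      def vectorize(docs):
          """Build a word -> column index and a term-frequency matrix."""
          vocab = {}
          counts = [Counter(d.lower().split()) for d in docs]
          for c in counts:
              for w in c:
                  vocab.setdefault(w, len(vocab))
          X = np.zeros((len(docs), len(vocab)))
          for i, c in enumerate(counts):
              for w, n in c.items():
                  X[i, vocab[w]] = n  # term frequency
          return X, vocab

      X, vocab = vectorize(["the cat sat", "the dog sat down"])
      print(X.shape, sorted(vocab)[:3])  # (2, 5) ['cat', 'dog', 'down']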

  16. Near-fault earthquake ground motion prediction by a high-performance spectral element numerical code

    Paolucci, Roberto; Stupazzini, Marco

    2008-01-01

    Near-fault effects have been widely recognised to produce specific features of earthquake ground motion that cannot be reliably predicted by 1D seismic wave propagation modelling, the standard in engineering applications. These features may have a relevant impact on structural response, especially in the nonlinear range, which is hard to predict and to cast in a design format owing to the scarcity of significant earthquake records and of reliable numerical simulations. In this contribution a pilot study is presented for the evaluation of seismic ground motions in the near-fault region, based on a high-performance numerical code for 3D seismic wave propagation analyses, including the seismic fault, the wave propagation path and the near-surface geological or topographical irregularity. For this purpose, the software package GeoELSE is adopted, based on the spectral element method. The numerical benchmark of 3D ground motion simulation in the valley of Grenoble (French Alps) is chosen to study the effect of the complex interaction between basin geometry and radiation mechanism on the variability of earthquake ground motion.

  17. High-performance scientific computing in the cloud

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  18. High-performance commercial building systems

    Selkowitz, Stephen

    2003-10-01

    This report summarizes key technical accomplishments resulting from the three year PIER-funded R&D program, "High Performance Commercial Building Systems" (HPCBS). The program targets the commercial building sector in California, an end-use sector that accounts for about one-third of all California electricity consumption and an even larger fraction of peak demand, at a cost of over $10B/year. Commercial buildings also have a major impact on occupant health, comfort and productivity. Building design and operations practices that influence energy use are deeply engrained in a fragmented, risk-averse industry that is slow to change. Although California's aggressive standards efforts have resulted in new buildings designed to use less energy than those constructed 20 years ago, the actual savings realized are still well below technical and economic potentials. The broad goal of this program is to develop and deploy a set of energy-saving technologies, strategies, and techniques, and improve processes for designing, commissioning, and operating commercial buildings, while improving health, comfort, and performance of occupants, all in a manner consistent with sound economic investment practices. Results are to be broadly applicable to the commercial sector for different building sizes and types, e.g. offices and schools, for different classes of ownership, both public and private, and for owner-occupied as well as speculative buildings. The program aims to facilitate significant electricity use savings in the California commercial sector by 2015, while assuring that these savings are affordable and promote high quality indoor environments. The five linked technical program elements contain 14 projects with 41 distinct R&D tasks. Collectively they form a comprehensive Research, Development, and Demonstration (RD&D) program with the potential to capture large savings in the commercial building sector, providing significant economic benefits to

  19. High performance distributed objects in large hadron collider experiments

    Gutleber, J.

    1999-11-01

    This dissertation demonstrates how object-oriented technology can support the development of software that has to meet the requirements of high performance distributed data acquisition systems. The environment for this work is a system being planned for the Compact Muon Solenoid experiment at CERN, which is to start operation in 2005. The long operational phase of the experiment, together with a tight and puzzling interaction with custom devices, makes the quest for an evolvable architecture that exhibits a high level of abstraction the driving issue. The question arises whether an existing approach already fits our needs. The presented work casts light on these problems and as a result comprises the following novel contributions: - Application of object technology at the hardware/software boundary. Software components at this level must be characterised by high efficiency and extensibility at the same time. - Identification of limitations when deploying commercial-off-the-shelf middleware for distributed object-oriented computing. - Capturing of software component properties in an efficiency model for ease of comparison and improvement. - Proof of feasibility that the encountered deficiencies in middleware can be avoided and that, with the use of software components, the imposed requirements can be met. - Design and implementation of an on-line software control system that takes ever-evolving requirements into account by avoiding hardwired policies. We conclude that state-of-the-art middleware cannot meet the required efficiency of the planned data acquisition system. Although new tool generations already provide a certain degree of configurability, the obligation to follow standards specifications does not allow the necessary optimisations. We identified the major limiting factors and argue that a custom solution following a component model with narrow interfaces can satisfy our requirements. This approach has been adopted for the current design.

  20. High Performance Building Facade Solutions - PIER Final Project Report

    Lee, Eleanor; Selkowitz, Stephen

    2009-12-31

    Building facades directly influence heating and cooling loads and indirectly influence lighting loads when daylighting is considered, and are therefore a major determinant of annual energy use and peak electric demand. Facades also significantly influence occupant comfort and satisfaction, making the design optimization challenge more complex than many other building systems. This work focused on addressing significant near-term opportunities to reduce energy use in California commercial building stock by a) targeting voluntary, design-based opportunities derived from the use of better design guidelines and tools, and b) developing and deploying more efficient glazings, shading systems, daylighting systems, facade systems and integrated controls. This two-year project, supported by the California Energy Commission PIER program and the US Department of Energy, initiated a collaborative effort between The Lawrence Berkeley National Laboratory (LBNL) and major stakeholders in the facades industry to develop, evaluate, and accelerate market deployment of emerging, high-performance, integrated facade solutions. The LBNL Windows Testbed Facility acted as the primary catalyst and mediator on both sides of the building industry supply-user business transaction by a) aiding component suppliers to create and optimize cost effective, integrated systems that work, and b) demonstrating and verifying to the owner, designer, and specifier community that these integrated systems reliably deliver required energy performance. An industry consortium was initiated amongst approximately seventy disparate stakeholders who, unlike the HVAC or lighting industries, have no single representative, multi-disciplinary body or organized means of communicating and collaborating. The consortium provided guidance on the project and, more importantly, began to mutually work out and agree on the goals, criteria, and pathways needed to attain the ambitious net zero energy goals defined by California and

  1. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially

  2. Achieving numerical accuracy and high performance using recursive tile LU factorization with partial pivoting

    Dongarra, Jack

    2013-09-18

    The LU factorization is an important numerical algorithm for solving systems of linear equations in science and engineering and is a characteristic of many dense linear algebra computations. For example, it has become the de facto numerical algorithm implemented within the LINPACK benchmark to rank the most powerful supercomputers in the world, collected by the TOP500 website. Multicore processors continue to present challenges to the development of fast and robust numerical software due to the increasing levels of hardware parallelism and widening gap between core and memory speeds. In this context, the difficulty in developing new algorithms for the scientific community resides in the combination of two goals: achieving high performance while maintaining the accuracy of the numerical algorithm. This paper proposes a new approach for computing the LU factorization in parallel on multicore architectures, which not only improves the overall performance but also sustains the numerical quality of the standard LU factorization algorithm with partial pivoting. While the update of the trailing submatrix is computationally intensive and highly parallel, the inherently problematic portion of the LU factorization is the panel factorization due to its memory-bound characteristic as well as the atomicity of selecting the appropriate pivots. Our approach uses a parallel fine-grained recursive formulation of the panel factorization step and implements the update of the trailing submatrix with the tile algorithm. Based on conflict-free partitioning of the data and lockless synchronization mechanisms, our implementation lets the overall computation flow naturally without contention. The dynamic runtime system called QUARK is then able to schedule tasks with heterogeneous granularities and to transparently introduce algorithmic lookahead. The performance results of our implementation are competitive compared to the currently available software packages and libraries. For example
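
    The structure of the algorithm, a memory-bound panel factorization followed by a compute-intensive trailing-submatrix update, can be illustrated with a simplified right-looking blocked LU in numpy; the paper's recursive panel formulation and QUARK task scheduling are omitted here:

      import numpy as np

      def blocked_lu(A, nb=32):
          """Blocked LU with partial pivoting; returns the packed LU factors
          and the row permutation such that L @ U == A[piv]."""
          A = A.copy()
          n = A.shape[0]
          piv = np.arange(n)
          for k in range(0, n, nb):
              end = min(k + nb, n)
              for j in range(k, end):  # panel factorization (memory-bound)
                  p = j + np.argmax(np.abs(A[j:, j]))  # partial pivot row
                  if p != j:
                      A[[j, p]] = A[[p, j]]  # swap full rows, LAPACK-style
                      piv[[j, p]] = piv[[p, j]]
                  A[j + 1:, j] /= A[j, j]
                  A[j + 1:, j + 1:end] -= np.outer(A[j + 1:, j],
                                                   A[j, j + 1:end])
              if end < n:  # trailing update (compute-bound, highly parallel)
                  L = np.tril(A[k:end, k:end], -1) + np.eye(end - k)
                  A[k:end, end:] = np.linalg.solve(L, A[k:end, end:])
                  A[end:, end:] -= A[end:, k:end] @ A[k:end, end:]
          return A, piv

      rng = np.random.default_rng(1)
      M = rng.normal(size=(128, 128))
      LU, piv = blocked_lu(M)
      L = np.tril(LU, -1) + np.eye(128)
      U = np.triu(LU)
      print(np.allclose(L @ U, M[piv]))  # True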

  3. Architectural and compiler techniques for energy reduction in high-performance microprocessors

    Bellas, Nikolaos

    1999-11-01

    The microprocessor industry has started viewing power, along with area and performance, as a decisive design factor in today's microprocessors. The increasing cost of packaging and cooling systems poses stringent requirements on the maximum allowable power dissipation. Most of the research in recent years has focused on the circuit, gate, and register-transfer (RT) levels of the design. In this research, we focus on the software running on a microprocessor and we view the program as a power consumer. Our work concentrates on the role of the compiler in the construction of "power-efficient" code, and especially its interaction with the hardware so that unnecessary processor activity is saved. We propose techniques that use extra hardware features and compiler-driven code transformations that specifically target activity reduction in certain parts of the CPU which are known to be large power and energy consumers. Design for low power/energy at this level of abstraction entails larger energy gains than in the lower stages of the design hierarchy in which the design team has already made the most important design commitments. The role of the compiler in generating code which exploits the processor organization is also fundamental in energy minimization. Hence, we propose a hardware/software co-design paradigm, and we show what code transformations are necessary by the compiler so that "wasted" power in a modern microprocessor can be trimmed. More specifically, we propose a technique that uses an additional mini cache located between the instruction cache (I-Cache) and the CPU core; the mini cache buffers instructions that are nested within loops and are continuously fetched from the I-Cache. This mechanism can create very substantial energy savings, since the I-Cache unit is one of the main power consumers in most of today's high-performance microprocessors. Results are reported for the SPEC95 benchmarks in the R-4400 processor which implements the MIPS2 instruction
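
    A toy simulation conveys the mini-cache idea: a small buffer in front of the I-Cache captures a loop body on its first iteration, so subsequent iterations never touch the I-Cache. The buffer size, replacement policy, and instruction stream below are invented for illustration:

      def icache_fetches(trace, buffer_size=16):
          """Count I-Cache fetches for an address trace when a small loop
          buffer sits between the CPU and the I-Cache."""
          buffer, fetches = set(), 0
          for pc in trace:
              if pc in buffer:
                  continue  # served by the mini cache: no I-Cache access
              fetches += 1  # must go to the I-Cache
              if len(buffer) >= buffer_size:
                  buffer.clear()  # simplistic replacement: reset on overflow
              buffer.add(pc)
          return fetches

      # an 8-instruction loop body executed 1000 times
      trace = [0x100 + 4 * i for i in range(8)] * 1000
      print(icache_fetches(trace), "of", len(trace))  # 8 of 8000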

  4. Achieving numerical accuracy and high performance using recursive tile LU factorization with partial pivoting

    Dongarra, Jack; Faverge, Mathieu; Ltaief, Hatem; Luszczek, Piotr R.

    2013-01-01

    The LU factorization is an important numerical algorithm for solving systems of linear equations in science and engineering and is a characteristic of many dense linear algebra computations. For example, it has become the de facto numerical algorithm implemented within the LINPACK benchmark to rank the most powerful supercomputers in the world, collected by the TOP500 website. Multicore processors continue to present challenges to the development of fast and robust numerical software due to the increasing levels of hardware parallelism and widening gap between core and memory speeds. In this context, the difficulty in developing new algorithms for the scientific community resides in the combination of two goals: achieving high performance while maintaining the accuracy of the numerical algorithm. This paper proposes a new approach for computing the LU factorization in parallel on multicore architectures, which not only improves the overall performance but also sustains the numerical quality of the standard LU factorization algorithm with partial pivoting. While the update of the trailing submatrix is computationally intensive and highly parallel, the inherently problematic portion of the LU factorization is the panel factorization due to its memory-bound characteristic as well as the atomicity of selecting the appropriate pivots. Our approach uses a parallel fine-grained recursive formulation of the panel factorization step and implements the update of the trailing submatrix with the tile algorithm. Based on conflict-free partitioning of the data and lockless synchronization mechanisms, our implementation lets the overall computation flow naturally without contention. The dynamic runtime system called QUARK is then able to schedule tasks with heterogeneous granularities and to transparently introduce algorithmic lookahead. The performance results of our implementation are competitive compared to the currently available software packages and libraries. For example
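    The recursive panel formulation can be sketched compactly. The following Python/NumPy code, a minimal serial illustration rather than the tiled, QUARK-scheduled implementation the abstract describes, factors a tall panel by halving its columns, replaying the pivot swaps across the full panel width, and finishing each level with a triangular solve and a GEMM-style trailing update.

```python
import numpy as np
from scipy.linalg import solve_triangular

def _apply_swaps(B, piv):
    """Replay recorded row swaps: piv[i] = j means rows i and j swap."""
    for i, j in enumerate(piv):
        if j != i:
            B[[i, j]] = B[[j, i]]

def rec_lu(A, piv):
    """In-place recursive LU with partial pivoting on a tall panel A."""
    m, n = A.shape
    if n == 1:                                  # base case: one column
        j = int(np.argmax(np.abs(A[:, 0])))
        piv[0] = j
        if j != 0:
            A[[0, j]] = A[[j, 0]]
        if A[0, 0] != 0.0:
            A[1:, 0] /= A[0, 0]                 # scale the multipliers
        return
    k = n // 2
    rec_lu(A[:, :k], piv[:k])                   # factor the left half
    _apply_swaps(A[:, k:], piv[:k])             # pivots span the whole panel
    A[:k, k:] = solve_triangular(A[:k, :k], A[:k, k:],
                                 lower=True, unit_diagonal=True)
    A[k:, k:] -= A[k:, :k] @ A[:k, k:]          # trailing update (GEMM)
    rec_lu(A[k:, k:], piv[k:])                  # recurse on the right half
    _apply_swaps(A[k:, :k], piv[k:])            # swaps also touch the L block
    piv[k:] += k                                # local -> panel row indices

A = np.random.rand(8, 4)
LU, piv = A.copy(), np.zeros(4, dtype=np.intp)
rec_lu(LU, piv)
L = np.tril(LU, -1) + np.eye(8, 4)
U = np.triu(LU[:4, :])
PA = A.copy(); _apply_swaps(PA, piv)
assert np.allclose(PA, L @ U)                   # P A = L U holds
```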

  5. Towards High Performance Processing In Modern Java Based Control Systems

    Misiowiec, M; Buttner, M

    2011-01-01

    CERN controls software is often developed on a Java foundation. Some systems carry out a combination of data-, network- and processor-intensive tasks within strict time limits; hence, there is a demand for high-performing, quasi-real-time solutions. Extensive prototyping of the new CERN monitoring and alarm software required us to address such expectations. The system must handle tens of thousands of data samples every second, across its three tiers, applying complex computations throughout. Accomplishing this goal required a deep understanding of multithreading, memory management and interprocess communication. Unexpected traps hide behind excessive use of 64-bit memory, and modern garbage collectors can severely impact the processing flow. Tuning the JVM configuration significantly affects the execution of the code; even more important are the number of threads and the data structures shared between them. Accurately dividing work into independent tasks can boost system performance, as the sketch below illustrates. Thorough profili...
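    The point about dividing work into independent tasks is language-agnostic; as a rough illustration (in Python rather than Java, with invented sample data, and not the CERN system's code), the sketch below splits a stream of samples into independent batches handled by a worker pool.

```python
from concurrent.futures import ThreadPoolExecutor

def process_batch(batch):
    """Stand-in for per-sample computation in a monitoring tier."""
    return sum(x * x for x in batch)

samples = list(range(100_000))
batches = [samples[i:i + 10_000] for i in range(0, len(samples), 10_000)]

# Independent batches share no state, so they can run concurrently.
# (For CPU-bound work in CPython, a ProcessPoolExecutor would be used.)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_batch, batches))
print(sum(results))
```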

  6. Power/energy use cases for high performance computing

    Laros, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Hammond, Steven [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Elmore, Ryan [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Munch, Kristin [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2013-12-01

    Power and energy have been identified as a first-order challenge for future extreme-scale high performance computing (HPC) systems. In practice the breakthroughs will need to be provided by the hardware vendors, but making the best use of those solutions in an HPC environment will likely require periodic tuning by facility operators and software components. This document describes the actions and interactions needed to maximize power resources. It strives to cover the entire operational space that an HPC system occupies. The descriptions are presented as formal use cases, as documented in the Unified Modeling Language Specification [1]. The document is intended to provide a common understanding to the HPC community of the necessary management and control capabilities. Assuming a common understanding can be achieved, the next step will be to develop a set of Application Programming Interfaces (APIs) that hardware vendors and software developers could use to steer power consumption.
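    To give the use-case flavor some concreteness, here is a deliberately hypothetical sketch of the kind of power-steering API the document anticipates; none of these names or numbers come from an actual vendor interface.

```python
# Hypothetical power-steering interface; every name and value is invented.
class NodePower:
    def __init__(self, cap_watts):
        self.cap_watts = cap_watts
    def read_power(self):             # would query hardware counters
        return 310.0
    def set_cap(self, watts):         # would program a vendor power cap
        self.cap_watts = watts

def rebalance(nodes, facility_budget_watts):
    """Toy facility policy: split a site-level budget evenly across nodes."""
    share = facility_budget_watts / len(nodes)
    for node in nodes:
        node.set_cap(share)

nodes = [NodePower(350.0) for _ in range(4)]
rebalance(nodes, 1200.0)
print([n.cap_watts for n in nodes])   # [300.0, 300.0, 300.0, 300.0]
```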

  7. Explicet: graphical user interface software for metadata-driven management, analysis and visualization of microbiome data.

    Robertson, Charles E; Harris, J Kirk; Wagner, Brandie D; Granger, David; Browne, Kathy; Tatem, Beth; Feazel, Leah M; Park, Kristin; Pace, Norman R; Frank, Daniel N

    2013-12-01

    Studies of the human microbiome, and microbial community ecology in general, have blossomed of late and are now a burgeoning source of exciting research findings. Along with the advent of next-generation sequencing platforms, which have dramatically increased the scope of microbiome-related projects, several high-performance sequence analysis pipelines (e.g. QIIME, MOTHUR, VAMPS) are now available to investigators for microbiome analysis. The subject of our manuscript, the graphical user interface-based Explicet software package, fills a previously unmet need for a robust, yet intuitive means of integrating the outputs of the software pipelines with user-specified metadata and then visualizing the combined data.
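    The core workflow, joining a pipeline's count table with user-supplied metadata before summarizing or plotting, looks roughly like the pandas sketch below; the file names and columns are invented, and this illustrates the data flow rather than Explicet's actual interface.

```python
import pandas as pd

# Hypothetical inputs: an OTU count table from a pipeline (QIIME/MOTHUR
# style) and a per-sample metadata sheet. File names are placeholders.
otus = pd.read_csv("otu_table.csv", index_col="sample_id")  # samples x taxa
meta = pd.read_csv("metadata.csv", index_col="sample_id")   # group, age, ...

rel = otus.div(otus.sum(axis=1), axis=0)    # counts -> relative abundance
merged = rel.join(meta["group"])

# Mean relative abundance of each taxon per metadata group.
print(merged.groupby("group").mean().round(3))
```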

  8. Relevance of biotic pathways to the long-term regulation of nuclear waste disposal. Estimation of radiation dose to man resulting from biotic transport: the BIOPORT/MAXI1 software package. Volume 5

    McKenzie, D.H.; Cadwell, L.L.; Gano, K.A.; Kennedy, W.E. Jr.; Napier, B.A.; Peloquin, R.A.; Prohammer, L.A.; Simmons, M.A.

    1985-10-01

    BIOPORT/MAXI1 is a collection of five computer codes designed to estimate the potential magnitude of the radiation dose to man resulting from biotic transport processes. Dose to man is calculated for ingestion of agricultural crops grown in contaminated soil, inhalation of resuspended radionuclides, and direct exposure to penetrating radiation resulting from the radionuclide concentrations established in the available soil surface by the biotic transport model. This document is designed as both an instructional and reference document for the BIOPORT/MAXI1 computer software package and has been written for two major audiences. The first audience includes persons concerned with the mathematical models of biological transport of commercial low-level radioactive wastes and the computer algorithms used to implement those models. The second audience includes persons concerned with exercising the computer program and exposure scenarios to obtain results for specific applications. The report contains sections describing the mathematical models, user operation of the computer programs, and program structure. Input and output for five sample problems are included. In addition, listings of the computer programs, data libraries, and dose conversion factors are provided in appendices.
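    The ingestion pathway reduces to simple arithmetic: dose is crop concentration times annual intake times a nuclide-specific dose conversion factor, summed over nuclides. A toy version follows; all nuclides and values are invented and are not BIOPORT/MAXI1 library data.

```python
# Illustrative ingestion-dose arithmetic only; values are invented.
crop_conc = {"Sr-90": 2.0e1, "Cs-137": 5.0e0}     # Bq per kg of crop
dcf_ingest = {"Sr-90": 2.8e-8, "Cs-137": 1.3e-8}  # Sv per Bq ingested
intake_kg_per_yr = 180.0                          # annual crop consumption

annual_dose_sv = sum(crop_conc[n] * intake_kg_per_yr * dcf_ingest[n]
                     for n in crop_conc)
print(f"annual ingestion dose: {annual_dose_sv:.2e} Sv")
```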

  10. High performance bio-integrated devices

    Kim, Dae-Hyeong; Lee, Jongha; Park, Minjoon

    2014-06-01

    In recent years, personalized electronics for medical applications have attracted much attention with the rise of smartphones, because coupling such devices with smartphones enables continuous health monitoring in patients' daily lives. In particular, high performance biomedical electronics integrated with the human body are expected to open new opportunities in ubiquitous healthcare. However, the mechanical and geometrical constraints inherent in all standard forms of high performance rigid wafer-based electronics raise unique integration challenges with biotic entities. Here, we describe materials and design constructs for high performance skin-mountable bio-integrated electronic devices, which incorporate arrays of single crystalline inorganic nanomembranes. The resulting electronic devices include flexible and stretchable electrophysiology electrodes and sensors coupled with active electronic components. These advances in bio-integrated systems create new directions in personalized health monitoring and human-machine interfaces.

  11. Strategy Guideline. Partnering for High Performance Homes

    Prahl, Duncan [IBACOS, Inc., Pittsburgh, PA (United States)]

    2013-01-01

    High performance houses require a high degree of coordination and have significant interdependencies between various systems in order to perform properly, meet customer expectations, and minimize risks for the builder. Responsibility for the key performance attributes is shared across the project team and can be well coordinated through advanced partnering strategies. For high performance homes, traditional partnerships need to be matured to the next level and be expanded to all members of the project team including trades, suppliers, manufacturers, HERS raters, designers, architects, and building officials as appropriate. This guide is intended for use by all parties associated in the design and construction of high performance homes. It serves as a starting point and features initial tools and resources for teams to collaborate to continually improve the energy efficiency and durability of new houses.

  12. Measure Guideline: Three High Performance Mineral Fiber Insulation Board Retrofit Solutions

    Neuhauser, Ken [Building Science Corporation, Westford, MA (United States)]

    2015-01-01

    This Measure Guideline describes a high performance enclosure retrofit package that uses mineral fiber insulation board. The Measure Guideline describes retrofit assembly and details for wood frame roof and walls and for cast concrete foundations. This Measure Guideline is intended to serve contractors and designers seeking guidance for non-foam exterior insulation retrofit.

  13. Measure Guideline: Three High Performance Mineral Fiber Insulation Board Retrofit Solutions

    Neuhauser, K. [Building Science Corporation, Westford, MA (United States)]

    2015-01-01

    This Measure Guideline describes a high performance enclosure retrofit package that uses mineral fiber insulation board, and is intended to serve contractors and designers seeking guidance for non-foam exterior insulation retrofit processes. The guideline describes retrofit assembly and details for wood frame roof and walls and for cast concrete foundations.

  14. High performance green barriers based on nanocellulose

    Sandeep S Nair; JY Zhu; Yulin Deng; Arthur J Ragauskas

    2014-01-01

    With the increasing environmental concerns such as sustainability and end-of-life disposal challenges, materials derived from renewable resources such as nanocellulose have been strongly advocated as potential replacements for packaging materials. Nanocellulose can be extracted from various plant resources through mechanical and chemical ways. Nanocellulose with its...

  15. RASPLAV package

    1990-01-01

    The RASPLAV package for the investigation of post-accident mass transport and heat transfer processes is presented. The package performs three-dimensional thermal conduction calculations with spatially nonuniform, temperature-dependent conductivities and variable heat sources, taking phase transformations into account. The processes of freely moving bulk material, mixing of melting fuel due to advection and dissolution, and evaporation/adsorption are modelled. Two-dimensional hydrodynamic calculations with self-consistent heat transfer are also performed. The paper briefly traces the solution procedures used in the program package and outlines the major results of the simulation of reactor vessel melting after a core meltdown. The theoretical analysis and the calculations in this case were carried out to assess the possibility of localizing the zone remainders. The interactions between the remainders and the concrete are simulated and an evaluation of the interaction parameters is carried out. 4 refs. (R.Ts)
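    The conduction core of such a package amounts to repeatedly applying Fourier's law with a temperature-dependent conductivity. Below is a one-dimensional explicit toy analogue; the material properties, grid, and heat source are invented, and phase change is omitted.

```python
import numpy as np

nx, dx, dt = 50, 0.01, 0.05         # grid cells, cell size (m), step (s)
rho_cp = 4.0e6                      # volumetric heat capacity, J/(m^3 K)
T = np.full(nx, 600.0)              # temperature field, K
q = np.zeros(nx); q[20:30] = 1.0e5  # decay-heat-like source, W/m^3

def k(T):                           # conductivity rising with temperature
    return 2.0 + 0.002 * (T - 600.0)

for _ in range(2000):
    k_face = 0.5 * (k(T[1:]) + k(T[:-1]))     # conductivity at cell faces
    flux = -k_face * (T[1:] - T[:-1]) / dx    # Fourier's law
    dT = np.zeros_like(T)
    dT[1:-1] = (-(flux[1:] - flux[:-1]) / dx + q[1:-1]) * dt / rho_cp
    T += dT                                   # boundaries held fixed
print(f"peak temperature: {T.max():.1f} K")
```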

  16. The STARLINK software collection

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  17. Team Development for High Performance Management.

    Schermerhorn, John R., Jr.

    1986-01-01

    The author examines a team development approach to management that creates shared commitments to performance improvement by focusing the attention of managers on individual workers and their task accomplishments. It uses the "high-performance equation" to help managers confront shared beliefs and concerns about performance and develop realistic…

  18. An Introduction to High Performance Fortran

    John Merlin

    1995-01-01

    High Performance Fortran (HPF) is an informal standard for extensions to Fortran 90 to assist its implementation on parallel architectures, particularly for data-parallel computation. Among other things, it includes directives for specifying data distribution across multiple memories, and concurrent execution features. This article provides a tutorial introduction to the main features of HPF.
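    The heart of those distribution directives is a rule mapping array elements to processors. The sketch below, written in Python for consistency with the other sketches in this document rather than in HPF itself, computes which processor owns element i under the two classic distributions.

```python
# Owner computation for HPF-style distributions of an n-element array
# over p processors (an illustration of the mapping, not an HPF runtime).
def owner_block(i, n, p):
    b = -(-n // p)                 # ceil(n / p): contiguous block size
    return i // b

def owner_cyclic(i, n, p):
    return i % p                   # deal elements out round-robin

n, p = 10, 3
print([owner_block(i, n, p) for i in range(n)])   # [0,0,0,0,1,1,1,1,2,2]
print([owner_cyclic(i, n, p) for i in range(n)])  # [0,1,2,0,1,2,0,1,2,0]
```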

  19. High Performance Work Systems for Online Education

    Contacos-Sawyer, Jonna; Revels, Mark; Ciampa, Mark

    2010-01-01

    The purpose of this paper is to identify the key elements of a High Performance Work System (HPWS) and explore the possibility of implementation in an online institution of higher learning. With the projected rapid growth of the demand for online education and its importance in post-secondary education, providing high quality curriculum, excellent…

  20. Debugging a high performance computing program

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
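    A minimal sketch of that grouping step, with invented addresses: threads are keyed by their sampled calling-instruction addresses, and an unusually small group is a candidate defective thread.

```python
from collections import defaultdict

# Sampled calling-instruction addresses per thread (invented values);
# thread 3 diverges from the others at its second frame.
thread_call_addrs = {
    0: [0x4005A0, 0x4007F2], 1: [0x4005A0, 0x4007F2],
    2: [0x4005A0, 0x4007F2], 3: [0x4005A0, 0x400C44],
}

groups = defaultdict(list)
for tid, addrs in thread_call_addrs.items():
    groups[tuple(addrs)].append(tid)           # group threads by addresses

for addrs, tids in sorted(groups.items(), key=lambda g: len(g[1])):
    frames = ", ".join(hex(a) for a in addrs)
    print(f"{len(tids)} thread(s) at [{frames}]: {tids}")
```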

  1. High Performance Networks for High Impact Science

    Scott, Mary A.; Bair, Raymond A.

    2003-02-13

    This workshop was the first major activity in developing a strategic plan for high-performance networking in the Office of Science. Held August 13 through 15, 2002, it brought together a selection of end users, especially representing the emerging, high-visibility initiatives, and network visionaries to identify opportunities and begin defining the path forward.

  2. Teacher Accountability at High Performing Charter Schools

    Aguirre, Moises G.

    2016-01-01

    This study will examine the teacher accountability and evaluation policies and practices at three high performing charter schools located in San Diego County, California. Charter schools are exempted from many laws, rules, and regulations that apply to traditional school systems. By examining the teacher accountability systems at high performing…

  3. Technology Leadership in Malaysia's High Performance School

    Yieng, Wong Ai; Daud, Khadijah Binti

    2017-01-01

    The headmaster, as leader of the school, also plays a role as a technology leader. This applies to high performance school (HPS) headmasters as well. The HPS excel in all aspects of education. In this study, the researcher is interested in examining the role of the headmaster as a technology leader through interviews with three headmasters of high…

  4. Toward High Performance in Industrial Refrigeration Systems

    Thybo, C.; Izadi-Zamanabadi, Roozbeh; Niemann, H.

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of the same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  5. Towards high performance in industrial refrigeration systems

    Thybo, C.; Izadi-Zamanabadi, R.; Niemann, Hans Henrik

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of the same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  6. Validated high performance liquid chromatographic (HPLC) method ...

    STORAGESEVER

    2010-02-22

    … specific and accurate high performance liquid chromatographic method for determination of ZER in micro-volumes … tional medicine as a cure for swelling, sores, loss of appetite and … Receptor Activator for Nuclear Factor κ B Ligand … The effect of … be suitable for preclinical pharmacokinetic studies.

  7. Validated High Performance Liquid Chromatography Method for ...

    Purpose: To develop a simple, rapid and sensitive high performance liquid … response, tailing factor and resolution of six replicate injections was < 3 % … Cefadroxil monohydrate, Human plasma, Pharmacokinetics, Bioequivalence … Drug-free plasma was obtained from the local … Influence of probenecid on the renal …

  8. High-performance OPCPA laser system

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J.

    2006-01-01

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  9. High-performance OPCPA laser system

    Zuegel, J.D.; Bagnoud, V.; Bromage, J.; Begishev, I.A.; Puth, J. [Rochester Univ., Lab. for Laser Energetics, NY (United States)

    2006-06-15

    Optical parametric chirped-pulse amplification (OPCPA) is ideally suited for amplifying ultra-fast laser pulses since it provides broadband gain across a wide range of wavelengths without many of the disadvantages of regenerative amplification. A high-performance OPCPA system has been demonstrated as a prototype for the front end of the OMEGA Extended Performance (EP) Laser System. (authors)

  10. Comparing Dutch and British high performing managers

    Waal, A.A. de; Heijden, B.I.J.M. van der; Selvarajah, C.; Meyer, D.

    2016-01-01

    National cultures have a strong influence on the performance of organizations and should be taken into account when studying the traits of high performing managers. At the same time, many studies that focus upon the attributes of successful managers show that there are attributes that are similar

  11. Project materials [Commercial High Performance Buildings Project

    None

    2001-01-01

    The Consortium for High Performance Buildings (ChiPB) is an outgrowth of DOE's Commercial Whole Buildings Roadmapping initiatives. It is a team-driven public/private partnership that seeks to enable and demonstrate the benefit of buildings that are designed, built and operated to be energy efficient, environmentally sustainable, superior quality, and cost effective.

  12. High performance structural ceramics for nuclear industry

    Pujari, Vimal K.; Faker, Paul

    2006-01-01

    A family of Saint-Gobain structural ceramic materials and products produced by its High Performance Refractory Division is described. Over the last fifty years or so, Saint-Gobain has been a leader in developing novel non-oxide ceramic materials, processes and products for applications in the nuclear, chemical, automotive, defense and mining industries.

  13. A new high performance current transducer

    Tang Lijun; Lu Songlin; Li Deming

    2003-01-01

    A DC to 100 kHz current transducer has been developed using a new technique based on the zero-flux detection principle. The new current transducer shows high performance: its magnetic core need not be selected very stringently, and it is easy to manufacture.

  14. Design of High Performance Permanent-Magnet Synchronous Wind Generators

    Chun-Yu Hsiao

    2014-11-01

    This paper is devoted to the analysis and design of high performance permanent-magnet synchronous wind generators (PMSGs). A systematic and sequential methodology for the design of PMSGs is proposed, with a high performance wind generator as a design model. Aiming at high induced voltage, low harmonic distortion and high generator efficiency, optimal generator parameters such as the pole-arc to pole-pitch ratio and the stator-slot-shoe dimensions are determined with the proposed technique using Maxwell 2-D, Matlab software and the Taguchi method. The proposed double three-phase and six-phase winding configurations, which consist of six windings in the stator, can provide evenly distributed current for versatile applications regarding the voltage and current demands of practical use. Specifically, the windings are connected in series to increase the output voltage at low wind speed, and in parallel at high wind speed so that electricity is generated even if one winding fails, thereby also enhancing reliability. A PMSG is designed and implemented based on the proposed method. When the simulation is performed with a 6 Ω load, the output power for the double three-phase winding and the six-phase winding is 10.64 and 11.13 kW, respectively. In addition, 24 Ω load experiments show that the efficiencies of the double three-phase winding and the six-phase winding are 96.56% and 98.54%, respectively, verifying the proposed high performance operation.

  15. Studies on high performance Timeslice building on the CBM FLES

    Hartmann, Helvi [Frankfurt Institute for Advanced Studies, Goethe University, Frankfurt (Germany)]; Collaboration: CBM-Collaboration

    2015-07-01

    In contrast to already existing high energy physics experiments, the Compressed Baryonic Matter (CBM) experiment collects all data untriggered. The First-level Event Selector (FLES), a high performance computer cluster, processes the very high incoming data rate of 1 TByte/s and performs a full online event reconstruction. For this task it needs to access the raw detector data in time intervals referred to as Timeslices. In order to construct the Timeslices, the FLES Timeslice building has to combine data from all input links and distribute them via a high-performance network to the compute nodes. For fast data transfer, the Infiniband network has proven appropriate. One option is to address the network using Infiniband (RDMA) Verbs directly, potentially making the best use of Infiniband; however, this is a very low-level implementation that is tied to the hardware and neglects other possible network technologies in the future. Another approach is to apply a high-level API such as MPI, which is independent of the underlying hardware and suitable for less error-prone software development. I present the possibilities and show the results of benchmarks run on high-performance computing clusters. The solutions are evaluated with regard to Timeslice building in CBM.
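    With the high-level option, Timeslice building is essentially an all-to-all exchange. A toy mpi4py sketch of that pattern follows; the payloads are placeholder strings, not CBM microslice buffers, and a real FLES would move the data over RDMA.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank plays one input link and produces one microslice per time
# interval; in this toy, interval t is assembled on rank t.
send = [f"link{rank}/interval{t}" for t in range(size)]
recv = comm.alltoall(send)      # collect every link's piece of our interval

timeslice = sorted(recv)
if rank == 0:
    print("timeslice 0 built from:", timeslice)
```

    Run under e.g. `mpiexec -n 4 python timeslice.py`; each rank then holds the complete set of contributions for its interval.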

  16. Quantum Accelerators for High-performance Computing Systems

    Humble, Travis S. [ORNL]; Britt, Keith A. [ORNL]; Mohiyaddin, Fahd A. [ORNL]

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  17. AN ADA NAMELIST PACKAGE

    Klumpp, A. R.

    1994-01-01

    The Ada Namelist Package, developed for the Ada programming language, enables a calling program to read and write FORTRAN-style namelist files. A namelist file consists of any number of assignment statements in any order. Features of the Ada Namelist Package are: the handling of any combination of user-defined types; the ability to read vectors, matrices, and slices of vectors and matrices; the handling of mismatches between variables in the namelist file and those in the programmed list of namelist variables; and the ability to avoid searching the entire input file for each variable. The principal user benefits of this software are the following: the ability to write namelist-readable files, the ability to detect most file errors in the initialization phase, a package organization that reduces the number of instantiated units to a few packages rather than to many subprograms, a reduced number of restrictions, and an increased execution speed. The Ada Namelist Package reads data from an input file into variables declared within a user program. It then writes data from the user program to an output file, printer, or display. The input file contains a sequence of assignment statements in arbitrary order. The output is in namelist-readable form. There is a one-to-one correspondence between namelist I/O statements executed in the user program and variables read or written. Nevertheless, in the input file, mismatches are allowed between assignment statements in the file and the namelist read procedure statements in the user program. The Ada Namelist Package itself is non-generic. However, it has a group of nested generic packages following the non-generic opening portion. The opening portion declares a variety of user-accessible constants, variables and subprograms. The subprograms include procedures for initializing namelists for reading and for reading and writing strings, as well as functions for analyzing the content of the current dataset and diagnosing errors. Two nested …
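    For readers unfamiliar with the format, a namelist file is a named group of assignment statements terminated by a slash. The Python sketch below reads a small subset of that syntax (scalars and flat vectors only); it is far less capable than the Ada package described above and is included only to show the file format.

```python
import re

SAMPLE = """
&flight
  mass   = 1200.5
  stages = 3
  name   = 'Apollo'
  rates  = 0.1, 0.2, 0.3
/
"""

def _convert(tok):
    if tok.startswith("'"):
        return tok.strip("'")        # quoted string
    try:
        return int(tok)
    except ValueError:
        return float(tok)

def read_namelist(text):
    """Parse one &group ... / block into a dict (toy subset of namelist)."""
    body = re.search(r"&\w+(.*?)^\s*/", text, re.S | re.M).group(1)
    values = {}
    for line in body.splitlines():
        if "=" not in line:
            continue
        key, rhs = (s.strip() for s in line.split("=", 1))
        items = [_convert(v.strip()) for v in rhs.split(",")]
        values[key] = items[0] if len(items) == 1 else items
    return values

print(read_namelist(SAMPLE))
# {'mass': 1200.5, 'stages': 3, 'name': 'Apollo', 'rates': [0.1, 0.2, 0.3]}
```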

  18. Radiation cured coatings for high performance products

    Parkins, J.C.; Teesdale, D.H.

    1984-01-01

    Development over the past ten years of radiation curable coating and lacquer systems and the means of curing them has led to new products in the packaging, flooring, furniture and other industries. Solventless lacquer systems formulated with acrylates and other resins enable high levels of durability, scuff resistance and gloss to be achieved. Ultra violet and electron beam radiation curing are used, the choice depending on the nature of the coating, the product and the scale of the operation. (author)

  19. Petroleum software profiles

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd.

  20. Gammasphere software development. Progress report

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, the group has established a new public ftp archive to distribute software, software development tools, and information.