WorldWideScience

Sample records for analysis system tool

  1. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). It also discusses system analysis work completed with these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and recommendations are made regarding continuing development of the tools and process.

  2. Two energy system analysis tools

    DEFF Research Database (Denmark)

    Lund, Henrik; Andersen, Anders N.; Antonoff, Jayson

    2004-01-01

    The chapter illustrates some of the different types of problems that must be solved when analysing energy systems.

  3. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a database for storing system designs and results of analysis.

  4. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions.

  5. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system’s structure. The list of such variables and functional relations constitutes the system’s structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.
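
    The structure-graph idea above can be sketched in a few lines. The constraint and variable names below are invented for illustration, and the matching logic is far simpler than SaTool's actual algorithms:

    ```python
    # Hypothetical structure graph: functional relations (constraints)
    # mapped to the variables they involve. Names are invented.
    structure = {
        "c1": {"x1", "x2"},   # e.g. a sensor equation
        "c2": {"x2", "x3"},   # a component model
        "c3": {"x3"},         # a measurement
    }

    def known_variables(graph):
        """Variables covered by at least one intact functional relation."""
        covered = set()
        for variables in graph.values():
            covered |= variables
        return covered

    def effect_of_fault(graph, failed_constraint):
        """Remove one relation (a fault) and report the variables that
        lose all structural support -- the 'disappearing nodes'."""
        reduced = {c: v for c, v in graph.items() if c != failed_constraint}
        return known_variables(graph) - known_variables(reduced)
    ```

    With this toy graph, losing c1 leaves x1 structurally unsupported, while losing c3 costs nothing because x3 is still covered by c2.
    
    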

  6. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  7. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  8. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Science.gov (United States)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desired system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  9. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  10. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data comprising the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  11. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight into design and modeling studies and into performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described, and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
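
    In spirit, such compilers transform model code so that derivatives propagate alongside values. A minimal forward-mode sketch with dual numbers (a toy model, not GRESS or EXAP output):

    ```python
    # Forward-mode differentiation via dual numbers: each quantity carries
    # its value and its derivative with respect to a chosen parameter.
    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.value * other.deriv + self.deriv * other.value)
        __rmul__ = __mul__

    def model(k):
        # toy performance model: response = 3*k^2 + 2*k
        return 3 * k * k + 2 * k

    x = Dual(2.0, 1.0)   # seed the derivative with respect to k
    y = model(x)
    # y.value = 16.0 and y.deriv = 6*k + 2 evaluated at k = 2, i.e. 14.0
    ```

    An automated tool applies this kind of transformation to every statement of an existing code, so no hand-derived sensitivity equations are needed.
    
    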

  12. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.
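
    The integration pattern described, automated and consistent data transfer between discipline modules, can be sketched as a fixed-order pipeline over a shared state. The discipline functions and data keys below are invented placeholders, not the actual tool's interface:

    ```python
    # Each discipline reads the shared state and writes its outputs into
    # it; running them in order automates the data handoff.
    def mass_sizing(state):
        # toy sizing rule: structure plus payload-driven mass
        state["mass_kg"] = 40.0 + 2.0 * state["payload_kg"]

    def aerodynamics(state):
        # toy ballistic coefficient from upstream mass output
        state["ballistic_coeff"] = state["mass_kg"] / (
            state["drag_coeff"] * state["area_m2"])

    def run_pipeline(state, disciplines):
        for step in disciplines:
            step(state)   # outputs become inputs to later steps
        return state

    vehicle = run_pipeline(
        {"payload_kg": 5.0, "drag_coeff": 1.1, "area_m2": 0.8},
        [mass_sizing, aerodynamics])
    ```

    The real tool adds aerothermodynamics and impact steps and wraps the exchange in Python/Java glue, but the execution model is the same chain.
    
    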

  13. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis. The use of the Matlab (R)-based implementation is presented, and special features are introduced that were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex...

  14. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

    Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the moving spindle unit is among the most frequently used and most important parts, as in the automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  15. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  16. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
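
    One measurement at the core of such distortion analysis is total harmonic distortion. A hedged standard-library sketch, using tanh as a stand-in for a distorting device (real tools measure much more than this single figure):

    ```python
    # Drive a memoryless nonlinearity with a sine and estimate THD from
    # the magnitudes of the harmonic DFT bins.
    import math, cmath

    N = 1024          # samples in one analysis frame
    f0 = 8            # input frequency in bins (integer -> no leakage)

    x = [math.sin(2 * math.pi * f0 * n / N) for n in range(N)]
    y = [math.tanh(2.0 * s) for s in x]          # the distorting "device"

    def bin_magnitude(signal, k):
        """Magnitude of DFT bin k (direct single-bin sum)."""
        return abs(sum(s * cmath.exp(-2j * math.pi * k * n / N)
                       for n, s in enumerate(signal)))

    fund = bin_magnitude(y, f0)
    harmonics = [bin_magnitude(y, m * f0) for m in range(2, 6)]
    thd = math.sqrt(sum(h * h for h in harmonics)) / fund
    ```

    Because tanh is an odd function, the even harmonics come out numerically zero and the distortion lives in the 3rd and 5th harmonics, which is exactly the kind of signature such an analysis tool visualizes.
    
    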

  17. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  18. Systems analysis as a tool for optimal process strategy

    International Nuclear Information System (INIS)

    Ditterich, K.; Schneider, J.

    1975-09-01

    For the description and optimal treatment of complex processes, the methods of systems analysis have recently emerged as the most promising approach. In general, every process should be optimised with respect to reliability, safety, economy and environmental pollution. In this paper the complex relations between these general optimisation postulates are established in qualitative form. These general trend relations have to be quantified for every particular system studied in practice.

  19. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the moving spindle unit is among the most frequently used and most important parts, as in the automatic tool changer. The vibration detection system includes the development of hardware and software, such as a vibration meter, a signal acquisition card, a data processing platform, and a machine control program. Meanwhile, because mechanical configurations and desired characteristics differ, it is difficult to assemble a vibration detection system directly from commercially available kits. For this reason, the system was selected for in-house development, along with a parametric study significant enough to represent the machine's characteristics and states. The functional parts of the system were developed simultaneously. Finally, the conditions and parameters derived from both the states and the characteristics were entered into the developed system to verify its feasibility.
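
    The monitoring step described reduces to a minimal sketch: compute the RMS of an accelerometer frame and compare it against a limit. The limit and the signals below are invented for illustration, not values from the paper:

    ```python
    # Frame-based RMS vibration monitoring with a fixed warning limit.
    import math

    def rms(frame):
        return math.sqrt(sum(s * s for s in frame) / len(frame))

    WARN_LIMIT = 0.5   # hypothetical RMS limit in g

    def status(frame):
        return "warning" if rms(frame) > WARN_LIMIT else "normal"

    # synthetic frames: a quiet spindle vs. a strongly vibrating one
    idle = [0.01 * math.sin(0.3 * n) for n in range(100)]
    worn_tool = [0.9 * math.sin(0.3 * n) for n in range(100)]
    ```

    A production system would add band-limited features and adaptive thresholds per machine state, but the warning logic is of this shape.
    
    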

  20. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
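
    A simplified levelized-cost calculation in the spirit of such a model (all figures below are invented examples, not NETL or Sandia data):

    ```python
    # Annualize capital with a capital recovery factor, add fixed O&M,
    # divide by annual generation, then add per-MWh fuel cost.
    def lcoe(capital, fixed_om, fuel_per_mwh, rate, years,
             capacity_mw, capacity_factor):
        crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
        mwh = capacity_mw * capacity_factor * 8760
        annual_cost = capital * crf + fixed_om
        return annual_cost / mwh + fuel_per_mwh

    # Hypothetical gas plant: $500M capital, $10M/yr O&M, $30/MWh fuel,
    # 7% rate over 30 years, 500 MW at 85% capacity factor.
    cost = lcoe(500e6, 10e6, 30.0, 0.07, 30, 500, 0.85)
    ```

    The model described above layers the life-cycle stages (RMA, RMT, ECF, PT&D) and emissions tracking on top of this basic levelization.
    
    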

  1. Stakeholder Analysis of an Executable Architecture Systems Engineering (EASE) Tool

    Science.gov (United States)

    2013-06-21

    The FCR tables and stakeholder feedback are then used as the foundation of a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. Finally, the SWOT analysis and stakeholder feedback are translated into an EASE future development strategy: a series of recommendations regarding...

  2. FADES: A tool for automated fault analysis of complex systems

    International Nuclear Information System (INIS)

    Wood, C.

    1990-01-01

    FADES is an expert system for performing fault analyses on complex connected systems. By using a graphical editor to draw components and link them together, the FADES system allows the analyst to describe a given system. The knowledge base created is used to qualitatively simulate the system behaviour. By inducing all possible component failures in the system and determining their effects, a set of facts is built up. These facts are then used to create fault trees or FMEA tables. The facts may also be used for explanation purposes and to generate diagnostic rules, allowing system instrumentation to be optimised. The prototype system has been built and tested and is presently undergoing testing by users. All comments from these trials will be used to tailor the system to the requirements of the user so that the end product performs the exact task required.
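
    The induce-each-failure-and-record-effects loop can be sketched with a toy qualitative model (the components and observables are invented, not FADES internals):

    ```python
    # Simulate nominal behaviour, then re-simulate with each single
    # component failed and record deviations: FMEA-style rows.
    def simulate(pump_ok, valve_ok):
        flow = pump_ok and valve_ok
        pressure = pump_ok
        return {"flow": flow, "pressure": pressure}

    nominal = simulate(True, True)

    def fmea(components):
        rows = {}
        for failed in components:
            state = simulate(pump_ok=(failed != "pump"),
                             valve_ok=(failed != "valve"))
            effects = [k for k in state if state[k] != nominal[k]]
            rows[failed] = effects
        return rows

    table = fmea(["pump", "valve"])
    # a pump failure affects flow and pressure; a valve failure only flow
    ```

    The same fact base, read in the other direction (effects back to candidate failures), is what supports the diagnostic-rule generation mentioned above.
    
    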

  3. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    Johnson, P.E.; Lester, P.B.

    1998-05-01

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model

  4. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    Science.gov (United States)

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
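
    One ingredient of the global methods named above is rank correlation between sampled parameters and a model output. A standard-library sketch of Spearman's rho on a toy model (SBML-SAT's partial rank correlation coefficient additionally partials out the other parameters):

    ```python
    # Sample two parameters, evaluate a toy output, and rank-correlate
    # each parameter with the output to gauge its global influence.
    import random

    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order):
            r[i] = rank
        return r

    def spearman(xs, ys):
        # Pearson correlation on ranks; both rank vectors are
        # permutations of 0..n-1, so their variances are equal.
        rx, ry = ranks(xs), ranks(ys)
        n = len(xs)
        mx = (n - 1) / 2
        cov = sum((a - mx) * (b - mx) for a, b in zip(rx, ry))
        var = sum((a - mx) ** 2 for a in rx)
        return cov / var

    random.seed(0)
    k1 = [random.uniform(0.1, 1.0) for _ in range(200)]
    k2 = [random.uniform(0.1, 1.0) for _ in range(200)]
    # toy steady-state output strongly driven by k1, weakly by k2
    out = [a + 0.05 * b for a, b in zip(k1, k2)]
    rho1, rho2 = spearman(k1, out), spearman(k2, out)
    ```

    A high rho for k1 and a near-zero rho for k2 is the kind of ranking a modeler uses to decide which parameters deserve careful estimation.
    
    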

  5. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  6. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software tool to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been rigorously extended and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  7. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
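
    The isolation idea behind reports (3) and (4) can be sketched from a small test-to-failure-mode detection map (invented here, not TEAMS output): failure modes with identical test signatures form an ambiguity group and cannot be discriminated from one another.

    ```python
    # Group failure modes by their detection signature across all tests.
    detects = {                       # test -> failure modes it detects
        "t1": {"fm_a", "fm_b", "fm_e"},
        "t2": {"fm_b", "fm_c"},
    }
    modes = {"fm_a", "fm_b", "fm_c", "fm_e"}

    def signature(mode):
        return tuple(sorted(t for t, fms in detects.items() if mode in fms))

    groups = {}
    for m in modes:
        groups.setdefault(signature(m), set()).add(m)

    ambiguity_groups = [g for g in groups.values() if len(g) > 1]
    # fm_a and fm_e are both seen only by t1, so they form one
    # ambiguity group; fm_b and fm_c each have a unique signature
    ```

    The sensor-sensitivity study amounts to deleting a test's row and recomputing these groups to see how much isolation degrades.
    
    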

  8. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities (including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols) that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  9. On sustainability assessment of technical systems. Experience from systems analysis with the ORWARE and Ecoeffect tools

    Energy Technology Data Exchange (ETDEWEB)

    Assefa, Getachew [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Chemical Engineering

    2006-06-15

    Engineering research and development work is undergoing a reorientation from focusing on specific parts of different systems to a broader systems-level perspective, albeit at a slower pace. This reorientation should be further developed and enhanced with the aim of organizing and structuring our technical systems to meet sustainability requirements in the face of global ecological threats that have far-reaching social and economic implications, which can no longer be captured using a conventional research approach. Until a list of universally acceptable, clear, and measurable indicators of sustainable development is developed, the work with sustainability metrics should continue to evolve as a relative measure of the ecological, economic, and social performance of human activities in general, and technical systems in particular. This work can be done by comparing the relative performance of alternative technologies providing the same well-defined function or service, or by characterizing technologies that enjoy different levels of societal priority using relevant performance indicators. In both cases, concepts and methods of industrial ecology play a vital role. This thesis is about the development and application of a systematic approach for the assessment of the performance of technical systems from the perspective of systems analysis, sustainability, sustainability assessment, and industrial ecology. The systematic approach developed and characterized in this thesis advocates a simultaneous assessment of the ecological, economic, and social dimensions of the performance of technologies, avoiding sub-optimization and problem shifting between dimensions. It gives a holistic picture by taking a life cycle perspective of all important aspects. The systematic assessment of technical systems provides an even-handed assessment resulting in cumulative knowledge. A modular structure of the approach makes it flexible enough in terms of comparing a number of

  10. System analysis: A tool for determining the feasibility of PACS for radiology

    International Nuclear Information System (INIS)

    Parrish, D.M.; Thompson, B.G.; Creasy, J.L.; Wallace, R.J.

    1987-01-01

    In the emerging technology of picture archival and communication systems (PACS), the real productivity improvements that such a system may provide are being evaluated with a systems analysis tool. This computer model allows a simulated comparison of manual versus digital departmental functions by varying operational parameter times, physical distances in the department, and equipment and manpower resources; examples are presented. The presentation focuses on the analysis approach, the operational parameters most important in the digital environment, and an analysis of the potential productivity improvements

  11. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already in the public literature as well as from NIST and other highly respected sources. More recently, the tool has been updated with additional test data from the Cryogenics Test Laboratory, and the tank shape was added. The tool was also converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document provides the user instructions for the operation of this system.

  12. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    management techniques and a vast array of computer aided techniques are applied during design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying...

  13. Analysis of Java Client/Server and Web Programming Tools for Development of Educational Systems.

    Science.gov (United States)

    Muldner, Tomasz

    This paper provides an analysis of old and new programming tools for development of client/server programs, particularly World Wide Web-based programs. The focus is on development of educational systems that use interactive shared workspaces to provide portable and expandable solutions. The paper begins with a short description of relevant terms.…

  14. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
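
    The workflow the claim describes - component library, baseline model, energy conservation measures (ECMs), recommendation - can be sketched in a few lines. This is a hypothetical illustration of the described architecture, not the patented system: all component names, U-values, power densities, and energy figures are invented.

```python
# Toy building-energy workflow: library -> baseline -> ECMs -> recommendation.
# Energy model and all coefficients are invented for illustration.

component_library = {
    "window_double": {"u_value": 2.8},        # W/(m^2 K), illustrative
    "led_lighting": {"power_density": 6.0},   # W/m^2, illustrative
}

def annual_energy(model, floor_area=1000.0, degree_hours=80000.0):
    """Toy annual energy: envelope loss + lighting load (kWh/yr)."""
    envelope = model["u_value"] * model["glazing_area"] * degree_hours / 1000.0
    lighting = model["power_density"] * floor_area * model["hours"] / 1000.0
    return envelope + lighting

baseline = {"u_value": 5.0, "glazing_area": 200.0,
            "power_density": 12.0, "hours": 3000.0}

def apply_ecm(model, **changes):
    """Return a new model with an energy conservation measure applied."""
    out = dict(model)
    out.update(changes)
    return out

candidates = {
    "double glazing": apply_ecm(baseline, u_value=component_library["window_double"]["u_value"]),
    "LED retrofit": apply_ecm(baseline, power_density=component_library["led_lighting"]["power_density"]),
}

base_kwh = annual_energy(baseline)
# recommendation step: rank ECMs by annual energy saved against the baseline
recommendations = sorted(((name, base_kwh - annual_energy(m)) for name, m in candidates.items()),
                         key=lambda t: -t[1])
```

The real system, per the claim, builds the baseline from building spatial data and runs a full building analysis engine; the ranking step at the end is the analogue of its recommendation tool.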

  15. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    Science.gov (United States)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is inherently a multidisciplinary activity. SAPE (Systems Analysis of Planetary EDL) improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and an interface for structural sizing.

  16. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    Science.gov (United States)

    Samareh, Jamshid A.

    2009-01-01

    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language, a platform-independent, open-source language, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.

  17. Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool

    Science.gov (United States)

    Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark

    2011-01-01

    A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) was developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to follow guidance commands as closely as possible while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with the required design specs. "Tower Clearance" and "Load Relief" designs have been achieved for the liftoff and maximum dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in frequency-domain Monte Carlo analysis using ASAT.

  18. A Computational Tool for Comparative Energy Cost Analysis of Multiple-Crop Production Systems

    Directory of Open Access Journals (Sweden)

    Efthymios Rodias

    2017-06-01

    Various crops can be considered as potential feedstocks for bioenergy and biofuel production. The selection of the crops to be cultivated for that purpose is based on several factors. For an objective comparison between different crops, a common framework is required to assess their economic or energetic performance. In this paper, a computational tool for the energy cost evaluation of multiple-crop production systems is presented. All in-field and transport operations are considered, providing a detailed analysis of the energy requirements of the components that contribute to the overall energy consumption. A demonstration scenario is also described, based on three selected energy crops: Miscanthus, Arundo donax, and Switchgrass. The tool can be used as a decision support system for evaluating the different agronomic practices (such as fertilization and agrochemical application), machinery systems, and management practices that can be applied to each of the individual crops within the production system.
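
    The core of such a comparison can be sketched as a per-operation energy tally. The sketch below is illustrative only: the crop names match the paper's scenario, but the operations and MJ-per-tonne coefficients are invented, not the tool's actual data.

```python
# Illustrative energy-input tally in the spirit of the tool: each crop's
# in-field and transport operations contribute to the total operational
# energy. All coefficients below are invented for illustration.

OPERATIONS = {
    "Miscanthus": [("harvest", 95.0), ("baling", 60.0), ("transport", 40.0)],
    "Switchgrass": [("harvest", 80.0), ("baling", 55.0), ("transport", 35.0)],
}  # (operation, MJ per tonne of biomass), hypothetical values

def energy_cost(crop, yield_t_per_ha):
    """Total operational energy per hectare (MJ/ha)."""
    per_tonne = sum(mj for _, mj in OPERATIONS[crop])
    return per_tonne * yield_t_per_ha

# Rank the crops by operational energy cost at an assumed common yield.
ranking = sorted(OPERATIONS, key=lambda c: energy_cost(c, yield_t_per_ha=12.0))
```

A real implementation would break each operation down further (fuel, labour, embodied machinery energy) and treat yield as crop-specific rather than common.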

  19. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts, and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments of baseline research and the advancements in design studies and system-level assessment, including cluster analysis as an annualization standard for air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  20. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform, publicly available methodology to help develop applications that improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
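
    The steady-state side of a cascading-outage loop can be illustrated with a deliberately naive sketch. This is not DCAT's algorithm (which runs real power-flow and dynamic simulations through PSS/E); it is a toy that shows the shape of the loop: trip the worst-overloaded line, redistribute its flow, and repeat until no overloads remain. Line names, flows, limits, and the even-redistribution rule are all invented.

```python
# Toy cascading-outage loop: trip any line loaded above its limit,
# spread its flow evenly over the surviving lines, repeat.
# A real tool would recompute flows with a power-flow solution instead.

def cascade(lines):
    """lines: {name: (flow_MW, limit_MW)}; returns the list of tripped lines in order."""
    lines = dict(lines)
    tripped = []
    while True:
        overloaded = [n for n, (f, lim) in lines.items() if f > lim]
        if not overloaded:
            return tripped
        # trip the most heavily loaded line (highest flow/limit ratio) first
        worst = max(overloaded, key=lambda n: lines[n][0] / lines[n][1])
        flow, _ = lines.pop(worst)
        tripped.append(worst)
        if lines:  # naive redistribution of the lost line's flow
            share = flow / len(lines)
            lines = {n: (f + share, lim) for n, (f, lim) in lines.items()}

sequence = cascade({"A": (90.0, 100.0), "B": (120.0, 100.0), "C": (50.0, 100.0)})
```

Even this toy reproduces the qualitative point of the abstract: a single initiating overload ("B") can propagate until the whole system is lost.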

  1. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    Science.gov (United States)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  2. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    Science.gov (United States)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been developing data analysis tools to let the user actively interrogate areas of Level 2 swaths and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how best to store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
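
    The kind of geospatial discovery query being benchmarked can be sketched with a cheap bounding-box prefilter: store a rectangle per swath footprint and answer "which granules might intersect this storm region?" before any exact polygon work. This is an independent illustration, not TCIS code; granule names and coordinates are invented, and a production system would push this test into the database's spatial index.

```python
# Cheap bounding-box prefilter for swath footprint discovery.
# Each footprint is (min_lon, min_lat, max_lon, max_lat); values are invented.

FOOTPRINTS = {
    "granule_001": (-80.0, 20.0, -70.0, 30.0),
    "granule_002": (-60.0, 10.0, -50.0, 18.0),
    "granule_003": (-75.0, 25.0, -65.0, 35.0),
}

def boxes_intersect(a, b):
    """True unless one box lies entirely to one side of the other."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def candidate_granules(region):
    """Prefilter only; exact footprint tests would follow in a real system."""
    return sorted(n for n, box in FOOTPRINTS.items() if boxes_intersect(box, region))
```

In PostgreSQL/PostGIS or MongoDB the same prefilter is what a 2D spatial index performs internally, which is why index support dominates the query-performance comparison described above.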

  3. Analysis of the Thermal Characteristics of Machine Tool Feed System Based on Finite Element Method

    Science.gov (United States)

    Mao, Xiaobo; Mao, Kuanmin; Du, Yikang; Wang, Fengyun; Yan, Bo

    2017-09-01

    The loading of a mobile heat source and the setting of boundary conditions are difficult problems in the analysis of the thermal characteristics of machine tools. Taking the machine tool feed system as an example, a novel method for loading a mobile heat source is proposed, based on a function constructed from the heat source and time. The convective heat transfer coefficient is the key parameter of the boundary conditions, and it varies with temperature. In this paper, a model of a "variable convection heat transfer coefficient" is proposed, bringing the boundary-condition settings of the thermal analysis closer to the real situation. Finally, comparison of the results of this method with experimental data demonstrates its accuracy and validity, while greatly reducing the simulation computation and simulation time.
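
    The two ideas in the abstract can be sketched as plain functions: a heat source written explicitly as a function of position and time (so it can be applied as a time-dependent load in a finite element model), and a convection coefficient that depends on temperature rather than being a constant. The Gaussian profile, the linear temperature dependence, and every coefficient below are assumptions for illustration, not the paper's fitted values.

```python
# Sketch of (1) a moving heat source q(x, t) and (2) a temperature-dependent
# convective heat transfer coefficient h(T). All parameter values are invented.
import math

def moving_heat_source(x, t, q0=500.0, speed=0.05, width=0.01):
    """Gaussian heat flux (W/m^2) whose centre travels at `speed` (m/s)."""
    centre = speed * t
    return q0 * math.exp(-((x - centre) ** 2) / (2.0 * width ** 2))

def convection_coefficient(temp_c, h0=10.0, slope=0.08):
    """A 'variable convection heat transfer coefficient': linear in temperature."""
    return h0 + slope * temp_c
```

In an FEM tool, `moving_heat_source` would be evaluated per node per time step to build the load vector, and `convection_coefficient` re-evaluated each step from the current surface temperature, which is what makes the boundary conditions track the real situation.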

  4. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    Science.gov (United States)

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating the respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289
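
    First-order Sobol indices, the global sensitivity measure named in the title, can be estimated with a Saltelli-style Monte Carlo scheme. The sketch below is a minimal pure-Python version applied to a toy linear model (not a pharmacology model); for a model Y = X1 + 2·X2 with independent U(0,1) inputs, the exact indices are S1 = 0.2 and S2 = 0.8, which the estimate should approach.

```python
# Minimal Monte Carlo estimate of first-order Sobol sensitivity indices.
# Saltelli-style: two independent sample matrices A and B, plus matrices AB_i
# where column i of A is replaced by column i of B.
import random

def sobol_first_order(f, dim, n=20000, seed=0):
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(row) for row in A]
    fB = [f(row) for row in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    indices = []
    for i in range(dim):
        ABi = [a[:i] + [b[i]] + a[i + 1:] for a, b in zip(A, B)]
        fABi = [f(row) for row in ABi]
        s = sum(yb * (yab - ya) for yb, yab, ya in zip(fB, fABi, fA)) / n
        indices.append(s / var)
    return indices

S = sobol_first_order(lambda x: x[0] + 2.0 * x[1], dim=2)
```

For ODE-based systems pharmacology models, `f` would wrap a full model simulation returning a scalar output of interest, which is why these analyses are computationally expensive and why dedicated toolboxes exist.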

  5. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  6. Online Detection of Impending Instability in a Combustion System Using Tools from Symbolic Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Vishnu R. Unni

    2015-09-01

    In this paper, we introduce a novel anomaly-detection technique for the online detection of impending instability in a combustion system, based on symbolic time series analysis. The experimental results presented in this paper illustrate the application of anomaly detection to a combustor in which the flame is stabilized either by a bluff body or by a swirler. The detection unit works on the principle that, in the transition region from combustion noise to thermoacoustic instability, combustion systems exhibit peculiar dynamics that result in the formation of specific patterns in the time series. Tools from symbolic time series analysis are then used to recognize these patterns and to define an anomaly measure indicative of the proximity of the system to regimes of thermoacoustic instability.
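
    The symbolic idea can be sketched generically (this is not the authors' exact measure): partition the signal into a small alphabet of symbols, form short symbol words, and use the Shannon entropy of the word distribution as the anomaly measure. Broadband combustion noise visits many words and gives high entropy; the near-periodic limit cycle approaching thermoacoustic instability collapses the pattern set and lowers it. The alphabet size, word length, and test signals below are illustrative choices.

```python
# Symbolic time series sketch: quantile symbolization + word entropy.
import math
import random

def word_entropy(signal, n_symbols=4, word_len=3):
    """Shannon entropy (nats) of the distribution of symbol words."""
    srt = sorted(signal)
    cuts = [srt[(i * len(srt)) // n_symbols] for i in range(1, n_symbols)]
    symbols = [sum(v >= c for c in cuts) for v in signal]  # equal-frequency bins
    words = [tuple(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    total = len(words)
    return -sum(c / total * math.log(c / total) for c in counts.values())

rng = random.Random(1)
noise = [rng.gauss(0.0, 1.0) for _ in range(2000)]            # broadband "combustion noise"
tone = [math.sin(2 * math.pi * i / 20) for i in range(2000)]  # near-instability limit cycle
```

An online detector would evaluate this measure over a sliding window and flag the drop in entropy as the system approaches instability.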

  7. Pickering tool management system

    International Nuclear Information System (INIS)

    Wong, E.H.; Green, A.H.

    1997-01-01

    Tools were being deployed in the station with no process in effect to ensure that they were maintained in good repair so as to effectively support the performance of maintenance activities. Today's legal requirements, as specified in the Ontario Health and Safety Act, require that all employers have a process in place to ensure that tools are maintained in a safe condition. The Pickering Tool Management System has been chosen as the process at Pickering N.D. to manage tools. Tools are identified by number etching and bar codes. The system is a Windows application installed on several file servers

  8. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one

  9. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    Science.gov (United States)

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade, sensitivity analysis techniques have been shown to be very useful for analysing complex and high dimensional systems biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and fully addresses the analysis of oscillatory systems. It examines the sensitivity of models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting: the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models, including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix, it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and
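
    The simplest form of the sensitivity question PeTTSy addresses - how does the solution of an ODE model respond to a parameter perturbation? - can be sketched independently of the toolbox (PeTTSy itself is a MATLAB package with far more machinery): integrate the model at k + dk and k - dk and take a central finite difference. The decay model below is a toy with a known analytic answer, dx(t)/dk = -t·x0·e^(-kt).

```python
# Finite-difference parameter sensitivity of an ODE solution,
# using a plain fixed-step RK4 integrator.

def rk4(f, x0, t_end, k, n=1000):
    """Integrate dx/dt = f(x, k) from 0 to t_end and return x(t_end)."""
    h = t_end / n
    x = x0
    for _ in range(n):
        k1 = f(x, k)
        k2 = f(x + 0.5 * h * k1, k)
        k3 = f(x + 0.5 * h * k2, k)
        k4 = f(x + h * k3, k)
        x += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return x

def sensitivity(f, x0, t_end, k, dk=1e-6):
    """d x(t_end) / d k by central finite difference."""
    return (rk4(f, x0, t_end, k + dk) - rk4(f, x0, t_end, k - dk)) / (2.0 * dk)

decay = lambda x, k: -k * x     # dx/dt = -k x, so x(t) = x0 e^(-kt)
s = sensitivity(decay, x0=1.0, t_end=2.0, k=0.5)
```

Toolboxes like PeTTSy go well beyond this: they propagate sensitivity equations alongside the model, handle time-varying perturbation shapes, and report oscillator-specific outputs such as period and phase.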

  10. The integrated microbial genomes (IMG) system in 2007: data content and analysis tool extensions

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M.; Szeto, Ernest; Palaniappan, Krishna; Grechkin, Yuri; Chu, Ken; Chen, I-Min A.; Dubchak, Inna; Anderson, Iain; Lykidis, Athanasios; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2007-08-01

    The Integrated Microbial Genomes (IMG) system is a data management, analysis and annotation platform for all publicly available genomes. IMG contains both draft and complete JGI microbial genomes integrated with all other publicly available genomes from all three domains of life, together with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and annotating genomes, genes and functions, individually or in a comparative context. Since its first release in 2005, IMG's data content and analytical capabilities have been constantly expanded through quarterly releases. IMG is provided by the DOE-Joint Genome Institute (JGI) and is available from http://img.jgi.doe.gov.

  11. Instantaneous Purified Orbit: A New Tool for Analysis of Nonstationary Vibration of Rotor System

    Directory of Open Access Journals (Sweden)

    Shi Dongfeng

    2001-01-01

    In some circumstances, vibration signals of large rotating machinery possess time-varying characteristics to some extent. Traditional diagnosis methods, such as the FFT spectrum and the orbit diagram, face a huge challenge in dealing with this problem. This work studies four intrinsic drawbacks of conventional vibration signal processing methods and introduces the instantaneous purified orbit (IPO), based on the improved Fourier spectrum (IFS), to analyze nonstationary vibration. By integrating the benefits of the short period Fourier transform (SPFT) and the regular holospectrum, this method can intuitively reflect the vibration characteristics of a rotor system by means of parameter analysis of the corresponding frequency ellipses. Practical examples, such as transient vibration in run-up stages and the bistable condition of a rotor, show that IPO is a powerful tool for diagnosis and analysis of the vibration behavior of rotor systems.
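
    The "frequency ellipse" idea behind orbit/holospectrum methods can be sketched as follows (an independent illustration, not the IPO algorithm): estimate the complex Fourier coefficients of the two orthogonal shaft-probe signals at the rotation frequency over a short window, split the orbit into forward and backward whirl components, and read off the ellipse semi-axes. The rotation frequency, sampling step, and orbit amplitudes below are invented.

```python
# Orbit ellipse from two orthogonal probe signals via 1X Fourier coefficients.
import cmath
import math

def coeff(sig, freq, dt):
    """Complex coefficient C at freq, with sig[k] ~ Re(C * e^{i*2*pi*freq*k*dt})."""
    n = len(sig)
    w = 2.0 * math.pi * freq
    return 2.0 / n * sum(s * cmath.exp(-1j * w * k * dt) for k, s in enumerate(sig))

def orbit_ellipse(x, y, freq, dt):
    """Return (semi-major, semi-minor) of the orbit ellipse at freq."""
    X, Y = coeff(x, freq, dt), coeff(y, freq, dt)
    rf = abs(X + 1j * Y) / 2.0                           # forward whirl radius
    rb = abs(X.conjugate() + 1j * Y.conjugate()) / 2.0   # backward whirl radius
    return rf + rb, abs(rf - rb)

dt, freq = 0.001, 50.0               # 50 Hz rotation, invented numbers
t = [k * dt for k in range(1000)]    # window covering exactly 50 revolutions
x = [2.0 * math.cos(2 * math.pi * freq * s) for s in t]
y = [1.0 * math.sin(2 * math.pi * freq * s) for s in t]
```

Tracking these ellipse parameters window by window is what lets such methods follow nonstationary behavior, e.g. during run-up, where a single whole-record FFT smears the changes together.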

  12. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-27

    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  13. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only covers company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach, examining whether the AutoMealRecord data could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.

  14. Dizzy-Beats: a Bayesian evidence analysis tool for systems biology.

    Science.gov (United States)

    Aitken, Stuart; Kilpatrick, Alastair M; Akman, Ozgur E

    2015-06-01

    Model selection and parameter inference are complex problems of long-standing interest in systems biology. Selecting between competing models arises commonly as underlying biochemical mechanisms are often not fully known, hence alternative models must be considered. Parameter inference yields important information on the extent to which the data and the model constrain parameter values. We report Dizzy-Beats, a graphical Java Bayesian evidence analysis tool implementing nested sampling - an algorithm yielding an estimate of the log of the Bayesian evidence Z and the moments of model parameters, thus addressing two outstanding challenges in systems modelling. A likelihood function based on the L1-norm is adopted as it is generically applicable to replicated time series data. http://sourceforge.net/p/bayesevidence/home/Home/. © The Author 2015. Published by Oxford University Press.

  15. Design and testing of data analysis tool for ECRH systems in labview

    International Nuclear Information System (INIS)

    Patel, Jatinkumar; Patel, H.; Purohit, D.; Rajanbabu, N.; Mistry, H.; Shukla, B.K.

    2017-01-01

    Electron Cyclotron Resonance Heating (ECRH) is one of the essential RF heating subsystems used for pre-ionization and current drive experiments in SST-1. In SST-1, two gyrotron-based ECRH systems are installed: a 42 GHz gyrotron system capable of delivering 500 kW of RF power for 500 ms, and an 82.6 GHz gyrotron capable of delivering 200 kW continuously for 1000 seconds. A VME based data acquisition and control (DAC) system is installed with the gyrotron systems and is under operation with SST-1. This paper explains the basic features of the PXI DAC and the data analysis tool designed in LabVIEW 2014, along with the problems faced during the design and testing of data plots of different lengths. The tool offers various features for plotting data in different categories, i.e., plotting of fast-acquisition signal data to determine the sequence of gyrotron interlocks that occurred during gyrotron operations, conversion of the main binary file into 32 text files, plotting of all 32 analog signals in a single plot, and automatic updating of plots of selected important signals at the end of the shot

  16. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the total imported fuel used to secure all energy services in Alaska's remote microgrids by at least 50%, without increasing system life cycle costs, while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of the following four renewable energy optimization tools, with information drawn from the respective tool websites, tool developers, and author experience: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) Tool, and the Hybrid Optimization Model for Electric Renewables (HOMER).

  17. Users Guide for Fire Image Analysis System - Version 5.0: A Tool for Measuring Fire Behavior Characteristics

    Science.gov (United States)

    Carl W. Adkins

    1995-01-01

    The Fire Image Analysis System is a tool for quantifying flame geometry and relative position at selected points along a spreading line fire. At present, the system requires uniform terrain (constant slope). The system has been used in field and laboratory studies for determining flame length, depth, cross sectional area, and rate of spread.
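
    Of the quantities listed, the rate of spread lends itself to a minimal worked example: given time-stamped flame-front positions along the spread direction, the rate is the slope of position versus time. This sketch is illustrative only and is not part of the described system.

```python
def rate_of_spread(times_s, positions_m):
    """Least-squares slope of flame-front position (m) vs. time (s),
    i.e. the mean rate of spread in m/s."""
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_p = sum(positions_m) / n
    num = sum((t - mean_t) * (p - mean_p) for t, p in zip(times_s, positions_m))
    den = sum((t - mean_t) ** 2 for t in times_s)
    return num / den
```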

  18. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level ones, such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage performs pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays the statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today, and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool are analyzed.
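
    As an illustration of the event-reconstruction stage described above, which assigns masses to produce 4-vectors, the invariant mass of a two-particle system can be computed directly from those 4-vectors. This is a generic textbook calculation, not tied to any particular HEP framework.

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system given 4-vectors
    (E, px, py, pz) in natural units (c = 1)."""
    e = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))
```

    For example, two back-to-back massless particles of unit energy reconstruct to an invariant mass of 2 in the same units.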

  19. Analysis tools for simulation of hybrid systems; Herramientas de analisis para simulacion de sistemas hibridos

    Energy Technology Data Exchange (ETDEWEB)

    Guillen S, Omar; Mejia N, Fortino [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2005-07-01

    To facilitate and simplify the development and analysis of a hybrid system with respect to its design, construction, operation, and maintenance, it is best to simulate the system in software, which yields a significant reduction in investment costs. Given the mix of electrical generation technologies involved in a hybrid system, it is very important to have a tool integrated with specialized calculation software packages that allows simulation of the operational behavior of these systems. In addition, one must consider the operating characteristics, ease of use, the clarity of the results obtained, and the possibility of validating them against prototypes deployed in the field. It is likewise necessary to identify the tasks involved in relation to the installation site of this electrification technology. At present, hybrid systems technology is still at a development stage internationally, and important limitations exist in the availability of methodologies and engineering tools for the optimum design of these systems. This paper is intended to contribute to the advance of the technology and to provide in-house tools to solve the problems described. The article describes the activities with the greatest impact on the design and development of hybrid systems, as well as the identification of variables, basic characteristics, and the validation of tools, in the integration of a methodology for simulating these systems, thereby facilitating their design and development.

  20. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    Science.gov (United States)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    Entropy has found various applications in hydrology and water resources. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by providing determination of the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. As an evaluation tool, the statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, the SEA is to be extended for application to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all possible nitrogen compounds which may occur during the water treatment process are taken into account and quantified in their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs. The result of this management tool would be the determination of the efficiency of WWTPs.
By improving and optimizing the efficiency
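
    At the core of the statistical entropy analysis mentioned above is Shannon's entropy applied to the distribution of a substance over a process's output flows. A minimal sketch follows; the actual eSEA normalization used by the authors is more involved (it accounts for flow dilution), so this is only the basic idea.

```python
import math

def relative_statistical_entropy(substance_loads):
    """Shannon entropy of a substance's distribution over output flows,
    normalized to [0, 1]. 0 means the substance is fully concentrated
    in one flow; 1 means it is spread evenly over all flows."""
    total = sum(substance_loads)
    shares = [x / total for x in substance_loads if x > 0]
    if len(shares) <= 1 or len(substance_loads) <= 1:
        return 0.0
    h = -sum(s * math.log2(s) for s in shares)
    return h / math.log2(len(substance_loads))
```

    A process that routes all nitrogen into a single output flow scores 0 (maximal concentrating power), while one that splits it evenly scores 1.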

  1. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  2. A Conceptual Wing Flutter Analysis Tool for Systems Analysis and Parametric Design Study

    Science.gov (United States)

    Mukhopadhyay, Vivek

    2003-01-01

    An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate flutter instability boundaries of a typical wing, when detailed structural and aerodynamic data are not available. Effects of change in key flutter parameters can also be estimated in order to guide the conceptual design. This user-friendly software was developed using MathCad and Matlab codes. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on wing torsion stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch-inertia radius of gyration. These parametric plots were compiled in a Chance-Vought Corporation report from a database of past experiments and wind tunnel test results. An example was presented for conceptual flutter analysis of the outer wing of a Blended-Wing-Body aircraft.

  3. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
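
    The kind of variability analysis described, propagating component-property spread through a circuit model to a design margin, can be sketched with a simple Monte Carlo loop. The RC model, tolerances, and spec window below are invented for illustration and have nothing to do with the actual LDRD component.

```python
import random

def monte_carlo_yield(n_trials=10000, seed=1):
    """Fraction of a toy RC circuit population whose time constant stays
    within a +/-15% spec window when R and C each vary with a 5% sigma."""
    random.seed(seed)
    in_spec = 0
    for _ in range(n_trials):
        r = random.gauss(1000.0, 50.0)   # ohms, nominal 1 kOhm, 5% sigma
        c = random.gauss(1e-6, 5e-8)     # farads, nominal 1 uF, 5% sigma
        tau = r * c                      # time constant, nominal 1 ms
        if 0.85e-3 <= tau <= 1.15e-3:    # hypothetical spec window
            in_spec += 1
    return in_spec / n_trials
```

    A single nominal simulation would report the design as always in spec; the sampled population shows the margin actually available once component spread is accounted for.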

  4. Transportation routing analysis geographic information system -- TRAGIS, a multimodal transportation routing tool

    International Nuclear Information System (INIS)

    Johnson, P.E.

    1995-01-01

    Over 15 years ago, Oak Ridge National Laboratory (ORNL) developed two transportation routing models: HIGHWAY, which predicts truck transportation routes, and INTERLINE, which predicts rail transportation routes. Subsequent modifications have been made to enhance each of these models. Some of these changes include population density information for routes, HM-164 routing regulations for highway route controlled quantities of radioactive materials (RAM) truck shipments, and inclusion of waterway routing into INTERLINE. The AIRPORT model, developed 2 years after the HIGHWAY and INTERLINE models, serves as an emergency response tool. This model identifies the nearest airports from a designated location. Currently, the AIRPORT model is inactive. The Transportation Management Division of the US Department of Energy held a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models in April 1994 to bring together many users of these models and other experts in the transportation routing field to discuss these models and to decide on the capabilities that needed to be added. Of the many needs discussed, the primary one was to have the network databases within a geographic information system (GIS). As a result of the Baseline Requirements Session, the development of a new GIS model has been initiated. This paper will discuss the development of the new Transportation Routing Analysis GIS (TRAGIS) model at ORNL
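
    At their core, network routing models such as HIGHWAY and INTERLINE compute least-impedance paths over a transportation network. A generic shortest-path sketch is shown below; the graph representation and cost weights are illustrative, not the actual ORNL network databases or routing criteria.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over {node: [(neighbor, cost), ...]};
    returns (total_cost, path) or (inf, []) if goal is unreachable."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []
```

    In a real routing model the edge costs would encode distance, population density along the link, and regulatory constraints such as HM-164 preferred routes, rather than a single scalar weight.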

  5. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis:
    - HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files
    - Java RGA extractor: can load multiple SRS .ana files and extract pressure vs. time data
    - C++ Contamination Simulation code: a 3D particle tracing code for modeling the transport of dust particulates and molecules. It uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  6. Overview of the Systems Analysis Framework for the EU Bioeconomy. Deliverable 1.4 of the EU FP 7 SAT-BBE project Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy (SAT BBE)

    NARCIS (Netherlands)

    Leeuwen, van M.G.A.; Meijl, van H.; Smeets, E.M.W.; Tabeau-Kowalska, E.W.

    2014-01-01

    In November 2012 the Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy project (SAT-BBE) was launched with the purpose of designing an analysis tool useful for monitoring the evolution and impacts of the bioeconomy. In the SAT-BBE project the development of the analysis tool for the

  7. Addressing the alarm analysis barrier - a tool for improving alarm systems

    International Nuclear Information System (INIS)

    Davey, E.C.; Basso, R.A.; Feher, M.P.

    1995-01-01

    This paper describes a software application tool for the initial specification and maintenance of the thousands of alarms in nuclear and other process control plants. The software program is used by system designers and maintainers to analyze, characterize, record and maintain the alarm information and configuration decisions for an alarm system. The tool provides a comprehensive design and information handling environment for: the existing alarm functions in current CANDU plants; the new alarm processing and presentation concepts developed under CANDU Owners Group (COG) sponsorship that are available to be applied to existing CANDU plants on a retrofit basis; and, the alarm functions to be implemented in new CANDU plants. (author). 3 refs., 1 fig

  8. Motion analysis systems as optimization training tools in combat sports and martial arts

    Directory of Open Access Journals (Sweden)

    Ewa Polak

    2016-01-01

    Introduction: Over the past years, a few review papers about the possibilities of using motion analysis systems in sport have been published, but there are no articles that discuss this problem in the field of combat sports and martial arts. Aim: This study presents the diversity of contemporary motion analysis systems, both those that are used in scientific research and those that can be applied in the daily work of coaches and athletes in combat sports and martial arts. An additional aim is the indication of example applications in scientific research and the range of applications in optimizing the training process. It presents a brief description of each type of system currently used in sport, specific examples of systems, and the main advantages and disadvantages of using them. The presentation and discussion take place in the following sections: motion analysis utility for combat sports and martial arts, systems using digital video, and systems using markers, sensors or transmitters. Conclusions: Not all types of motion analysis systems used in sport are suitable for combat sports and martial arts. Scientific studies conducted so far have shown the usefulness of video-based, optical and electromechanical systems. The use of research results made with complex motion analysis systems, or with simple systems offering local application and immediate visualization, is important for the preparation of training and its optimization. It may lead to technical and tactical improvement in athletes as well as the prevention of injuries in combat sports and martial arts.

  9. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
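
    As an illustration of the frequency response measure concept, the per-event response is the MW change divided by the frequency change expressed in units of 0.1 Hz, and a baseline can be derived from a set of events. This sketch is deliberately simplified: the exact BAL-003-1 Value A/Value B averaging windows and the FRAT's actual baseline method are not reproduced here, and the median baseline is an assumption for illustration.

```python
def frequency_response(delta_mw, f_before_hz, f_after_hz):
    """Frequency response in MW per 0.1 Hz for one under-frequency event.
    delta_mw: MW change attributed to the responding area (assumption:
    sign chosen so that a helpful response is positive)."""
    delta_f = f_before_hz - f_after_hz     # positive for a frequency drop
    return delta_mw / (delta_f * 10.0)     # express per 0.1 Hz

def baseline(events):
    """Median of per-event responses as a simple baseline estimate."""
    vals = sorted(frequency_response(*e) for e in events)
    n = len(vals)
    mid = n // 2
    return vals[mid] if n % 2 else 0.5 * (vals[mid - 1] + vals[mid])
```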

  10. Bifurcation Tools for Flight Dynamics Analysis and Control System Design, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Modern bifurcation analysis methods have been proposed for investigating flight dynamics and control system design in highly nonlinear regimes and also for the...

  11. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
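
    A minimal version of the event-queue mechanism described, time-stamped events popped in time order until the queue is empty, can be sketched as follows. This is generic discrete-event machinery for illustration, not the disclosed tool.

```python
import heapq

class Simulator:
    """Minimal discrete-event simulator: schedule(delay, action) enqueues
    a time-stamped event; run() pops events in time order until the
    event queue is emptied."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0                  # tie-breaker for equal timestamps

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)               # an action may schedule further events
```

    Because an action can schedule new events relative to the current simulation time, continuous behavior can be approximated by chains of delayed events, which is the essence of the invocation-statement/time-delay scheme the abstract describes.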

  12. Analysis of appraisal tool of system security engineering capability maturity based on component

    International Nuclear Information System (INIS)

    Liu Zhenghai; Yang Xiaohua; Zou Shuliang; Liu Yachun; Xiao Jiantian; Liu Zhiming

    2012-01-01

    Spent fuel reprocessing is part of the nuclear fuel cycle and an inevitable element of the sustainable development of nuclear power. Reprocessing must deal with radiological, criticality, and chemical hazards. Besides the traditional appraisal methods based on security goals, the process-based appraisal method of the system security engineering capability maturity model is a beneficial supplement. Experts must check and approve large numbers of documents during an appraisal based on the system security engineering capability maturity model, so it is necessary to develop a tool to assist the experts in completing the appraisal. Developing software based on components is highly effective, nimble, and reliable. Component technology is analyzed, methods for extracting domain components and general components are introduced, and the appraisal system is developed based on component technology. (authors)

  13. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  14. Ergonomic work analysis as a tool of prevention for the occupational safety and health management system.

    Science.gov (United States)

    de Miranda Prottes, Verônica; Oliveira, Nádia Cristina; de Oliveira Andrade, Alessandra Barbosa

    2012-01-01

    This paper introduces the Ergonomic Work Analysis as a relevant instrument to identify risks in occupational environments through the investigation of factors that influence the relationship between the worker and the productive process. It draws a parallel between the several aspects of risk identification in traditional tools of Health and Safety Management and the factors embraced by the Ergonomic Work Analysis, showing that the ergonomic methodology is able to go deeper into the scenarios of possible incident causes. This deepening enables the establishment of a relationship between the work context and the resulting damage to the physical integrity of the worker, and acts as a complementary instrument to the traditional approach to risk management. In order to illustrate the application of this methodology in a preventive way, a case study of a coal mill inspector in a steel-making company is presented.

  15. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  16. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel

    2014-01-01

    As governments, citizens and organizations have moved online there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks that structure the inter-connections between these sites. Including contributions from a range of academic disciplines including Political Science, Media and Communication Studies, Economics, and Computer Science, this study showcases a new methodological approach that has been expressly designed to capture

  17. MOSES – A modelling tool for the analysis of scenarios of the European electricity supply system

    Directory of Open Access Journals (Sweden)

    Agert C.

    2012-10-01

    Recent studies have shown that a transition of the current power supply system in Europe to a system almost entirely based on fluctuating Renewable Energy Sources (RES) by mid-century is possible. However, most of these scenarios require a significant amount of back-up power capacities to ensure the security of electricity supply. This would imply high additional investments and operating costs. Hence, alternative options should be investigated first. Here we present a first outlook of our simulation model MOSES, which will be able to analyse different target states of the European electricity system in 2050. In this model, long-term meteorological data series are used to optimise the capacity mix of RES in Europe. One of the main elements of our tool is a simplified electricity network. In addition, alternative options for the reduction of additional back-up power, like the expansion of the transmission grid, the use of demand-side management and/or the installation of over-capacities, will be implemented. The results will be used to provide scientifically proven recommendations to policy makers for a reliable energy supply system in Europe based on Renewable Energy Sources.

  18. Application of Diagnostic Analysis Tools to the Ares I Thrust Vector Control System

    Science.gov (United States)

    Maul, William A.; Melcher, Kevin J.; Chicatelli, Amy K.; Johnson, Stephen B.

    2010-01-01

    The NASA Ares I Crew Launch Vehicle is being designed to support missions to the International Space Station (ISS), to the Moon, and beyond. The Ares I is undergoing design and development utilizing commercial-off-the-shelf tools and hardware when applicable, along with cutting-edge launch technologies and state-of-the-art design and development. In support of the vehicle's design and development, the Ares Functional Fault Analysis group was tasked to develop an Ares Vehicle Diagnostic Model (AVDM) and to demonstrate the capability of that model to support failure-related analyses and design integration. One important component of the AVDM is the Upper Stage (US) Thrust Vector Control (TVC) diagnostic model, a representation of the failure space of the US TVC subsystem. This paper first presents an overview of the AVDM, its development approach, and the software used to implement the model and conduct diagnostic analysis. It then uses the US TVC diagnostic model to illustrate details of the development, implementation, analysis, and verification processes. Finally, the paper describes how the AVDM model can impact both design and ground operations, and how some of these impacts are being realized during discussions of US TVC diagnostic analyses with US TVC designers.

  19. Safety analysis and review system: a Department of Energy safety assurance tool

    International Nuclear Information System (INIS)

    Rosenthal, H.B.

    1981-01-01

    The concept of the Safety Analysis and Review System is not new. It has been used within the Department and its predecessor agencies, Atomic Energy Commission (AEC) and Energy Research and Development Administration (ERDA), for over 20 years. To minimize the risks from nuclear reactor and power plants, the AEC developed a process to support management authorization of each operation through identification and analysis of potential hazards and the measures taken to control them. As the agency evolved from AEC through ERDA to the Department of Energy, its responsibilities were broadened to cover a diversity of technologies, including those associated with the development of fossil, solar, and geothermal energy. Because the safety analysis process had proved effective in a technology of high potential hazard, the Department investigated the applicability of the process to the other technologies. This paper describes the system and discusses how it is implemented within the Department

  20. Material Flow Analysis as a Tool to improve Waste Management Systems: The Case of Austria.

    Science.gov (United States)

    Allesch, Astrid; Brunner, Paul H

    2017-01-03

    This paper demonstrates the power of material flow analysis (MFA) for designing waste management (WM) systems and for supporting decisions with regards to given environmental and resource goals. Based on a comprehensive case study of a nationwide WM-system, advantages and drawbacks of a mass balance approach are discussed. Using the software STAN, a material flow system comprising all relevant inputs, stocks and outputs of wastes, products, residues, and emissions is established and quantified. Material balances on the level of goods and selected substances (C, Cd, Cr, Cu, Fe, Hg, N, Ni, P, Pb, Zn) are developed to characterize this WM-system. The MFA results serve well as a base for further assessments. Based on given goals, stakeholders engaged in this study selected the following seven criteria for evaluating their WM-system: (i) waste input into the system, (ii) export of waste (iii) gaseous emissions from waste treatment plants, (iv) long-term gaseous and liquid emissions from landfills, (v) waste being recycled, (vi) waste for energy recovery, (vii) total waste landfilled. By scenario analysis, strengths and weaknesses of different measures were identified. The results reveal the benefits of a mass balance approach due to redundancy, data consistency, and transparency for optimization, design, and decision making in WM.
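
    The mass-balance principle underlying MFA, inputs equal outputs plus stock change for every process, can be expressed as a simple consistency check. The flow names and tonnages below are invented for illustration; STAN itself additionally handles layered substance balances, uncertainty, and data reconciliation.

```python
def balance_error(inputs, outputs, stock_change=0.0):
    """Relative mass-balance error of one process:
    (sum of inputs - sum of outputs - stock change) / sum of inputs."""
    total_in = sum(inputs.values())
    total_out = sum(outputs.values())
    return (total_in - total_out - stock_change) / total_in

# Hypothetical sorting plant: 1% of the input mass is unaccounted for.
sorting_error = balance_error(
    inputs={"mixed_waste_t": 1000.0},
    outputs={"recyclables_t": 420.0,
             "refuse_derived_fuel_t": 300.0,
             "residues_to_landfill_t": 270.0},
    stock_change=0.0,
)
```

    Checking every process in the system this way is what gives an MFA its redundancy: an error far from zero flags inconsistent or missing data before any scenario evaluation is attempted.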

  1. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  2. State-of-the-art Tools and Techniques for Quantitative Modeling and Analysis of Embedded Systems

    DEFF Research Database (Denmark)

    Bozga, Marius; David, Alexandre; Hartmanns, Arnd

    2012-01-01

    This paper surveys well-established and recent tools and techniques developed for the design of rigorous embedded systems. We will first survey UPPAAL and MODEST, two tools capable of dealing with both timed and stochastic aspects. Then, we will overview the BIP framework for modular design...

  3. Ground Data System Analysis Tools to Track Flight System State Parameters for the Mars Science Laboratory (MSL) and Beyond

    Science.gov (United States)

    Allard, Dan; Deforrest, Lloyd

    2014-01-01

    Flight software parameters give space mission operators fine-tuned control over flight system configurations, enabling rapid and dynamic changes to ongoing science activities in a much more flexible manner than can be accomplished with (otherwise broadly used) configuration-file-based approaches. The Mars Science Laboratory (MSL), Curiosity, makes extensive use of parameters to support complex, daily activities via commanded changes to said parameters in memory. However, as the loss of Mars Global Surveyor (MGS) in 2006 demonstrated, flight system management by parameters brings with it risks, including the possibility of losing track of the flight system configuration and the threat of invalid command executions. To mitigate this risk a growing number of missions, including MSL and the Soil Moisture Active Passive (SMAP) mission, have funded efforts to implement parameter state tracking software tools and services. This paper will discuss the engineering challenges and resulting software architecture of MSL's onboard parameter state tracking software and discuss the road forward to make parameter management tools suitable for use on multiple missions.
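
    The ground-side tracking idea described here can be illustrated with a small sketch: keep a model of the expected onboard value of each parameter, updated from the command history, and compare it against downlinked telemetry. All class, parameter, and field names are hypothetical, not MSL's actual software.

```python
# Hypothetical sketch of ground-side parameter state tracking: maintain the
# expected onboard value of each flight software parameter from the command
# history, then flag mismatches against downlinked telemetry.

class ParameterTracker:
    def __init__(self, defaults):
        self.expected = dict(defaults)     # ground's model of onboard state
        self.history = []                  # audit trail of commanded changes

    def command_set(self, name, value, sclk):
        """Record a commanded parameter update (sclk = spacecraft clock)."""
        self.history.append((sclk, name, self.expected.get(name), value))
        self.expected[name] = value

    def check_telemetry(self, reported):
        """Return parameters whose downlinked value differs from expected."""
        return {n: (self.expected.get(n), v)
                for n, v in reported.items()
                if self.expected.get(n) != v}

tracker = ParameterTracker({"HEATER_SETPOINT": 20, "COMM_WINDOW_S": 600})
tracker.command_set("HEATER_SETPOINT", 25, sclk=414900000)
mismatches = tracker.check_telemetry({"HEATER_SETPOINT": 20,
                                      "COMM_WINDOW_S": 600})
print(mismatches)   # the commanded change has not taken effect onboard
```

    A mismatch like the one printed here is exactly the "lost track of configuration" risk the abstract describes: the ground believes a value was changed, but telemetry shows the old value still in effect.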

  4. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  5. Climate Data Analysis Tools - (CDAT)

    Science.gov (United States)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and as a result of its flexibility, PCMDI has integrated other popular software components, such as: the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS), together with ANL's Globus middleware.

  6. VALIDATION OF THE TOOL "ENTERPRISE ARCHITECTURE ANALYSIS TOOL"

    OpenAIRE

    Österlind, Magnus

    2011-01-01

    The Enterprise Architecture Analysis Tool, EAAT, is a software tool developed by the department of Industrial Information- and Control systems, ICS, at the Royal Institute of Technology, Stockholm, Sweden. EAAT is a modeling tool that combines Enterprise Architecture (EA) modeling with probabilistic relational modeling. Therefore EAAT makes it possible to design, describe and analyze the organizational structure, business processes, information systems and infrastructure within an enterprise....

  7. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    2000-01-01

    The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules. They are: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progress variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the 38th AIAA Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the 38th AIAA Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time of simulating the (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor will also be reported.

  8. The System Cost Model: A tool for life cycle cost and risk analysis

    International Nuclear Information System (INIS)

    Hsu, K.; Lundeen, A.; Shropshire, D.; Sherick, M.

    1996-01-01

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors began development of the System Cost Model (SCM) application. The SCM estimates life cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, and transuranic waste. The SCM uses parametric cost functions to estimate life cycle costs for various treatment, storage, and disposal modules which reflect planned and existing waste management facilities at DOE installations. In addition, SCM can model new TSD facilities based on capacity needs over the program life cycle. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction, operations and maintenance, and decommissioning these waste management facilities. The SCM also provides transportation costs for DOE wastes. Transportation costs are provided for truck and rail and include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. A complement to the SCM is the System Cost Model-Risk (SCM-R) model, which provides relative Environmental, Safety, and Health (ES and H) risk information. A relative ES and H risk basis has been developed and applied by LITCO at the INEL. The risk basis is now being automated in the SCM-R to facilitate rapid risk analysis of system alternatives. The added risk functionality will allow combined cost and risk evaluation of EM alternatives.
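
    The parametric estimating approach can be sketched as follows; the cost equations and coefficients are invented for illustration and are not the SCM's actual functions or data.

```python
# Illustrative parametric life cycle cost roll-up in the spirit of the SCM:
# each phase cost is a simple function of waste volume and operating period.
# Coefficients and functional forms are placeholders, not SCM data.

def module_life_cycle_cost(volume_m3, years):
    """Estimate life cycle cost (in $M) of one TSD module from waste volume."""
    preops = 5.0 + 0.002 * volume_m3            # design / pre-operations
    construction = 20.0 + 0.01 * volume_m3 ** 0.9   # economy of scale
    om = years * (1.5 + 0.0005 * volume_m3)     # operations & maintenance
    decommissioning = 0.3 * construction
    return preops + construction + om + decommissioning

# Roll up two facilities in a hypothetical complex configuration
total = sum(module_life_cycle_cost(v, years=20)
            for v in [50_000, 120_000])
print(f"complex life cycle cost: {total:.1f} $M")
```

    Summing such module estimates over a user-specified complex configuration is what lets a tool like the SCM compare system alternatives quickly.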

  9. Dynamic analysis and vibration testing of CFRP drive-line system used in heavy-duty machine tool

    Directory of Open Access Journals (Sweden)

    Mo Yang

    2018-03-01

    Low critical rotary speed and large vibration in the metal drive-line system of heavy-duty machine tool affect the machining precision seriously. Replacing metal drive-line with the CFRP drive-line can effectively solve this problem. Based on the composite laminated theory and the transfer matrix method (TMM), this paper puts forward a modified TMM to analyze dynamic characteristics of CFRP drive-line system. With this modified TMM, the CFRP drive-line of a heavy vertical miller is analyzed. And the finite element modal analysis model of the shafting is established. The results of the modified TMM and finite element analysis (FEA) show that the modified TMM can effectively predict the critical rotary speed of CFRP drive-line. And the critical rotary speed of CFRP drive-line is 20% higher than that of the original metal drive-line. Then, the vibration of the CFRP and the metal drive-line were tested. The test results show that application of the CFRP drive shaft in the drive-line can effectively reduce the vibration of the heavy-duty machine tool.
    Keywords: CFRP drive-line system, Dynamic behavior, Transfer matrix, Vibration measurement

  10. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    and extensions still remain, which are the focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models, which both depend on the possibility of accounting for propagation delay. Since the current version...

  11. Dynamic analysis and vibration testing of CFRP drive-line system used in heavy-duty machine tool

    Science.gov (United States)

    Yang, Mo; Gui, Lin; Hu, Yefa; Ding, Guoping; Song, Chunsheng

    2018-03-01

    Low critical rotary speed and large vibration in the metal drive-line system of heavy-duty machine tool affect the machining precision seriously. Replacing metal drive-line with the CFRP drive-line can effectively solve this problem. Based on the composite laminated theory and the transfer matrix method (TMM), this paper puts forward a modified TMM to analyze dynamic characteristics of CFRP drive-line system. With this modified TMM, the CFRP drive-line of a heavy vertical miller is analyzed. And the finite element modal analysis model of the shafting is established. The results of the modified TMM and finite element analysis (FEA) show that the modified TMM can effectively predict the critical rotary speed of CFRP drive-line. And the critical rotary speed of CFRP drive-line is 20% higher than that of the original metal drive-line. Then, the vibration of the CFRP and the metal drive-line were tested. The test results show that application of the CFRP drive shaft in the drive-line can effectively reduce the vibration of the heavy-duty machine tool.
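
    A generic transfer matrix calculation of a first critical speed can be sketched as below for a pinned-pinned shaft carrying one lumped mass. This is a standard Myklestad/Prohl-style illustration of the TMM, not the authors' modified method for composite (CFRP) laminates.

```python
import numpy as np

# Transfer matrix method (TMM) sketch: state vector [deflection y, slope,
# bending moment M, shear V] is propagated across shaft segments and lumped
# masses; a critical speed is where the boundary-condition determinant
# vanishes. All dimensions and properties below are illustrative.

def field(l, EI):
    """Transfer matrix of a massless elastic shaft segment of length l."""
    return np.array([[1, l, l**2/(2*EI), l**3/(6*EI)],
                     [0, 1, l/EI,        l**2/(2*EI)],
                     [0, 0, 1,           l],
                     [0, 0, 0,           1.0]])

def point(m, omega):
    """Point matrix of a lumped mass m whirling at angular speed omega."""
    P = np.eye(4)
    P[3, 0] = m * omega**2      # inertia load appears as a shear jump
    return P

def residual(omega, m, L, EI):
    """Pinned-pinned boundary determinant; zero at a critical speed."""
    U = field(L/2, EI) @ point(m, omega) @ field(L/2, EI)
    # Left end: y = M = 0, unknowns are slope and shear (columns 1 and 3);
    # require y = M = 0 at the right end as well.
    return U[0, 1]*U[2, 3] - U[0, 3]*U[2, 1]

def critical_speed(m, L, EI, hi=500.0, tol=1e-8):
    lo = 1e-3
    while hi - lo > tol:                       # bisect on the sign change
        mid = 0.5*(lo + hi)
        if residual(lo, m, L, EI) * residual(mid, m, L, EI) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5*(lo + hi)

# Central mass on a massless shaft: exact omega = sqrt(48*EI/(m*L**3))
m, L, EI = 10.0, 1.0, 1000.0
print(critical_speed(m, L, EI), np.sqrt(48*EI/(m*L**3)))
```

    For this single-mass case the TMM result reproduces the closed-form critical speed; the paper's contribution is replacing the isotropic EI with bending stiffness derived from composite laminated theory.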

  12. Integrating Reliability Analysis with a Performance Tool

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
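
    The benefit of one model serving both kinds of study can be illustrated with a toy simulation that reports a performance metric and a reliability metric from the same runs; the failure and service rates below are invented, and this is not the paper's tool.

```python
import random

# Toy integrated model: a single Monte Carlo simulation yields both a
# performance measure (jobs completed) and a reliability measure
# (probability the server survives the mission). Rates are illustrative.

def simulate(hours, fail_rate, jobs_per_hour, rng):
    done = 0
    for _ in range(hours):
        if rng.random() < fail_rate:       # permanent failure this hour
            return done, False
        done += jobs_per_hour
    return done, True

rng = random.Random(42)
runs = [simulate(100, 0.002, 5, rng) for _ in range(10_000)]
throughput = sum(d for d, _ in runs) / len(runs)
reliability = sum(ok for _, ok in runs) / len(runs)
print(f"mean jobs: {throughput:.1f}, reliability: {reliability:.3f}")
```

    Because both metrics come from the same model, there is no risk of the performance and reliability models drifting apart as the design evolves, which is the integration argument made above.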

  13. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  14. Bottom-Up Energy Analysis System (BUENAS). An international appliance efficiency policy tool

    Energy Technology Data Exchange (ETDEWEB)

    McNeil, M.A.; Letschert, V.E.; De la Rue du Can, S.; Ke, Jing [Lawrence Berkeley National Laboratory LBNL, 1 Cyclotron Rd, Berkeley, CA (United States)

    2013-02-15

    The Bottom-Up Energy Analysis System (BUENAS) calculates potential energy and greenhouse gas emission impacts of efficiency policies for lighting, heating, ventilation, and air conditioning, appliances, and industrial equipment through 2030. The model includes 16 end use categories and covers 11 individual countries plus the European Union. BUENAS is a bottom-up stock accounting model that predicts energy consumption for each type of equipment in each country according to engineering-based estimates of annual unit energy consumption, scaled by projections of equipment stock. Energy demand in each scenario is determined by equipment stock, usage, intensity, and efficiency. When available, BUENAS uses sales forecasts taken from country studies to project equipment stock. Otherwise, BUENAS uses an econometric model of household appliance uptake developed by the authors. Once the business as usual scenario is established, a high-efficiency policy scenario is constructed that includes an improvement in the efficiency of equipment installed in 2015 or later. Policy case efficiency targets represent current 'best practice' and include standards already established in a major economy or well-defined levels known to enjoy a significant market share in a major economy. BUENAS calculates energy savings according to the difference in energy demand in the two scenarios. Greenhouse gas emission mitigation is then calculated using a forecast of electricity carbon factor. We find that adopting current best-practice appliance efficiency policies could mitigate 1075 Mt of annual CO2 emissions by 2030, a 17% reduction relative to business-as-usual emissions in that year.
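
    The stock-accounting arithmetic behind such a model reduces to stock multiplied by unit energy consumption, with savings as the difference between the business-as-usual and policy scenarios. All numbers below are invented for illustration, not BUENAS inputs.

```python
# Bottom-up stock accounting sketch: national energy demand = equipment
# stock x unit energy consumption (UEC); policy savings apply only to
# units installed after the standard takes effect (here, 2015).

def demand_twh(stock_millions, uec_kwh):
    """Annual demand in TWh from stock (millions of units) and UEC (kWh/yr)."""
    return stock_millions * 1e6 * uec_kwh / 1e9

# Refrigerators in a hypothetical country, 2030 stock split by vintage
pre_policy_stock = 20.0    # million units installed before 2015
post_policy_stock = 35.0   # million units installed 2015+, covered by policy
uec_bau, uec_policy = 450.0, 320.0   # kWh/yr per unit in each scenario

bau = demand_twh(pre_policy_stock + post_policy_stock, uec_bau)
eff = (demand_twh(pre_policy_stock, uec_bau)
       + demand_twh(post_policy_stock, uec_policy))
savings = bau - eff                        # TWh/yr in 2030
co2_mt = savings * 0.6                     # assumed grid factor, Mt CO2/TWh
print(f"savings: {savings:.2f} TWh/yr, {co2_mt:.2f} Mt CO2/yr")
```

    Repeating this calculation over 16 end uses and 12 economies, with forecast stock and carbon factors, is essentially how the scenario difference in the abstract is aggregated.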

  15. OCAM - A CELSS modeling tool: Description and results. [Object-oriented Controlled Ecological Life Support System Analysis and Modeling

    Science.gov (United States)

    Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray

    1992-01-01

    Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
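
    The container/converter decomposition can be sketched as a dictionary of element masses plus a function that moves fixed daily amounts between containers; the containers, amounts, and stoichiometry below are placeholders, not OCAM's data.

```python
# Sketch of the container/converter idea attributed to OCAM: containers hold
# element masses (C, H, O), converters move daily amounts between them, and
# total system mass must be conserved. All quantities are illustrative.

containers = {
    "CO2": {"C": 12.0, "O": 32.0},             # kg of each element held
    "biomass": {"C": 3.0, "H": 0.5, "O": 4.0},
    "O2": {"O": 10.0},
}

def run_converter(src, dst, transfer):
    """Move element masses (kg/day) from one container to another."""
    for element, kg in transfer.items():
        containers[src][element] -= kg
        containers[dst].setdefault(element, 0.0)
        containers[dst][element] += kg

def total_mass():
    return sum(kg for c in containers.values() for kg in c.values())

before = total_mass()
# One day of (highly idealized) plant growth in the CELSS loop:
run_converter("CO2", "biomass", {"C": 0.3})    # carbon fixed into plants
run_converter("CO2", "O2", {"O": 0.8})         # oxygen released to cabin
print(before, total_mass())                    # the daily mass balance closes
```

    Tracking system-level totals of such containers day by day is what produces the time-history graphs of biomass, carbon dioxide, and oxygen described above.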

  16. A New Approach in Applying Systems Engineering Tools and Analysis to Determine Hepatocyte Toxicogenomics Risk Levels to Human Health.

    Science.gov (United States)

    Gigrich, James; Sarkani, Shahryar; Holzer, Thomas

    2017-03-01

    There is an increasing backlog of potentially toxic compounds that cannot be evaluated with current animal-based approaches in a cost-effective and expeditious manner, thus putting human health at risk. Extrapolation of animal-based test results for human risk assessment often leads to different physiological outcomes. This article introduces the use of quantitative tools and methods from systems engineering to evaluate the risk of toxic compounds by analyzing the amount of stress that human hepatocytes undergo in vitro when metabolizing GW7647 over extended times and concentrations. Hepatocytes are highly interconnected systems, which makes it challenging to interpret high-dimensional genomics data to determine risk of exposure. Gene expression data of peroxisome proliferator-activated receptor-α (PPARα) binding were measured over multiple concentrations and exposure times of GW7647, leveraging the Mahalanobis distance to establish toxicity risk threshold levels. The application of these novel systems engineering tools provides new insight into the intricate workings of human hepatocytes to determine risk threshold levels from exposure. This approach is beneficial to decision makers and scientists, and it can help reduce the backlog of untested chemical compounds due to the high cost and inefficiency of animal-based models.
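
    The Mahalanobis distance used for thresholding can be sketched as below: the distance of a treated expression profile from a control population, scaled by the control covariance. The data here are synthetic, not the study's measurements.

```python
import numpy as np

# Mahalanobis-distance sketch for a toxicity threshold: how far a treated
# sample sits from the control population, in covariance-scaled units.
# The gene-expression data below are synthetic placeholders.

rng = np.random.default_rng(0)
controls = rng.normal(0.0, 1.0, size=(50, 3))    # 50 control samples, 3 genes
mu = controls.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(controls, rowvar=False))

def mahalanobis(x):
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

treated = np.array([4.0, 4.0, 4.0])              # strongly shifted profile
print(mahalanobis(mu), mahalanobis(treated))     # ~0 vs. large
# Flag "high risk" when the distance exceeds a chosen threshold:
print(mahalanobis(treated) > 3.0)
```

    Because the covariance matrix captures correlations between genes, this distance accounts for the interconnectedness of the expression data in a way a per-gene cutoff would not.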

  17. A Multi-disciplinary Tool for Space Launch Systems Propulsion Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An accurate predictive capability of coupled fluid-structure interaction in propulsion system is crucial in the development of NASA's new Space Launch System (SLS)....

  18. Passivity and Dissipativity as Design and Analysis Tools for Networked Control Systems

    Science.gov (United States)

    Yu, Han

    2012-01-01

    In this dissertation, several control problems are studied that arise when passive or dissipative systems are interconnected and controlled over a communication network. Since communication networks can impact the systems' stability and performance, there is a need to extend the results on control of passive or dissipative systems to networked…

  19. Data Analysis of Heating Systems for Buildings—A Tool for Energy Planning, Policies and Systems Simulation

    Directory of Open Access Journals (Sweden)

    Michel Noussan

    2018-01-01

    Full Text Available Heating and cooling in buildings is a central aspect for adopting energy efficiency measures and implementing local policies for energy planning. The knowledge of features and performance of those existing systems is fundamental to conceiving realistic energy savings strategies. Thanks to Information and Communication Technologies (ICT development and energy regulations’ progress, the amount of data able to be collected and processed allows detailed analyses on entire regions or even countries. However, big data need to be handled through proper analyses, to identify and highlight the main trends by selecting the most significant information. To do so, careful attention must be paid to data collection and preprocessing, for ensuring the coherence of the associated analyses and the accuracy of results and discussion. This work presents an insightful analysis on building heating systems of the most populated Italian region—Lombardy. From a dataset of almost 2.9 million of heating systems, selected reference values are presented, aiming at describing the features of current heating systems in households, offices and public buildings. Several aspects are considered, including the type of heating systems, their thermal power, fuels, age, nominal and measured efficiency. The results of this work can be a support for local energy planners and policy makers, and for a more accurate simulation of existing energy systems in buildings.
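
    The kind of aggregation that turns millions of registry records into reference values can be sketched as below; the five records and their fields are invented, and a real registry such as the one analyzed here would hold millions of entries.

```python
from collections import defaultdict
from statistics import mean

# Sketch of deriving reference values from a heating-system registry:
# group records by fuel and report count, mean thermal power and mean
# efficiency per group. Records and field names are illustrative.

records = [
    {"fuel": "natural_gas", "power_kw": 24.0,  "efficiency": 0.92},
    {"fuel": "natural_gas", "power_kw": 28.0,  "efficiency": 0.95},
    {"fuel": "oil",         "power_kw": 30.0,  "efficiency": 0.88},
    {"fuel": "natural_gas", "power_kw": 110.0, "efficiency": 0.90},
    {"fuel": "oil",         "power_kw": 45.0,  "efficiency": 0.85},
]

groups = defaultdict(list)
for r in records:
    groups[r["fuel"]].append(r)

summary = {
    fuel: {"n": len(rs),
           "mean_kw": mean(r["power_kw"] for r in rs),
           "mean_eff": mean(r["efficiency"] for r in rs)}
    for fuel, rs in groups.items()
}
print(summary)
```

    In practice the preprocessing step stressed in the abstract (removing implausible powers, duplicate plant IDs, missing efficiencies) matters as much as the aggregation itself, since outliers skew the reference values handed to energy planners.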

  20. Design and Analysis Tools for Deployable Solar Array Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  1. Bifurcation Tools for Flight Dynamics Analysis and Control System Design, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the project is the development of a computational package for bifurcation analysis and advanced flight control of aircraft. The development of...

  2. Design and Analysis Tools for Deployable Solar Array Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  3. Slide system for machine tools

    Science.gov (United States)

    Douglass, S.S.; Green, W.L.

    1980-06-12

    The present invention relates to a machine tool which permits the machining of nonaxisymmetric surfaces on a workpiece while rotating the workpiece about a central axis of rotation. The machine tool comprises a conventional two-slide system (X-Y) with one of these slides being provided with a relatively short travel high-speed auxiliary slide which carries the material-removing tool. The auxiliary slide is synchronized with the spindle speed and the position of the other two slides and provides a high-speed reciprocating motion required for the displacement of the cutting tool for generating a nonaxisymmetric surface at a selected location on the workpiece.

  4. IQARIS : a tool for the intelligent querying, analysis, and retrieval from information systems

    International Nuclear Information System (INIS)

    Hummel, J. R.; Silver, R. B.

    2002-01-01

    Information glut is one of the primary characteristics of the electronic age. Managing such large volumes of information (e.g., keeping track of the types, where they are, their relationships, who controls them, etc.) can be done efficiently with an intelligent, user-oriented information management system. The purpose of this paper is to describe a concept for managing information resources based on an intelligent information technology system developed by the Argonne National Laboratory for managing digital libraries. The Argonne system, Intelligent Query (IQ), enables users to query digital libraries and view the holdings that match the query from different perspectives

  5. LCA of waste management systems: Development of tools for modeling and uncertainty analysis

    DEFF Research Database (Denmark)

    Clavreul, Julie

    Since the late 1990s, life cycle assessment (LCA) has been increasingly applied to waste management to quantify direct, indirect and avoided impacts from various treatment options. The construction of inventories for waste management systems differs from classical product-LCAs in that (1)...

  6. An interactive visualization tool for the analysis of multi-objective embedded systems design space exploration

    NARCIS (Netherlands)

    Taghavi, T.; Pimentel, A.D.

    2011-01-01

    The design of today’s embedded systems involves a complex Design Space Exploration (DSE) process. Typically, multiple and conflicting criteria (objectives) should be optimized simultaneously such as performance, power, cost, etc. Usually, Multi-Objective Evolutionary Algorithms (MOEAs) are used to

  7. VCAT: Visual Crosswalk Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  8. Famine Early Warning Systems Network (FEWS NET) Agro-climatology Analysis Tools and Knowledge Base Products for Food Security Applications

    Science.gov (United States)

    Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.

    2017-12-01

    The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps to developing agro-climatology assumptions; including: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpretation of forecast information, and 4) incorporation of monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data, based on remote sensing and field information, can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps, sea-surface temperature data and knowledge of the global climate system inform the climate modes, and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments. We have also developed a knowledge

  9. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  10. Systemic Assessment as a New Tool for Assessing Students ...

    African Journals Online (AJOL)

    Systemic Assessment as a New Tool for Assessing Students Learning in Chemistry using SATL Methods: Systemic Matching, Systemic Synthesis, Systemic Analysis, Systemic Synthetic – Analytic, as Systemic Question Types.

  11. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas,

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  12. Exploration Medical System Trade Study Tools Overview

    Science.gov (United States)

    Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.

    2018-01-01

    ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are a part of the tool ecosystem. Initially, the presentation's focus will address the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information on how medicine will be practiced is captured and categorized for efficient use in the tool suite. For example, the talk will include capturing what conditions will be planned for in-mission treatment, planned medical activities (e.g., periodic physical exam), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of medical system implementation option satisfaction of requirements and per-capability contributions toward achieving requirements.

  13. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan, on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following ground: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; and Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  14. Tools for Detecting Causality in Space Systems

    Science.gov (United States)

    Johnson, J.; Wing, S.

    2017-12-01

    Complex systems such as the solar and magnetospheric environment often exhibit patterns of behavior that suggest underlying organizing principles. Causality is a key organizing principle that is particularly difficult to establish in strongly coupled nonlinear systems, but essential for understanding and modeling their behavior. While traditional methods of time-series analysis can identify linear correlations, they do not adequately quantify the distinction between causal and coincidental dependence. We discuss tools for detecting causality including: Granger causality, transfer entropy, conditional redundancy, and convergent cross mapping. The tools are illustrated by applications to magnetospheric and solar physics including radiation belt, Dst (a magnetospheric state variable), substorm, and solar cycle dynamics.
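    As a minimal illustration of one of the listed tools (a sketch added for this edition, not code from the paper), the snippet below computes transfer entropy for binary time series in pure Python. A series driven by another shows large entropy transfer from its driver and near zero in the reverse direction, which is exactly the causal asymmetry that plain correlation cannot expose.

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Transfer entropy T(x -> y) in bits for binary time series.

    Quantifies how much x's present reduces uncertainty about y's next
    value beyond what y's own past already provides.
    """
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_now, x_now)
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    singles_y = Counter(y[:-1])
    te = 0.0
    for (yn, yc, xc), c in triples.items():
        p_full = c / pairs_yx[(yc, xc)]              # p(y_next | y_now, x_now)
        p_self = pairs_yy[(yn, yc)] / singles_y[yc]  # p(y_next | y_now)
        te += (c / n) * log2(p_full / p_self)
    return te

random.seed(42)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                    # y simply echoes x one step later
te_xy = transfer_entropy(x, y)      # close to 1 bit: x drives y
te_yx = transfer_entropy(y, x)      # close to 0: nothing flows back
print(te_xy, te_yx)
```

    In practice one compares such directional estimates against surrogate data to judge significance; the asymmetry te_xy >> te_yx is what suggests a causal link.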

  15. Development Methodology of a Cyber Security Risk Analysis and Assessment Tool for Digital I and C Systems in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Cha, K. H.; Lee, C. K.; Song, J. G.; Lee, Y. J.; Kim, J. Y.; Lee, J. W.; Lee, D. Y.

    2011-01-01

    With the adoption of digital computers and communication networks, cyber security became a pressing issue about 10 years ago. The scope of cyber security application has now been extended from the safety Instrumentation and Control (I and C) system to safety-important systems, the plant security system, and the emergency preparedness system. Therefore, cyber security should be assessed and managed systematically throughout the development life cycle of I and C systems so that their digital assets are protected from cyber attacks. Fig. 1 shows the concept of cyber security risk management for digital I and C systems in nuclear power plants (NPPs). Many cyber security risk assessment methods, techniques, and supporting tools have been developed for Information Technology (IT) systems, but they have not been widely utilized for cyber security risk assessments of the digital I and C systems in NPPs. The main reason is a difference in goals between IT systems and nuclear I and C systems: confidentiality is important in IT systems, whereas availability and integrity are important in nuclear I and C systems. Last year, development began on a software tool specialized for the development process of nuclear I and C systems. This paper presents a development methodology of the Cyber Security Risk analysis and Assessment Tool (CSRAT) for the digital I and C systems in NPP

  16. Performance analysis of GYRO: a tool evaluation

    International Nuclear Information System (INIS)

    Worley, P; Candy, J; Carrington, L; Huck, K; Kaiser, T; Mahinthakumar, G; Malony, A; Moore, S; Reed, D; Roth, P; Shan, H; Shende, S; Snavely, A; Sreepathi, S; Wolf, F; Zhang, Y

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses
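    The "embedded wallclock timers" step of such a manual analysis can be sketched in a few lines. This is a generic illustration (not GYRO's actual instrumentation), with Python stand-ins for the solver and communication phases:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

timers = defaultdict(float)  # accumulated wallclock seconds per code section

@contextmanager
def timed(section):
    """Embedded wallclock timer: wrap any code region to accumulate its cost."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timers[section] += time.perf_counter() - start

with timed("solve"):
    total = sum(i * i for i in range(100_000))  # stand-in for the field solve
with timed("comm"):
    time.sleep(0.01)                            # stand-in for MPI communication

# Report sections sorted by cost, the raw material of a manual analysis.
for name, secs in sorted(timers.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} {secs:.4f} s")
```

    Tools such as TAU or IPM automate exactly this bookkeeping (plus hardware counters and MPI tracing), which is what the paper evaluates against the manual approach.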

  17. A Simulation of Energy Storage System for Improving the Power System Stability with Grid-Connected PV using MCA Analysis and LabVIEW Tool

    Directory of Open Access Journals (Sweden)

    Jindrich Stuchly

    2015-01-01

    The large-scale penetration of distributed, renewable power plants requires transfers of large amounts of energy. This, in turn, puts a high strain on the energy delivery infrastructure. In particular, photovoltaic power plants supply energy with high intermittency, possibly affecting the stability of the grid by changing the voltage at the plant connection point. In this contribution, we summarize the main negative effects of a selected, real-operated grid-connected photovoltaic plant. Thereafter, a review of suitable energy storage systems to mitigate the negative effects was carried out; the candidates were compared and evaluated using multi-criteria analysis. Based on this analysis, data collected at the plant and the grid are used to design the energy storage systems to support connection of the plant to the grid. The cooperation of these systems is then analysed and evaluated using simulation tools created in LabVIEW for this purpose. The simulation results demonstrate the capability of energy storage system solutions to significantly reduce the negative feedback effects of a photovoltaic power plant on the low-voltage grid.

  18. User's Guide and Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information

    Science.gov (United States)

    User’s Guide & Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information (EPA/601/B-15/001, 2015, 123 pages). Henry Lee II, U.S. EPA, Western Ecology Division; Katharine Marko, U.S. EPA,...

  19. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  20. Wavelets as a tool for power system dynamic events analysis – State-of-the-art and future applications

    Directory of Open Access Journals (Sweden)

    Samir Avdaković

    2015-05-01

    Power systems are constantly exposed to disturbances of varying intensity, and cascading propagation of a disturbance can lead to partial or total power system blackout. Some of the consequences are network topology changes, outages of generator units, or the occurrence of low-frequency electromechanical oscillations (LFEO). One of the techniques for signal analysis and processing used in the past 20 years for power system disturbance analysis, which has helped in better understanding power system dynamics, is the Wavelet Transform (WT). This work reviews WT applications for power system dynamic behavior analysis. From research on the vast literature it can be concluded that the WT technique has various applications in disturbance identification and localization, LFEO identification and analysis, and assessment of active power imbalance.
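    A hedged illustration of why wavelet detail coefficients localize disturbances (a generic sketch, not the paper's code): a first-level Haar transform of a synthetic grid-frequency trace spikes exactly at the abrupt change.

```python
def haar_details(signal):
    """First-level Haar wavelet detail coefficients: pairwise half-differences
    that stay near zero on smooth stretches and spike at abrupt changes."""
    return [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]

# Synthetic frequency trace: steady 50 Hz, then a sudden dip (e.g. a unit trip).
freq = [50.0] * 63 + [49.2] * 65
details = haar_details(freq)
k = max(range(len(details)), key=lambda i: abs(details[i]))
print("largest detail coefficient at sample", 2 * k)  # the pair straddling the step
```

    Real power-system applications repeat this over several decomposition levels (and with smoother wavelets than Haar) to separate LFEO from measurement noise.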

  1. Flow Injection/Sequential Injection Analysis Systems: Potential Use as Tools for Rapid Liver Diseases Biomarker Study

    Directory of Open Access Journals (Sweden)

    Supaporn Kradtap Hartwell

    2012-01-01

    Flow injection/sequential injection analysis (FIA/SIA) systems are suitable for carrying out automatic wet chemical/biochemical reactions with reduced volume and time consumption. Various parts of the system, such as the pump, valve, and reactor, may be built or adapted from available materials. The systems can therefore be built at lower cost compared to other instrumentation-based analysis systems. Their applications for determination of biomarkers for liver diseases have been demonstrated in various formats of operation, but only a few, limited types of biomarkers have been used as model analytes. This paper summarizes these applications for different types of reactions as a guide for using flow-based systems in more biomarker and/or multibiomarker studies.

  2. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies the whole process.

  3. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    New insights into the sustainable use of natural resources in human systems can be gained through comparison with ecosystems via common indices. In both kinds of system, resources are processed by a number of users within a network, but we consider ecosystems as the only ones displaying sustainable... patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices... of system-level performance for seven different network configurations illustrating past conditions (2004-2008) and future scenarios (2015 and 2020). These indices allowed comparing the WMS with 24 human systems and 12 ecosystems. The results of the NA reveal that the WMS differs from ecosystems...
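    The abstract does not name the specific indices it computes. The sketch below evaluates two commonly used ecological Network Analysis indices, Total System Throughput (an extensive index) and Average Mutual Information (an intensive one), on a small hypothetical flow matrix, purely as an illustration of the technique:

```python
from math import log2

def network_indices(flows):
    """Total System Throughput (TST) and Average Mutual Information (AMI)
    for a flow network.  flows[i][j] is the flow from compartment i to j."""
    n = len(flows)
    tst = sum(map(sum, flows))                                       # extensive
    out_tot = [sum(row) for row in flows]                            # row sums
    in_tot = [sum(flows[i][j] for i in range(n)) for j in range(n)]  # col sums
    ami = sum(                                                       # intensive
        (flows[i][j] / tst) * log2(flows[i][j] * tst / (out_tot[i] * in_tot[j]))
        for i in range(n) for j in range(n) if flows[i][j] > 0
    )
    return tst, ami

# Hypothetical three-node water network: waterworks -> users -> treatment plant.
flows = [
    [0, 60, 40],
    [0, 0, 55],
    [0, 0, 0],
]
tst, ami = network_indices(flows)
print(f"TST = {tst}, AMI = {ami:.3f} bits")
```

    Comparing such index values across network configurations (here, across years or scenarios) is what lets a human system be placed alongside ecosystems on a common scale.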

  4. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
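    A much-simplified sketch of the merging step (translation only, ignoring the rotation a real photogrammetry merge must also solve for; all point values are invented for illustration):

```python
def estimate_offset(pts_frame1, pts_frame2):
    """Average coordinate difference: the translation taking frame-2
    coordinates into frame 1.  Translation-only is a simplification;
    a full merge would also estimate rotation (e.g. via a rigid fit)."""
    n = len(pts_frame1)
    return tuple(
        sum(a[k] - b[k] for a, b in zip(pts_frame1, pts_frame2)) / n
        for k in range(3)
    )

def to_frame1(point, offset):
    """Map a point expressed in frame 2 into frame 1 coordinates."""
    return tuple(c + d for c, d in zip(point, offset))

# The same reference marks measured in cube 1's and in cube 2's system.
in_cube1 = [(5.0, 2.0, 0.0), (6.0, 2.0, 0.0), (5.0, 3.0, 0.0)]
in_cube2 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
offset = estimate_offset(in_cube1, in_cube2)
merged = to_frame1((0.5, 0.5, 0.0), offset)   # a cube-2 point, now in cube 1's frame
print(offset, merged)
```

    Averaging over several shared reference marks, as here, damps individual measurement errors; chaining such transforms across all cube pairs yields the single global coordinate system the procedure describes.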

  5. Functional analysis, a resilience improvement tool applied to a waste management system – application to the "household waste management chain"

    Directory of Open Access Journals (Sweden)

    H. Beraud

    2012-12-01

    A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving resilience, i.e. enabling the system to maintain or recover acceptable operating levels after flooding, is essential. To achieve this, we must understand how the system works so as to bring any potential dysfunctions to light and take preventive measures. Functional analysis has been used for understanding the complexity of this type of system. The purpose of this article is to show the value of this type of method, and the limits of its use for improving the resilience of a waste management system as well as of other urban technical systems1, by means of theoretical modelling and its application on a study site.


    1In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  6. Functional analysis, a resilience improvement tool applied to a waste management system - application to the "household waste management chain"

    Science.gov (United States)

    Beraud, H.; Barroca, B.; Hubert, G.

    2012-12-01

    A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving resilience, i.e. enabling the system to maintain or recover acceptable operating levels after flooding, is essential. To achieve this, we must understand how the system works so as to bring any potential dysfunctions to light and take preventive measures. Functional analysis has been used for understanding the complexity of this type of system. The purpose of this article is to show the value of this type of method, and the limits of its use for improving the resilience of a waste management system as well as of other urban technical systems1, by means of theoretical modelling and its application on a study site. 1In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  7. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different levels of detail, from simple back-of-the-envelope calculations to more complex numerical analyses, are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.

  8. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  9. Quality assurance procedures based on dosimetric, gamma analysis as a fast reliable tool for commissioning brachytherapy treatment planning systems

    Directory of Open Access Journals (Sweden)

    Zwierzchowski Grzegorz

    2017-11-01

    Fast and easily repeatable methods for commissioning procedures for brachytherapy (BT) treatment planning systems (TPS) are needed. Radiochromic film dosimetry with gamma analysis is widely used in external beam quality assurance (QA) procedures, and planar film dosimetry is also increasingly used for verification of the dose distribution in BT applications. Using the gamma analysis method for comparing calculated and measured dose data could serve commissioning procedures for the newly developed TG-186 and MBDCA calculation algorithms. The aim of this study was dosimetric verification of the calculation algorithm used in the TPS when the CT/MRI ring applicator is used.
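    As an illustration of the gamma analysis the abstract relies on (a generic sketch, not the study's code), a 1-D global gamma index with the common 3 mm / 3% criteria, comparing a calculated profile against invented measured values:

```python
from math import inf, sqrt

def gamma_indices(ref, meas, spacing_mm, dta_mm=3.0, dose_tol=0.03):
    """Global 1-D gamma index per measured point: the best combined
    distance-to-agreement / dose-difference match over the reference."""
    d_max = max(ref)
    out = []
    for i, dm in enumerate(meas):
        best = inf
        for j, dr in enumerate(ref):
            dist = (i - j) * spacing_mm
            ddose = (dm - dr) / d_max            # global dose normalization
            best = min(best, sqrt((dist / dta_mm) ** 2 + (ddose / dose_tol) ** 2))
        out.append(best)
    return out

ref = [0.10, 0.40, 0.90, 1.00, 0.90, 0.40, 0.10]   # calculated dose profile
meas = [0.10, 0.42, 0.90, 1.00, 0.88, 0.40, 0.10]  # film measurement (invented)
gammas = gamma_indices(ref, meas, spacing_mm=1.0)
pass_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
print(f"gamma (3 mm / 3%) pass rate: {pass_rate:.0%}")
```

    A point passes when its gamma value is at most 1; commissioning acceptance is typically stated as a minimum pass rate over the whole film plane.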

  10. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT is written to meet the need for an interactive graphical tool to explore the longitudinal phase space. It is designed for quickly testing new ideas or new tricks. It is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line to analyze the data. The heart of the package is a set of C routines to do the number crunching. The graphics part is wired together with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter varies via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One usage of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation.

  11. A systematic and practical method for selecting systems engineering tools

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2017-01-01

    The tremendous number of available tools makes it difficult to get an overview and identify the best choice. Selecting the wrong tools due to inappropriate analysis can have a severe impact on the success of the company. This paper presents a systematic method for selecting systems engineering tools based on thorough...

  12. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  13. Remote Sensing Image Analysis Without Expert Knowledge - A Web-Based Classification Tool On Top of Taverna Workflow Management System

    Science.gov (United States)

    Selsam, Peter; Schwartze, Christian

    2016-10-01

    Providing software solutions via the internet has been known for quite some time and is now an increasing trend marketed as "software as a service". A lot of business units accept the new methods and streamlined IT strategies by offering web-based infrastructures for external software usage, but geospatial applications featuring very specialized services or functionalities on demand are still rare. Originally applied in desktop environments, the ILMSimage tool for remote sensing image analysis and classification was modified in its communicating structures and enabled to run on a high-power server, benefiting from the Taverna software. On top, a GIS-like, web-based user interface guides the user through the different steps in ILMSimage. ILMSimage combines object-oriented image segmentation with pattern recognition features. Basic image elements form a construction set to model large image objects with diverse and complex appearance. There is no need for the user to set up detailed object definitions. Training is done by delineating one or more typical examples (templates) of the desired object using a simple vector polygon. The template can be large and does not need to be homogeneous. The template is completely independent from the segmentation. The object definition is done completely by the software.

  14. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    During the last years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.

  15. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  16. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    The polishing process is one of the most critical manufacturing processes during metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, useful for large or heavy workpieces. Results are applied on different curved parts typically used in the tooling industry, aeronautics or automotive. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
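    The abstract does not state the model's functional form; a common choice for removal models of this kind is Preston's equation, sketched here with invented numbers purely for illustration (neither the coefficient nor the operating values come from the paper):

```python
def removed_depth(kp, pressure, velocity, dwell_time):
    """Preston-style removal model: depth ~ kp * pressure * velocity * time.
    kp is the empirical coefficient that fitting to polishing trials
    determines, one value per abrasive / raw-material pair."""
    return kp * pressure * velocity * dwell_time

# Illustrative numbers only (hypothetical, not the paper's fitted values):
kp = 2.0e-4                     # assumed Preston coefficient for one abrasive
depth = removed_depth(kp, pressure=50.0, velocity=0.2, dwell_time=30.0)
print(f"predicted removal depth: {depth:.3f}")

# Fitting kp from a single calibration trial is the inverse computation:
kp_fit = 0.055 / (50.0 * 0.2 * 30.0)   # hypothetical measured removal 0.055
print(f"fitted kp: {kp_fit:.2e}")
```

    Sweeping pressure, velocity and dwell time through such a fitted model is what yields the predicted tool footprint over a surface.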

  17. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
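    One of the analyses listed, trend analysis, reduces at its simplest to a least-squares slope. The sketch below is a minimal stand-in (not LCAT's implementation), run on made-up annual temperature data:

```python
def linear_trend(values):
    """Least-squares slope of an evenly sampled series, per time step."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    sxy = sum((i - mean_x) * (v - mean_y) for i, v in enumerate(values))
    sxx = sum((i - mean_x) ** 2 for i in range(n))
    return sxy / sxx

# Made-up annual mean temperatures: ~0.05 deg/yr warming plus alternation.
temps = [14.0 + 0.05 * yr + (0.1 if yr % 2 else -0.1) for yr in range(30)]
slope = linear_trend(temps)
print(f"trend: {slope:+.3f} deg per year")
```

    A production tool adds significance testing and handles missing data, but the recovered slope here stays close to the injected 0.05 deg/yr despite the year-to-year wiggle.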

  18. STUDENT PERFORMANCE PARAMETER ANALYSIS USING SPSS TOOL

    OpenAIRE

    Bhavesh Patel; Jyotindra Dharwa

    2017-01-01

    SPSS is a statistical analysis tool. It is used for analyzing large volumes of available data and extracting useful information and knowledge to support major decision-making processes. SPSS can be applied in the educational sector to improve the performance of students by finding the parameters that most affect student performance. This research study was carried out by collecting student performance parameters and the related dataset. In this research study we have col...

  19. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    Science.gov (United States)

    Chin, Jeffrey C.; Csank, Jeffrey T.

    2016-01-01

    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on the transient engine performance earlier in the design cycle.

  20. Probabilistic Safety Assessment: An Effective Tool to Support “Systemic Approach” to Nuclear Safety and Analysis of Human and Organizational Aspects

    International Nuclear Information System (INIS)

    Kuzmina, I.

    2016-01-01

    The Probabilistic Safety Assessment (PSA) represents a comprehensive conceptual and analytical tool for quantitative evaluation of the risk of undesirable consequences from nuclear facilities and for drawing qualitative insights for nuclear safety. PSA considers various technical, human, and organizational factors in an integral manner, thus explicitly pursuing a true ‘systemic approach’ to safety and enabling holistic insights for further safety improvement. Human Reliability Analysis (HRA) is one of the major tasks within PSA. The poster paper provides an overview of the objectives and scope of PSA and HRA and discusses further needs in the area of HRA. (author)

  1. An image processing and analysis tool for identifying and analysing complex plant root systems in 3D soil using non-destructive analysis: Root1.

    Science.gov (United States)

    Flavel, Richard J; Guppy, Chris N; Rabbi, Sheikh M R; Young, Iain M

    2017-01-01

    The objective of this study was to develop a flexible and free image processing and analysis solution, based on the Public Domain ImageJ platform, for the segmentation and analysis of complex biological plant root systems in soil from x-ray tomography 3D images. Contrasting root architectures from wheat, barley and chickpea root systems were grown in soil and scanned using a high resolution micro-tomography system. A macro (Root1) was developed that reliably identified complex root systems with good to high accuracy (10% overestimation for chickpea, 1% underestimation for wheat, 8% underestimation for barley) and provided analysis of root length and angle. In-built flexibility allows the user to (a) amend any aspect of the macro to account for specific user preferences, and (b) take account of computational limitations of the platform. The platform is free, flexible and accurate in analysing root system metrics.

  2. Safety Implications Concerning Usage of Tools in Complex System

    OpenAIRE

    Augusto, Rafael; Silva, Nuno

    2016-01-01

    International audience; Integration of tools and configuration data is nowadays present in all railway systems and plays a central role in the functionality, flexibility and safety of railway systems. This paper aims to present the challenges and the importance of tools, configuration data integrity and toolchain definition in the safe design of railway systems. We focus on the relevant implications for the safety analysis and safety assurance of such systems. Two examples of the usa...

  3. Geographical information system (GIS) as a new tool to evaluate epidemiology based on spatial analysis and clinical outcomes in acromegaly.

    Science.gov (United States)

    Naves, Luciana Ansaneli; Porto, Lara Benigno; Rosa, João Willy Corrêa; Casulari, Luiz Augusto; Rosa, José Wilson Corrêa

    2015-02-01

    Geographical information systems (GIS) have emerged as a group of innovative software components useful for projects in epidemiology and planning in the Health Care System. This is an original study to investigate environmental and geographical influences on the epidemiology of acromegaly in Brazil. We aimed to validate a method to link an acromegaly registry with a GIS mapping program, to describe the spatial distribution of patients, to identify disease clusters and to evaluate whether access to health care could influence the outcome of the disease. Clinical data from 112 consecutive patients were collected and home addresses were plotted in the GIS software for spatial analysis. The buffer spatial distribution of patients living in Brasilia showed that 38.1% lived from 0.33 to 8.66 km, 17.7% from 8.67 to 18.06 km, 22.2% from 18.07 to 25.67 km and 22% from 25.68 to 36.70 km from the Reference Medical Center (RMC), and no unexpected clusters were identified. Migration of 26 patients from 11 other cities in different regions of the country was observed. Most patients (64%) with adenomas bigger than 25 mm lived more than 20 km away from the RMC, but no significant correlation was found between the distance from a patient's home to the RMC and tumor diameter (r = 0.45, p = 0.20) or delay in diagnosis (r = 0.43, p = 0.30). The geographical distribution of diagnosed cases did not impact the latency of diagnosis or tumor size, but the recognition of significant migration indicates that improvements in the medical assistance network are needed.
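The buffer analysis described above amounts to computing each patient's great-circle distance to the reference center and bucketing it by the study's ring boundaries. A minimal sketch (the GIS software does this internally; the `haversine_km` and `buffer_class` helpers and any coordinates are illustrative, not the study's data):

```python
# Great-circle distance plus the study's buffer rings (edges taken from the
# abstract); coordinates passed in would be hypothetical patient addresses.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def buffer_class(d_km, edges=(8.66, 18.06, 25.67, 36.70)):
    """Return the index of the distance ring containing d_km."""
    for i, e in enumerate(edges):
        if d_km <= e:
            return i
    return len(edges)
```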

  4. General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Overview: GMAT is a feature-rich system containing high-fidelity space system models, optimization and targeting, built-in scripting and programming infrastructure, and...

  5. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios

    Science.gov (United States)

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  6. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  7. Failure Modes and Effects Analysis (FMEA) Assistant Tool

    Data.gov (United States)

    National Aeronautics and Space Administration — The FMEA Assistant tool offers a new and unique approach to assist hardware developers and safety analysts perform failure analysis by using model based systems...

  8. SEAT: A strategic engagement analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.; Michelsen, C.; Morgeson, D.

    1988-01-01

    The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.
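SEAT is described above as a knowledge-based discrete event simulation. The event-queue kernel underlying any such simulation can be sketched generically; the `simulate_events` function, its handler convention, and the example events below are an illustrative kernel, not SEAT's implementation.

```python
# Minimal discrete-event simulation kernel: a priority queue of timestamped
# events, where handling one event may schedule follow-on events. Generic
# sketch only; event kinds here are invented.
import heapq

def simulate_events(initial, handlers, horizon):
    """initial: [(time, kind)]; handlers: kind -> fn(time) -> new events."""
    q = list(initial)
    heapq.heapify(q)
    log = []
    while q and q[0][0] <= horizon:
        t, kind = heapq.heappop(q)
        log.append((t, kind))
        for ev in handlers.get(kind, lambda t: [])(t):
            heapq.heappush(q, ev)
    return log
```

Each handler returns the future events its event causes, so cause-effect chains of the kind SEAT analyzes fall out of the event log.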

  9. Hybrid wind–photovoltaic–diesel–battery system sizing tool development using empirical approach, life-cycle cost and performance analysis: A case study in Scotland

    International Nuclear Information System (INIS)

    Gan, Leong Kit; Shek, Jonathan K.H.; Mueller, Markus A.

    2015-01-01

    Highlights: • Methods of sizing a hybrid wind–photovoltaic–diesel–battery system are described. • The hybrid system components are modelled using empirical data. • A twenty-year life-cycle cost of the hybrid system is considered. • The trade-offs between battery storage capacity and diesel fuel usage are studied. • A hybrid system sizing tool has been developed as a graphical user interface (GUI). - Abstract: The concept of an off-grid hybrid wind energy system is financially attractive and more reliable than stand-alone power systems, since it is based on more than one electricity generation source. One of the most expensive components in a stand-alone wind-power system is the energy storage system, as it is very often oversized to increase system autonomy. In this work, we consider a hybrid system which consists of wind turbines, photovoltaic panels, a diesel generator and battery storage. One of the main challenges experienced by project managers is the sizing of components for different sites, owing to the variability of the renewable energy resource and the load demand between sites. This paper introduces a sizing model, developed and implemented as a graphical user interface, which predicts the optimum configuration of a hybrid system. In particular, this paper focuses on seeking the optimal size of the batteries and the diesel generator usage; these two components trade off against each other. The model simulates real-time operation of the hybrid system using annual measured hourly wind speed and solar irradiation. The benefit of the time-series approach is that it reflects a more realistic situation: the peaks and troughs of the renewable energy resource are a central part of the sizing model. Finally, load sensitivity and hybrid system performance analyses are demonstrated.
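The hourly time-series simulation at the core of such a sizing model can be sketched in a few lines: step through generation and load, charge or discharge the battery, and run the diesel generator only when the battery is empty. The `simulate_hybrid` helper and all numbers are invented for illustration; the paper's actual model also covers costs and component efficiencies in more detail.

```python
# Toy hourly energy-balance simulation for a renewable + battery + diesel
# system. Assumed behavior: battery starts full, simple charge efficiency,
# diesel covers any unmet deficit. Illustrative only.

def simulate_hybrid(gen_kw, load_kw, batt_kwh, eff=0.9):
    """Return total diesel energy (kWh) needed over the hourly horizon."""
    soc = batt_kwh          # state of charge, start full
    diesel = 0.0
    for g, l in zip(gen_kw, load_kw):
        net = g - l                                # surplus (+) or deficit (-)
        if net >= 0:
            soc = min(batt_kwh, soc + net * eff)   # charge with losses
        else:
            need = -net
            used = min(soc, need)
            soc -= used
            diesel += need - used                  # generator covers the rest
    return diesel
```

Sweeping `batt_kwh` and comparing the resulting diesel use against battery cost reproduces, in miniature, the battery/diesel trade-off the paper studies.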

  10. ECOLOGICAL GEOINFORMATION SYSTEMS AS A TOOL FOR ANALYSIS OF SUSTAINABLE DEVELOPMENT OF THE TERRITORY OF KIZILYURT CITY

    Directory of Open Access Journals (Sweden)

    N. O. Guseynova

    2014-01-01

    Full Text Available Abstract. Aim. To study the ecological and geochemical pollution of soils in the city of Kizilyurt using GIS, assessing the ecological condition of the urban area as an indicator of sustainable development. Methods. Soil samples were investigated in the field and by chemical analysis; laboratory and analytical work was carried out on an atomic absorption spectrophotometer using standard techniques. Mathematical and statistical methods were applied, and ecological maps and a relational database were constructed using GIS technologies. Results. Analysis of urban soils for heavy metal content showed that lead makes the greatest contribution to the complex pollution of the city's soils. The elevated lead content confirms that soil acts as a long-term concentrator of pollutants, and the distribution of lead pollution is explained by the intensity of automobile traffic. Data on lead in urban soils indicate considerable dispersion in the mobile-form values (from 18 mg/kg to 100 mg/kg). Lead exceeds not only background levels but also, by factors of 3 to 16, its maximum permissible concentration. Zc values vary from 7.2 to 12.1 at different points of the city, depending on location relative to highways or industrial enterprises. Main conclusions. On the pollution-hazard scale, Kizilyurt soils fall into the "moderately dangerous" and "dangerous" categories of heavy-metal pollution. The ecological assessment of city soils shows an unfavorable and dangerous situation for population health. A relational database of environmental monitoring of natural and technogenic systems was created for the city of Kizilyurt. Based on the ecological and geochemical assessment, digital geoinformation cartographic models of the condition of an

  11. Expert systems as decision tools

    International Nuclear Information System (INIS)

    Scott, C.K.

    1989-01-01

    The feasibility of using expert systems as an aid in regulatory compliance functions has been investigated. A literature review was carried out to identify applications of expert systems to regulatory affairs. A bibliography of the small literature on such applications was prepared. A prototype system, ARIES, was developed to demonstrate the use of an expert system as an aid to a Project Officer in assuring compliance with licence requirements. The system runs on a personal computer with a graphical interface. Extensive use is made of hypertext to link interrelated rules and requirements as well as to provide an explanation facility. Based on the performance of ARIES the development of a field version is recommended

  12. Rotary fast tool servo system and methods

    Energy Technology Data Exchange (ETDEWEB)

    Montesanti, Richard C [Cambridge, MA; Trumper, David L [Plaistow, NH; Kirtley, Jr., James L. (Brookline, MA)

    2009-08-18

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. One or more position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  13. Rotary fast tool servo system and methods

    Energy Technology Data Exchange (ETDEWEB)

    Montesanti, Richard C. (Cambridge, MA); Trumper, David L. (Plaistow, NH)

    2007-10-02

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. A pair of position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  14. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats, the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools.

  15. Airborne LIDAR Data Processing and Analysis Tools

    Science.gov (United States)

    Zhang, K.

    2007-12-01

    Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers high quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, the LIDAR systems collect voluminous irregularly-spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of the ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageable sized tiles, typically 4 square kilometers in area; 3) filtering point data to separate measurements for the ground from those for non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster based analysis; and 5) converting the gridded data into standard GIS import formats. The ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
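Step 4 of the pipeline above — interpolating irregular point elevations onto a regular grid — can be sketched with simple cell-mean binning. This is an assumed, simplified interpolation method for illustration; ALDPAT's actual tools offer more sophisticated interpolators.

```python
# Toy gridding of irregular (x, y, z) LIDAR returns: average the elevations
# falling into each regular cell; empty cells stay None. Cell-mean binning
# is an assumption chosen for brevity, not ALDPAT's algorithm.

def grid_points(points, cell, nx, ny):
    """points: iterable of (x, y, z); returns ny x nx grid of mean z (or None)."""
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    for x, y, z in points:
        i, j = int(x // cell), int(y // cell)
        if 0 <= i < nx and 0 <= j < ny:
            sums[j][i] += z
            counts[j][i] += 1
    return [[sums[j][i] / counts[j][i] if counts[j][i] else None
             for i in range(nx)] for j in range(ny)]
```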

  16. Automated design tools for biophotonic systems

    Science.gov (United States)

    Vacca, Giacomo; Lehtimäki, Hannu; Karras, Tapio; Murphy, Sean

    2014-03-01

    Traditional design methods for flow cytometers and other complex biophotonic systems are increasingly recognized as a major bottleneck in instrumentation development. The many manual steps involved in the analysis and translation of the design, from optical layout to a detailed mechanical model and ultimately to a fully functional instrument, are labor-intensive and prone to wasteful trial-and-error iterations. We have developed two complementary, linked technologies that address this problem: one design tool (LiveIdeas™) provides an intuitive environment for interactive, real-time simulations of system-level performance; the other tool (BeamWise™) automates the generation of mechanical 3D CAD models based on those simulations. The strength of our approach lies in a parametric modeling strategy that breaks boundaries between engineering subsystems (e.g., optics and fluidics) to predict critical behavior of the instrument as a whole. The results: 70 percent reduction in early-stage project effort, significantly enhancing the probability of success by virtue of a more efficient exploration of the design space.

  17. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Science.gov (United States)

    2012-01-01

    Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human ortholog. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis
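The integration step this record describes — combining predicted miRNA targets with an experimentally derived differential-expression table — can be sketched as a simple join. The `integrate` helper, its table layout, and the example values below are hypothetical, not BRM's actual API or data.

```python
# Join predicted miRNA->target tables with differential-expression results,
# keeping targets that are both predicted and significant. Table shapes and
# the alpha cutoff are assumptions for illustration.

def integrate(targets, de_table, alpha=0.05):
    """targets: {miRNA: set of gene symbols}; de_table: {gene: p_value}."""
    hits = {}
    for mirna, genes in targets.items():
        sig = sorted(g for g in genes if de_table.get(g, 1.0) < alpha)
        if sig:
            hits[mirna] = sig
    return hits
```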

  18. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human ortholog. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single

  19. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  20. Analytical Tools for Affordability Analysis

    Science.gov (United States)

    2015-05-01

    [Record contains only briefing-slide fragments: a web application backed by a SQL Server database, migrating to D3 graphics from the Google API, data drawn from multiple (conflicting) sources, and POM14–18 transportation (Procurement + RDT&E) new-system total life-cycle cost figures.]

  1. Structural Health Monitoring System Trade Space Analysis Tool with Consideration for Crack Growth, Sensor Degradation and a Variable Detection Threshold

    Science.gov (United States)

    2014-09-18

    affect the rotation and translation of the POD curve was presented by Pado et al. (2013). In an SHM system using piezoelectric sensors (PZT) using pitch ...Smart Materials and Structures, 10, 432. Buderath, M. (2004). Review the process of integrating SHM systems into condition based maintenance as

  2. The second iteration of the Systems Prioritization Method: A systems prioritization and decision-aiding tool for the Waste Isolation Pilot Plant: Volume 3, Analysis for final programmatic recommendations

    International Nuclear Information System (INIS)

    Prindle, N.H.; Boak, D.M.; Weiner, R.F.

    1996-05-01

    Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories for the US DOE Carlsbad Area Office (DOE/CAO). This tool provides an analytical basis for programmatic decision making for the Waste Isolation Pilot Plant (WIPP). SPM integrates decision-analysis techniques, performance- and risk-assessment tools, and advanced information technology. Potential outcomes of proposed activities and combinations of activities are used to calculate a probability of demonstrating compliance (PDC) with selected regulations. The results are presented in a decision matrix showing cost, duration, and maximum PDC for all activities in a given cost and duration category. This is the third and final volume in the series; it presents the analysis for final programmatic recommendations.
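The decision matrix described above — the maximum-PDC option within each cost and duration category — reduces to a per-cell maximization. The `decision_matrix` helper and the category labels below are an illustrative reconstruction, not SPM's actual data structures.

```python
# Toy SPM-style decision matrix: group candidate activity sets by
# (cost category, duration category) and keep the maximum-PDC option
# in each cell. Labels and PDC values are invented.

def decision_matrix(options):
    """options: list of (cost_cat, dur_cat, pdc, name). Best option per cell."""
    best = {}
    for cost, dur, pdc, name in options:
        key = (cost, dur)
        if key not in best or pdc > best[key][0]:
            best[key] = (pdc, name)
    return best
```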

  3. Advanced Numerical Tools for Design and Analysis of In-Space, Valve and Feed Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In-space valves for the main fuel and oxidizer feed systems are required to provide precise control, wide throttling range and handle rapid on-off control. These...

  4. Analysis of a Decision Support System for CASE Tool Selection and the Specification of an Ada to SQL Abstract Interface

    Science.gov (United States)

    1991-12-01

    incrementally • can be serviced and supported • are fast and inexpensive. Wybolt also clarified how a framework was the building block of IPSE’s. The...by each. For example, a database system would provide the services of storage and retrieval of all evaluation/selection data. A knowledge base system...domain-types, TheTooLprimitive-domain-types, Specific Sofware Characteristic-primitive domain-types, The..Quality...primitive domainjtypes, TheEvaluatorj

  5. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.
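The >97% inter- and intrajudge reliability reported above is a percent-agreement-style statistic: the proportion of transcription segments on which two judges agree. The exact formula used by the authors is not given, so the sketch below assumes simple segment-by-segment percent agreement.

```python
# Percent agreement between two judges' aligned transcriptions
# (assumed formula; the paper may use a more elaborate statistic).

def percent_agreement(judge_a, judge_b):
    """judge_a, judge_b: equal-length sequences of transcription segments."""
    assert len(judge_a) == len(judge_b)
    matches = sum(a == b for a, b in zip(judge_a, judge_b))
    return 100.0 * matches / len(judge_a)
```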

  6. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.

    2016-01-01

    for the main process metrics, providing feedback to the research and development team and setting goals for experimental efforts. The present study proposes a methodology for performing such a "retro" techno-economic analysis. It consists of choosing the most important variables of the process and finding...

  7. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    Science.gov (United States)

    2017-06-01

    Requirements Generation Process for the Logistics Analysis and Wargame Support Tool, by Jonathan M. Swan, June 2017. This thesis conducts an analysis of the system requirements for the Logistics Analysis and Wargame Support Tool (LAWST). It studies

  8. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  9. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed, as are the future directions in the study of microwear on stone and bone tools.

  10. Two energy system analysis - cases

    DEFF Research Database (Denmark)

    Lund, Henrik; Antonoff, Jayson; Andersen, Anders N.

    2004-01-01

    The chapter presents two cases of energy system analysis, illustrating the types of tools and methodologies presently being used for these studies in Denmark and elsewhere.

  11. Analysis on the multi-dimensional spectrum of the thrust force for the linear motor feed drive system in machine tools

    Science.gov (United States)

    Yang, Xiaojun; Lu, Dun; Ma, Chengfang; Zhang, Jun; Zhao, Wanhua

    2017-01-01

    The motor thrust force contains many harmonic components due to the nonlinearity of the drive circuit and of the motor itself in a linear motor feed drive system. Moreover, during motion these thrust force harmonics may vary with position, velocity, acceleration and load, which affects the displacement fluctuation of the feed drive system. Therefore, in this paper, on the basis of the thrust force spectrum obtained from the Maxwell equations and the electromagnetic energy method, the multi-dimensional variation of each thrust harmonic is analyzed under different motion parameters. A model of the servo system is then established, oriented to dynamic precision, and the influence of the variation of the thrust force spectrum on the displacement fluctuation is discussed. Finally, experiments are carried out to verify the theoretical analysis. It is found that the thrust harmonics show multi-dimensional spectrum characteristics under different motion parameters and loads, which should be considered when choosing the motion parameters and optimizing the servo control parameters in high-speed, high-precision machine tools equipped with linear motor feed drive systems.
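    The kind of position-dependent harmonic content described above can be extracted from a sampled force signal with a discrete Fourier transform; a minimal sketch with a synthetic signal (the amplitudes and harmonic orders are illustrative assumptions, not the paper's values):

    ```python
    import numpy as np

    # Synthetic thrust force over one spatial period: a mean component
    # plus hypothetical 1st and 3rd position harmonics.
    n = 1024
    pos = np.linspace(0.0, 1.0, n, endpoint=False)
    force = 100.0 + 5.0 * np.sin(2 * np.pi * pos) + 1.5 * np.sin(6 * np.pi * pos)

    # One-sided amplitude spectrum via the real FFT.
    spectrum = np.abs(np.fft.rfft(force)) / n
    spectrum[1:] *= 2.0  # fold the negative-frequency half into the positive bins

    # Bin k holds the amplitude of the k-th harmonic of the position period.
    print(spectrum[0], spectrum[1], spectrum[3])
    ```

    Repeating this decomposition at different velocities, accelerations and loads is one way to build up the multi-dimensional spectrum picture the paper describes.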

  12. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  13. Tools virtualization for command and control systems

    Science.gov (United States)

    Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław

    2017-10-01

    Information management is an inseparable part of the command process. As a result, the person making decisions at the command post interacts with data-providing devices in various ways. The tools virtualization process can introduce a number of significant modifications in the design of solutions for management and command. The general idea involves replacing a physical device's user interface with its digital representation (a so-called virtual instrument). A more advanced level of system "digitalization" is to use mixed reality environments. In solutions using augmented reality (AR), a customized HMI is displayed to the operator as he approaches each device. Identification of a device is done by image recognition of photo codes. Visualization is achieved by an (optical) see-through head-mounted display (HMD). Control can be done, for example, by means of a handheld touch panel. Using an immersive virtual environment, the command center can be digitally reconstructed; a workstation then requires only a VR system (HMD) and access to the information network. The operator can interact with devices just as he would in the real world (for example, with virtual hands). Because of their procedures (analysis of central vision, eye tracking), MR systems offer another useful feature: reduced requirements for system data throughput, since at any given moment the rendering can focus on a single device. Experiments carried out using the Moverio BT-200 and SteamVR systems, and the results of experimental application testing, clearly indicate the ability to create a fully functional information system with the use of mixed reality technology.

  14. Harnessing VLSI System Design with EDA Tools

    CERN Document Server

    Kamat, Rajanish K; Gaikwad, Pawan K; Guhilot, Hansraj

    2012-01-01

    This book explores various dimensions of EDA technologies for achieving different goals in VLSI system design. Although the scope of EDA is very broad and comprises diversified hardware and software tools to accomplish different phases of VLSI system design, such as design, layout, simulation, testability, prototyping and implementation, this book focuses only on demystifying the code, a.k.a. firmware development and its implementation with FPGAs. Since there are a variety of languages for system design, this book covers various issues related to VHDL, Verilog and System C synergized with EDA tools, using a variety of case studies such as testability, verification and power consumption. * Covers aspects of VHDL, Verilog and Handel C in one text; * Enables designers to judge the appropriateness of each EDA tool for relevant applications; * Omits discussion of design platforms and focuses on design case studies; * Uses design case studies from diversified application domains such as network on chip, hospital on...

  15. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to mathematical modeling of physical engineering systems. In this paper, connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  16. Software tools for microprocessor based systems

    CERN Document Server

    Halatsis, C

    1981-01-01

    After a short review of the hardware and/or software tools for the development of single-chip, fixed instruction set microprocessor-based systems the author focuses on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer-level and architecture level. The author reviews available meta-microassemblers giving their most important features, advantages and disadvantages. He also makes extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities the author first discusses the simulation objectives and the criteria for choosing the right simulation language. He concentrates on simulation facilities already used in bit-slices projects and discusses the gained experience, and concludes by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the ...

  17. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
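    A minimal sketch of the ordinary least squares fit the book is built around, using the closed-form simple-regression estimates (the data values here are hypothetical, shown only to fix ideas):

    ```python
    # Hypothetical data: fit y = b0 + b1*x by ordinary least squares.
    x = [1.0, 2.0, 3.0, 4.0, 5.0]
    y = [2.1, 3.9, 6.2, 7.8, 10.1]

    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n

    # Closed-form estimates: b1 = S_xy / S_xx,  b0 = y_bar - b1 * x_bar
    s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    s_xx = sum((xi - x_bar) ** 2 for xi in x)
    b1 = s_xy / s_xx
    b0 = y_bar - b1 * x_bar
    print(b0, b1)  # intercept and slope
    ```

    Multiple regression generalizes this to a design matrix and the normal equations, which is where the deeper understanding the book aims at becomes essential.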

  18. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-01-01

    Work is in progress on interactive tools for linear and non-linear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  19. Rapid Benefit Indicators (RBI) Spatial Analysis Tools

    Science.gov (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  20. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  1. A computer aided engineering tool for ECLS systems

    Science.gov (United States)

    Bangham, Michal E.; Reuter, James L.

    1987-01-01

    The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analysis interface used is graphics-based, and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.

  2. Visualization tool for advanced laser system development

    Science.gov (United States)

    Crockett, Gregg A.; Brunson, Richard L.

    2002-06-01

    Simulation development for Laser Weapon Systems design and system trade analyses has progressed to new levels with the advent of object-oriented software development tools and PC processor capabilities. These tools allow rapid visualization of upcoming laser weapon system architectures and the ability to rapidly respond to what-if scenario questions from potential user commands. These simulations can solve very intensive problems in short time periods to investigate the parameter space of a newly emerging weapon system concept, or can address user mission performance for many different scenario engagements. Equally important to the rapid solution of complex numerical problems is the ability to rapidly visualize the results of the simulation, and to effectively interact with visualized output to glean new insights into the complex interactions of a scenario. Boeing has applied these ideas to develop a tool called the Satellite Visualization and Signature Tool (SVST). This Windows application is based upon a series of C++ coded modules that have evolved from several programs at Boeing-SVS. The SVST structure, extensibility, and some recent results of applying the simulation to weapon system concepts and designs will be discussed in this paper.

  3. KAFE - A Flexible Image Analysis Tool

    Science.gov (United States)

    Burkutean, Sandra

    2017-11-01

    We present KAFE, the Keywords of Astronomical FITS-Images Explorer - a web-based FITS images post-processing analysis tool designed to be applicable in the radio to sub-mm wavelength domain. KAFE was developed to enrich selected FITS files with metadata based on a uniform image analysis approach as well as to provide advanced image diagnostic plots. It is ideally suited for data mining purposes and multi-wavelength/multi-instrument data samples that require uniform data diagnostic criteria.

  4. Developing the Polish Educational Needs Assessment Tool (Pol-ENAT) in rheumatoid arthritis and systemic sclerosis: a cross-cultural validation study using Rasch analysis.

    Science.gov (United States)

    Sierakowska, Matylda; Sierakowski, Stanisław; Sierakowska, Justyna; Horton, Mike; Ndosi, Mwidimi

    2015-03-01

    To undertake cross-cultural adaptation and validation of the educational needs assessment tool (ENAT) for use with people with rheumatoid arthritis (RA) and systemic sclerosis (SSc) in Poland. The study involved two main phases: (1) cross-cultural adaptation of the ENAT from English into Polish and (2) cross-cultural validation of the Polish Educational Needs Assessment Tool (Pol-ENAT). The first phase followed an established process of cross-cultural adaptation of self-report measures. The second phase involved completion of the Pol-ENAT by patients and subjecting the data to Rasch analysis to assess the construct validity, unidimensionality, internal consistency and cross-cultural invariance. An adequate conceptual equivalence was achieved following the adaptation process. The dataset for validation comprised a total of 278 patients, 237 (85.3%) of which were female. In each disease group (145 RA and 133 SSc), the 7 domains of the Pol-ENAT were found to fit the Rasch model, χ²(df) = 16.953(14), p = 0.259 and 8.132(14), p = 0.882 for RA and SSc, respectively. Internal consistency of the Pol-ENAT was high (patient separation index = 0.85 and 0.89 for SSc and RA, respectively), and unidimensionality was confirmed. Cross-cultural differential item functioning (DIF) was detected in some subscales, and DIF-adjusted conversion tables were calibrated to enable cross-cultural comparison of data between Poland and the UK. Using a standard process in cross-cultural adaptation, conceptual equivalence was achieved between the original (UK) ENAT and the adapted Pol-ENAT. Fit to the Rasch model confirmed that the construct validity, unidimensionality and internal consistency of the ENAT have been preserved.
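    The dichotomous Rasch model underlying this kind of analysis can be sketched as follows (a simplified illustration; the study's actual analysis of the 7-domain Pol-ENAT is more involved and uses polytomous items):

    ```python
    import math

    def rasch_probability(theta, b):
        """Probability that a person with trait level `theta` endorses an
        item of difficulty `b` under the dichotomous Rasch model:
        P = exp(theta - b) / (1 + exp(theta - b))."""
        return math.exp(theta - b) / (1.0 + math.exp(theta - b))

    # When the person's trait level equals the item difficulty,
    # the endorsement probability is exactly 0.5.
    print(rasch_probability(0.0, 0.0))
    ```

    Fit statistics such as the χ² values reported above measure how far the observed response patterns deviate from the probabilities this model predicts.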

  5. Waste flow analysis and life cycle assessment of integrated waste management systems as planning tools: Application to optimise the system of the City of Bologna.

    Science.gov (United States)

    Tunesi, Simonetta; Baroni, Sergio; Boarini, Sandro

    2016-09-01

    The results of this case study are used to argue that waste management planning should follow a detailed process, adequately confronting the complexity of the waste management problems and the specificity of each urban area and of regional/national situations. To support the development or completion of integrated waste management systems, this article proposes a planning method based on: (1) the detailed analysis of waste flows and (2) the application of a life cycle assessment to compare alternative scenarios and optimise solutions. The evolution of the City of Bologna waste management system is used to show how this approach can be applied to assess which elements improve environmental performance. The assessment of the contribution of each waste management phase in the Bologna integrated waste management system has proven that the changes applied from 2013 to 2017 result in a significant improvement of the environmental performance, mainly as a consequence of the optimised integration between materials and energy recovery: Global Warming Potential at 100 years (GWP100) diminishes from 21,949 to -11,169 t CO2-eq y⁻¹ and abiotic resource depletion from -403 to -520 t antimony-eq y⁻¹. This study analyses the collection phase in great detail. The outcomes provide specific operational recommendations to policy makers, showing: (a) the relevance of the choice of the materials forming the bags for 'door to door' collection (22 kg CO2-eq (tonne of waste)⁻¹ for non-recycled low-density polyethylene bags); (b) the relatively low environmental impacts associated with underground tanks (3.9 kg CO2-eq (tonne of waste)⁻¹); (c) the relatively low impact of big street containers with respect to plastic bags (2.6 kg CO2-eq (tonne of waste)⁻¹). © The Author(s) 2016.

  6. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
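    In its simplest two-node form, the evidence combination a Bayesian Belief Network performs reduces to Bayes' rule; a toy sketch with hypothetical probabilities (not taken from the EXPLORIS project or any real observatory):

    ```python
    def posterior(prior, likelihood, false_alarm):
        """Bayes' rule for a binary hypothesis H given evidence E:
        P(H|E) = P(E|H) P(H) / (P(E|H) P(H) + P(E|~H) P(~H))."""
        num = likelihood * prior
        return num / (num + false_alarm * (1.0 - prior))

    # Hypothetical numbers: prior P(unrest escalates) = 0.1; a seismic swarm
    # is observed, with P(swarm | escalation) = 0.8 and
    # P(swarm | no escalation) = 0.2.
    print(posterior(0.1, 0.8, 0.2))
    ```

    A full BBN chains many such updates across monitoring strands (seismicity, deformation, gas flux), which is what makes it suited to combining synoptically different observational data.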

  7. Integrated Design Tools for Embedded Control Systems

    NARCIS (Netherlands)

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Karelse, F.

    2001-01-01

    Currently, computer-based control systems are still being implemented using the same techniques as 10 years ago. The purpose of this project is the development of a design framework, consisting of tools and libraries, which allows the designer to build highly reliable heterogeneous real-time embedded

  8. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  9. Design tools for complex dynamic security systems.

    Energy Technology Data Exchange (ETDEWEB)

    Byrne, Raymond Harry; Rigdon, James Brian; Rohrer, Brandon Robinson; Laguna, Glenn A.; Robinett, Rush D. III (.; ); Groom, Kenneth Neal; Wilson, David Gerald; Bickerstaff, Robert J.; Harrington, John J.

    2007-01-01

    The development of tools for complex dynamic security systems is not a straightforward engineering task but, rather, a scientific task where discovery of new scientific principles and math is necessary. For years, scientists have observed complex behavior but have had difficulty understanding it. Prominent examples include: insect colony organization, the stock market, molecular interactions, fractals, and emergent behavior. Engineering such systems will be an even greater challenge. This report explores four tools for engineered complex dynamic security systems: Partially Observable Markov Decision Process, Percolation Theory, Graph Theory, and Exergy/Entropy Theory. Additionally, enabling hardware technologies for next generation security systems are described: a 100-node wireless sensor network, an unmanned ground vehicle and an unmanned aerial vehicle.

  10. A Collaborative Analysis Tool for Integrated Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Science.gov (United States)

    Stanley, Thomas Troy; Alexander, Reginald; Landrum, Brian

    2000-01-01

    the process may be repeated altering the trajectory or some other input to reduce the TPS mass. E-PSURBCC is an "engine performance" model and requires the specification of inlet air static temperature and pressure as well as Mach number (which it pulls from the HYFIM and POST trajectory files), and calculates the corresponding stagnation properties. The engine air flow path geometry includes inlet, a constant area section where the rocket is positioned, a subsonic diffuser, a constant area afterburner, and either a converging nozzle or a converging-diverging nozzle. The current capabilities of E-PSURBCC ejector and ramjet mode treatment indicated that various complex flow phenomena including multiple choking and internal shocks can occur for combinations of geometry/flow conditions. For a given input deck defining geometry/flow conditions, the program first goes through a series of checks to establish whether the input parameters are sound in terms of a solution path. If the vehicle/engine performance fails mission goals, the engineer is able to collaboratively alter the vehicle moldline to change aerodynamics, or trajectory, or some other input to achieve orbit. The problem described is an example of the need for collaborative design and analysis. RECIPE is a cross-platform application capable of hosting a number of engineers and designers across the Internet for distributed and collaborative engineering environments. Such integrated system design environments allow for collaborative team design analysis for performing individual or reduced team studies. To facilitate the larger number of potential runs that may need to be made, RECIPE connects the computer codes that calculate the trajectory data, aerodynamic data based on vehicle geometry, heat rate data, TPS masses, and vehicle and engine performance, so that the output from each tool is easily transferred to the model input files that need it.
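    The stagnation-property calculation E-PSURBCC performs from static temperature, pressure and Mach number follows the standard isentropic relations for a calorically perfect gas; a hedged sketch (the specific heat ratio and input values below are illustrative assumptions, not the tool's actual code):

    ```python
    def stagnation(t_static, p_static, mach, gamma=1.4):
        """Isentropic stagnation temperature [K] and pressure [Pa] from
        static conditions and Mach number, for a calorically perfect gas:
        T0/T = 1 + (gamma-1)/2 * M^2,  p0/p = (T0/T)^(gamma/(gamma-1))."""
        factor = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
        t0 = t_static * factor
        p0 = p_static * factor ** (gamma / (gamma - 1.0))
        return t0, p0

    # Hypothetical sea-level static conditions at Mach 2.
    t0, p0 = stagnation(288.15, 101325.0, 2.0)
    print(t0, p0)
    ```

    The rest of the engine flow-path analysis (diffuser, afterburner, nozzle) then works from these stagnation conditions, which is why the tool pulls static temperature, pressure and Mach number from the HYFIM and POST trajectory files first.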

  11. PEPS. A Tool for Power System Simulation

    Science.gov (United States)

    Fernandez, Arturo; D'Accolti, Gianfelice; Buergler, Brandon; Garcia, Borja

    2014-08-01

    Missions are becoming more and more complex, and determining the worst-case orbit for a satellite's energy balance can be far from obvious. Simulation is therefore paramount for a reliable analysis of whether there is enough energy to power the satellite and recharge the battery. PEPS is a tool developed internally at ESA on the basis of the EcosimPro software. Experts from quite different fields have been involved in its development, resulting in a flexible and powerful tool for performing energy balance simulations for conventional and non-conventional satellites.

  12. The Tools That Help Systems Engineering

    Science.gov (United States)

    Gamertsfelder, Jacob O.

    2017-01-01

    There are many tools that systems engineers use in today's space programs. In my time in the Commercial Crew Program I sought to improve one of the vital tools for the verification and validation team. This was my main project, but only a small part of what I have done in the department. I have also had the chance to learn from the best and see actual hardware; this real-world experience will help me be a better aerospace engineer when I enter the workforce. I look forward to seeing the Commercial Crew Program progress to launch.

  13. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that, the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well-theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  14. Integrated information systems: tools for affiliation.

    Science.gov (United States)

    Katz, P A

    1995-05-01

    The development of healthcare information systems as tools to facilitate business success traditionally has focused on a single profit center. The cataclysmic changes that are reshaping American healthcare have accelerated the affiliation of provider entities to form systems that have broader geographic coverage, deliver a larger portion of the provider continuum of care, and are more attractive partners for purchasers of healthcare services. Integrated and/or standardized information system technologies can provide great benefit in containing costs and optimizing quality among partners resulting from healthcare mergers and acquisitions as the new entity positions itself to increase market share and enhance its bottom line. Graduate Health System (GHS) in the Philadelphia area has demonstrated the efficacy of its investment to build a system-wide information network that delivers multiple standardized, integrated systems for both the financial and operational advantage of all component institutions. Building on its patient information system as the centerpiece of its information strategy, GHS has deployed a variety of tools across its network infrastructure that have brought new capabilities and savings to its member hospitals. In light of some of GHS's successes, other hospitals and healthcare systems may be able to re-create similar savings and efficiencies by applying like strategies in cooperation with their affiliated partners.

  15. 3rd Annual Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Face-to-Face Meeting Report December 2013

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-02-21

    The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access to, and analysis of, observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.

  16. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  17. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  18. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  19. DNA – A General Energy System Simulation Tool

    DEFF Research Database (Denmark)

    Elmegaard, Brian; Houbak, Niels

    2005-01-01

    The paper reviews the development of the energy system simulation tool DNA (Dynamic Network Analysis). DNA has been developed since 1989 to be able to handle models of any kind of energy system based on the control volume approach, usually systems of lumped parameter components. DNA has proven to be a useful tool in the analysis and optimization of several types of thermal systems: steam turbines, gas turbines, fuel cells, gasification, refrigeration and heat pumps for both conventional fossil fuels and different types of biomass. DNA is applicable for models of both steady state and dynamic operation. The program decides at runtime to apply the DAE solver if the system contains differential equations. This makes it easy to extend an existing steady state model to simulate dynamic operation of the plant. The use of the program is illustrated by examples of gas turbine models. The paper also...
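    The lumped-parameter, control-volume approach a tool like DNA is built on can be illustrated with a toy steady-state component, an adiabatic mixing junction (purely illustrative; this is not DNA's actual component formulation, and the flows and temperatures are hypothetical):

    ```python
    def mix(m1, t1, m2, t2):
        """Adiabatic mixing junction treated as a control volume:
        two inlet streams, one outlet, with mass and energy conserved.
        Constant specific heat is assumed, so it cancels out of the
        energy balance. Flows in kg/s, temperatures in degrees C."""
        m_out = m1 + m2                          # mass balance
        t_out = (m1 * t1 + m2 * t2) / m_out      # energy balance
        return m_out, t_out

    # Hypothetical streams: 2 kg/s at 80 C mixed with 3 kg/s at 20 C.
    print(mix(2.0, 80.0, 3.0, 20.0))
    ```

    A whole-plant model in this style is a network of such component balances; when some components carry differential equations (thermal inertia, storage), the system becomes a DAE, which is the case DNA's runtime solver selection handles.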

  20. Tool, weapon, or white elephant? A realist analysis of the five phases of a twenty-year programme of occupational health information system implementation in the health sector

    Directory of Open Access Journals (Sweden)

    Spiegel Jerry M

    2012-08-01

    Full Text Available Abstract Background Although information systems (IS have been extensively applied in the health sector worldwide, few initiatives have addressed the health and safety of health workers, a group acknowledged to be at high risk of injury and illness, as well as in great shortage globally, particularly in low and middle-income countries. Methods Adapting a context-mechanism-outcome case study design, we analyze our team’s own experience over two decades to address this gap: in two different Canadian provinces; and two distinct South African settings. Applying a realist analysis within an adapted structuration theory framing sensitive to power relations, we explore contextual (socio-political and technological characteristics and mechanisms affecting outcomes at micro, meso and macro levels. Results Technological limitations hindered IS usefulness in the initial Canadian locale, while staffing inadequacies amid pronounced power imbalances affecting governance restricted IS usefulness in the subsequent Canadian application. Implementation in South Africa highlighted the special care needed to address power dynamics regarding both worker-employer relations (relevant to all occupational health settings and North–south imbalances (common to all international interactions. Researchers, managers and front-line workers all view IS implementation differently; relationships amongst the workplace parties and between community and academic partners have been pivotal in determining outcome in all circumstances. Capacity building and applying creative commons and open source solutions are showing promise, as is international collaboration. Conclusions There is worldwide consensus on the need for IS use to protect the health workforce. However, IS implementation is a resource-intensive undertaking; regardless of how carefully designed the software, contextual factors and the mechanisms adopted to address these are critical to mitigate threats and achieve

  1. Tool, weapon, or white elephant? A realist analysis of the five phases of a twenty-year programme of occupational health information system implementation in the health sector.

    Science.gov (United States)

    Spiegel, Jerry M; Lockhart, Karen; Dyck, Carmen; Wilson, Andrea; O'Hara, Lyndsay; Yassi, Annalee

    2012-08-06

    Although information systems (IS) have been extensively applied in the health sector worldwide, few initiatives have addressed the health and safety of health workers, a group acknowledged to be at high risk of injury and illness, as well as in great shortage globally, particularly in low and middle-income countries. Adapting a context-mechanism-outcome case study design, we analyze our team's own experience over two decades to address this gap: in two different Canadian provinces; and two distinct South African settings. Applying a realist analysis within an adapted structuration theory framing sensitive to power relations, we explore contextual (socio-political and technological) characteristics and mechanisms affecting outcomes at micro, meso and macro levels. Technological limitations hindered IS usefulness in the initial Canadian locale, while staffing inadequacies amid pronounced power imbalances affecting governance restricted IS usefulness in the subsequent Canadian application. Implementation in South Africa highlighted the special care needed to address power dynamics regarding both worker-employer relations (relevant to all occupational health settings) and North-south imbalances (common to all international interactions). Researchers, managers and front-line workers all view IS implementation differently; relationships amongst the workplace parties and between community and academic partners have been pivotal in determining outcome in all circumstances. Capacity building and applying creative commons and open source solutions are showing promise, as is international collaboration. There is worldwide consensus on the need for IS use to protect the health workforce. However, IS implementation is a resource-intensive undertaking; regardless of how carefully designed the software, contextual factors and the mechanisms adopted to address these are critical to mitigate threats and achieve outcomes of interest to all parties. Issues specific to IS development

  2. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  3. Embedded Systems Development Tools: A MODUS-oriented Market Overview

    Directory of Open Access Journals (Sweden)

    Loupis Michalis

    2014-03-01

    Full Text Available Background: Embedded systems technology has perhaps been the most dominant technology in high-tech industries in the past decade. The industry has correctly identified the potential of this technology and has put its efforts into exploring its full potential. Objectives: The goal of the paper is to explore the versatility of applications in embedded system development, based on one FP7-SME project. Methods/Approach: Embedded applications normally demand high resilience and quality, as well as conformity to quality standards and rigid performance. As a result, embedded system developers have adopted software methods that yield high quality. The qualitative approach to examining embedded systems development tools has been applied in this work. Results: This paper presents a MODUS-oriented market analysis in the domains of Formal Verification tools, HW/SW co-simulation tools, Software Performance Optimization tools and Code Generation tools. Conclusions: The versatility of applications this technology serves is remarkable. Alongside this performance potential, the technology has brought with it a large number of issues which the industry needs to resolve in order to harness its full potential. The MODUS project toolset addressed four discrete domains of the ESD Software Market, in which corresponding open tools were developed

  4. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off-line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC), in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  5. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a light-weight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points/unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
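    The metrics named above are simple ratios over the segmented image. The sketch below is illustrative only, not AngioTool's actual code, and the input numbers are made up; it shows how the branching index and related quantities follow from the raw counts a segmentation step would produce:

```python
def vascular_metrics(total_area, vessel_area, branch_points, total_vessel_length):
    """Compute simple vascular-network metrics of the kind AngioTool reports.

    Illustrative sketch: assumes segmentation has already produced the
    raw areas, lengths and branch-point counts (all hypothetical here).
    """
    if total_area <= 0:
        raise ValueError("total_area must be positive")
    return {
        # fraction of the image covered by vessels
        "vessel_area_fraction": vessel_area / total_area,
        # total vessel length per unit area
        "vascular_density": total_vessel_length / total_area,
        # "branching index": branch points per unit area
        "branching_index": branch_points / total_area,
    }

metrics = vascular_metrics(total_area=1000.0, vessel_area=250.0,
                           branch_points=40, total_vessel_length=600.0)
print(metrics["branching_index"])  # 0.04
```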

  6. Software tools for microprocessor based systems

    International Nuclear Information System (INIS)

    Halatsis, C.

    1981-01-01

    After a short review of the hardware and/or software tools for the development of single-chip, fixed-instruction-set microprocessor-based systems, we focus on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer level and architecture level. We review available meta-microassemblers, giving their most important features, advantages and disadvantages. We also cover extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities we first discuss the simulation objectives and the criteria for choosing the right simulation language. We concentrate on simulation facilities already used in bit-slice projects and discuss the experience gained. We conclude by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the design of a fast microprogrammed machine, called MICE, made out of ECL bit-slices. (orig.)

  7. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end user interface was implemented to provide convenient portability of PFSAT among a wide variety of potential users and to provide a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.

  8. ASPECT (Automated System-level Performance Evaluation and Characterization Tool), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI has developed a suite of SAA tools and an analysis capability referred to as ASPECT (Automated System-level Performance Evaluation and Characterization Tool)....

  9. Information security system quality assessment through the intelligent tools

    Science.gov (United States)

    Trapeznikov, E. V.

    2018-04-01

    The development of technology has shown the necessity of comprehensive analysis of automated system information security. Analysis of the subject area indicates the relevance of the study. The research objective is to develop a methodology for assessing the quality of an information security system based on intelligent tools. The basis of the methodology is a model for assessing information security in the information system through a neural network. The paper presents the security assessment model and its algorithm. The practical implementation of the methodology is represented in the form of a software flow diagram. The practical significance of the model being developed is noted in the conclusions.

  10. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... and the value of associated rewards in states of interest for a real-world example from a case company in the Danish baked goods industry. The developments are presented in a generalised fashion to make them relevant to the general problem of implementing quantitative probabilistic model checking of graph...

  11. Comparative guide to emerging diagnostic tools for large commercial HVAC systems

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, Hannah; Piette, Mary Ann

    2001-05-01

    This guide compares emerging diagnostic software tools that aid detection and diagnosis of operational problems for large HVAC systems. We have evaluated six tools for use with energy management control system (EMCS) or other monitoring data. The diagnostic tools summarize relevant performance metrics, display plots for manual analysis, and perform automated diagnostic procedures. Our comparative analysis presents nine summary tables with supporting explanatory text and includes sample diagnostic screens for each tool.

  12. Emulation tool of dynamic systems via internet

    Directory of Open Access Journals (Sweden)

    Daniel Ruiz Olaya

    2015-11-01

    Full Text Available The experimentation laboratories for control system courses can be expensive to acquire, operate and maintain. Remote laboratories have been an alternative resource; however, complex systems are not always available. A solution to this problem is the remote emulation laboratory. This paper describes the development of a Web application for the emulation of dynamic systems, using a freely distributed rapid-control-prototyping software tool based on Linux/RTAI. This application is aimed especially at experimentation with dynamic systems that are not easily available in a laboratory, where the model has been configured by the user. The design of the front-end and the back-end is presented. The latency times of the real-time operating system and the ability of the system to reproduce, from an emulated model, signals similar to those of a real system were verified. As an example, to test the functionality of the application, the model of an evaporator was used. One of the advantages of the application is the work methodology, which is based on the development of blocks in Scicos. This allows the user to reuse the parameters and code that were implemented to build a block in the Scicos toolbox with the Linux/RTAI/ScicosLab environment. Moreover, only a web browser and the Java Virtual Machine are required.

  13. A Comparative Analysis of Life-Cycle Assessment Tools for ...

    Science.gov (United States)

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the waste reduction model (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they were originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c

  14. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting the other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
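    As a rough illustration of the sensor-selection problem S4 addresses: pick the suite that maximizes a performance metric subject to constraints. The sensor names, costs, values and the exhaustive search below are invented for the example; S4 itself supports user-defined search techniques and metrics.

```python
from itertools import combinations

def best_sensor_suite(sensors, performance, budget, suite_size):
    """Exhaustive search for the sensor suite of a given size that
    maximizes a user-supplied performance metric within a cost budget.

    sensors: dict mapping sensor name -> cost
    performance: function taking a tuple of sensor names -> score
    """
    best, best_score = None, float("-inf")
    for suite in combinations(sensors, suite_size):
        cost = sum(sensors[s] for s in suite)
        if cost > budget:
            continue  # violates the cost constraint
        score = performance(suite)
        if score > best_score:
            best, best_score = suite, score
    return best, best_score

# Hypothetical sensors with costs, and an additive diagnostic-value metric:
sensors = {"EGT": 3, "N1": 2, "N2": 2, "fuel_flow": 4}
value = {"EGT": 5.0, "N1": 3.0, "N2": 2.5, "fuel_flow": 4.5}
suite, score = best_sensor_suite(sensors, lambda s: sum(value[x] for x in s),
                                 budget=6, suite_size=2)
print(suite, score)  # ('EGT', 'N1') 8.0
```

Exhaustive search is only viable for small sensor sets; for realistic problem sizes a heuristic search (genetic algorithms, greedy selection) would replace the `combinations` loop.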

  15. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process of the various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make an optimal medical decision?' To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concept of the ORAD program
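    The Bayesian part of this toolkit boils down to combining a test's sensitivity and specificity with a pre-test (prevalence) estimate via Bayes' theorem. A minimal sketch, with hypothetical numbers not taken from the article:

```python
def post_test_probability(prevalence, sensitivity, specificity):
    """Positive predictive value via Bayes' theorem:

    P(disease | positive) = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    """
    true_positives = sensitivity * prevalence
    false_positives = (1.0 - specificity) * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

# A test with 90% sensitivity and 95% specificity applied to a lesion
# with a 10% pre-test probability:
print(round(post_test_probability(0.10, 0.90, 0.95), 3))  # 0.667
```

Even a fairly accurate test leaves substantial residual uncertainty when the pre-test probability is low, which is exactly the kind of insight these decision-making tools are meant to make explicit.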

  16. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis, which can be implemented mainly in small and medium enterprises that are not yet convinced to pursue setup improvement. The methodology was developed after research which identified the problem: companies still have difficulties with long setup times, and many of them do nothing to decrease them. A long setup is not, by itself, a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to perform some analyses of changeovers in order to discover problems. The proposed methodology can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; here the goal is to convince management to begin actions concerning setup improvement. The last three steps are related to a particular setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others, were used.
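    The Pareto-analysis step of such a setup study can be sketched as follows: rank setup activities by duration and report each one's cumulative share of total changeover time, so the few activities worth attacking first stand out. The activity names and times below are invented for illustration; the paper's own seven-step methodology is broader.

```python
def pareto_ranking(durations):
    """Rank setup activities by duration, longest first, with the
    cumulative fraction of total setup time each one brings the running
    total to -- the usual starting point of an SMED study.

    durations: dict mapping activity name -> time (e.g. minutes).
    Returns a list of (activity, time, cumulative_fraction) tuples.
    """
    total = sum(durations.values())
    ranked = sorted(durations.items(), key=lambda kv: kv[1], reverse=True)
    result, running = [], 0.0
    for name, t in ranked:
        running += t
        result.append((name, t, running / total))
    return result

# Hypothetical changeover activities on a bottleneck work stand:
setup = {"fetch tooling": 12, "adjust fixture": 30, "trial runs": 18, "paperwork": 5}
for name, t, cum in pareto_ranking(setup):
    print(f"{name}: {t} min ({cum:.0%} cumulative)")
```

Here "adjust fixture" alone accounts for nearly half of the changeover, so it would be the first candidate for conversion to external setup or for technical improvement.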

  17. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  18. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology, which is concerned with students' acquisition of culture encoded in symbols and with the way the sign consciousness formed in the context of learning affects students' cognition of the world and interpersonal communication. The article introduces a set of tools that enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy, and with the formation of the younger generation's language picture of the world.

  19. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  20. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of the industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and the authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  1. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  2. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  3. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology..., analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components...

  4. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single- and multiple-point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  5. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  6. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal images, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
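
    The link the paper calls for, between temperatures (measurable in production) and thermal errors (not measurable in production), is in its simplest form a regression fitted from a one-off calibration test. A minimal sketch; the calibration numbers below are made up for illustration, not data from the paper:

```python
import numpy as np

# Hypothetical calibration data: spindle temperature rise (degC) during a
# heating test, against displacement error (um) measured at the tool point.
temp_rise = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
error_um = np.array([0.1, 10.2, 19.9, 30.1, 40.0])

# Least-squares fit of error = slope * temperature + intercept.
A = np.vstack([temp_rise, np.ones_like(temp_rise)]).T
slope, intercept = np.linalg.lstsq(A, error_um, rcond=None)[0]

# In production, only temperature is measured; the model predicts the error.
predicted = slope * 12.0 + intercept  # predicted error (um) at a 12 degC rise
```

    A linear model is the simplest choice; real machines often need multiple temperature sensors and lagged terms to capture the transient behaviour the paper discusses.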

  7. A Collaborative Analysis Tool for Integrating Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Science.gov (United States)

    Stanley, Thomas Troy; Alexander, Reginald

    1999-01-01

    Presented is a computer-based tool that connects several disciplines that are needed in the complex and integrated design of high performance reusable single stage to orbit (SSTO) vehicles. Every system is linked to every other system, as is the case for SSTO vehicles with air-breathing propulsion, which is currently being studied by NASA. The deficiencies in the scramjet-powered concept led to a revival of interest in Rocket-Based Combined-Cycle (RBCC) propulsion systems. An RBCC propulsion system integrates airbreathing and rocket propulsion into a single engine assembly enclosed within a cowl or duct. A typical RBCC propulsion system operates as a ducted rocket up to approximately Mach 3. At this point the engine transitions to a ramjet mode for supersonic-to-hypersonic acceleration. Around Mach 8 the engine transitions to a scramjet mode. During the ramjet and scramjet modes, the integral rockets operate as fuel injectors. Around Mach 10-12 (the actual value depends on vehicle and mission requirements), the inlet is physically closed and the engine transitions to an integral rocket mode for orbit insertion. A common feature of RBCC-propelled vehicles is the high degree of integration between the propulsion system and airframe. At high speeds the vehicle forebody is fundamentally part of the engine inlet, providing a compression surface for air flowing into the engine. The compressed air is mixed with fuel and burned. The combusted mixture must be expanded to an area larger than the incoming stream to provide thrust. Since a conventional nozzle would be too large, the entire lower afterbody of the vehicle is used as an expansion surface. Because of the high external temperatures seen during atmospheric flight, the design of an airbreathing SSTO vehicle requires delicate tradeoffs between engine design, vehicle shape, and thermal protection system (TPS) sizing in order to produce an optimum system in terms of weight (and cost) and maximum performance.
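
    The mode schedule described above can be summarized in a small lookup function. This is a sketch of the transitions quoted in the abstract only; the inlet-closure Mach number is vehicle and mission dependent, so it is left as a parameter:

```python
def rbcc_mode(mach, inlet_close_mach=10.0):
    """Return the RBCC operating mode at a given flight Mach number,
    following the transitions described in the abstract:
    ducted rocket -> ramjet (~Mach 3) -> scramjet (~Mach 8)
    -> integral rocket (inlet closed, ~Mach 10-12, vehicle dependent)."""
    if mach < 3.0:
        return "ducted rocket"
    if mach < 8.0:
        return "ramjet"
    if mach < inlet_close_mach:
        return "scramjet"
    return "integral rocket"
```

    For example, `rbcc_mode(5.0)` falls in the ramjet regime, while `rbcc_mode(11.0)` is past inlet closure for the default parameter.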

  8. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  9. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  10. ANN Based Tool Condition Monitoring System for CNC Milling Machines

    Directory of Open Access Journals (Sweden)

    Mota-Valtierra G.C.

    2011-10-01

    Most companies aim to manufacture high-quality products, which becomes possible by optimizing costs and by reducing and controlling the variations in their production processes. Within manufacturing industries, tool condition monitoring is a very important issue, since the tool state determines the quality of the products. Besides, a good monitoring system will protect the machinery from severe damage. For determining the state of the cutting tools in a milling machine, there is a great variety of models on the industrial market; however, these systems are not available to all companies because of their high costs and the requirement of modifying the machine tool in order to attach the system sensors. This paper presents an intelligent classification system which determines the status of cutters in a Computer Numerical Control (CNC) milling machine. The tool state is mainly detected through the analysis of the cutting forces drawn from the spindle motor currents. This monitoring system does not need sensors, so it is not necessary to modify the machine. The correct classification is made by advanced digital signal processing techniques. Just after acquiring a signal, an FIR digital filter is applied to the data to eliminate the undesired noisy components and to extract the embedded force components. A wavelet transformation is applied to the filtered signal in order to compress the data amount and to optimize the classifier structure. Then a multilayer perceptron-type neural network is responsible for carrying out the classification of the signal. Achieving a reliability of 95%, the system is capable of detecting breakage and a worn cutter.
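
    The described chain (FIR filtering, wavelet compression, then classification) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the moving-average FIR taps, the single-level Haar transform, and the RMS threshold standing in for the trained perceptron are all assumptions:

```python
import numpy as np

def fir_filter(signal, taps):
    """Apply an FIR filter (direct convolution) to suppress noise."""
    return np.convolve(signal, taps, mode="same")

def haar_dwt(signal):
    """One level of a Haar wavelet transform: the approximation
    coefficients compress the signal to half its length."""
    s = signal[: len(signal) // 2 * 2].reshape(-1, 2)
    return (s[:, 0] + s[:, 1]) / np.sqrt(2)

def classify_rms(features, worn_threshold):
    """Stand-in for the trained multilayer perceptron: flag a worn or
    broken cutter when the RMS of the compressed force signature
    exceeds a threshold."""
    rms = float(np.sqrt(np.mean(features ** 2)))
    return "worn_or_broken" if rms > worn_threshold else "healthy"

# Simulated spindle-current samples: a healthy cut vs. a high-force cut.
rng = np.random.default_rng(0)
taps = np.ones(5) / 5.0                      # simple moving-average FIR
healthy = fir_filter(0.2 * rng.standard_normal(256) + 1.0, taps)
worn = fir_filter(0.2 * rng.standard_normal(256) + 3.0, taps)

state_h = classify_rms(haar_dwt(healthy), worn_threshold=2.0)
state_w = classify_rms(haar_dwt(worn), worn_threshold=2.0)
```

    The wavelet step halves the feature count before classification, which is the structural optimization the abstract refers to; the real system replaces the threshold with a trained neural network.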

  11. Multi-Spacecraft Analysis with Generic Visualization Tools

    Science.gov (United States)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  12. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened up for implementation of new features in the existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems have been briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.
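
    Part-load calculation by local linearization, mentioned at the end of the abstract, can be illustrated generically: a component characteristic is linearized around the current operating point, and the linear model is solved repeatedly until the part-load target is met. A sketch with an assumed quadratic pressure-drop characteristic, not the author's in-house program:

```python
def solve_part_load(characteristic, target, guess, tol=1e-8):
    """Find the operating point where a component characteristic meets a
    part-load target, by repeated local linearization: a finite-difference
    slope gives a local linear model, whose solution is the next iterate."""
    x = guess
    for _ in range(100):
        f = characteristic(x) - target
        if abs(f) < tol:
            return x
        h = 1e-6 * max(abs(x), 1.0)
        slope = (characteristic(x + h) - characteristic(x)) / h
        x -= f / slope                      # solve the local linear model
    return x

# Illustrative component: pressure drop dp = 0.5 * m**2; find the mass
# flow m that gives dp = 8.0 (the exact answer is m = 4).
flow = solve_part_load(lambda m: 0.5 * m ** 2, target=8.0, guess=1.0)
```

    Real heat and mass balance programs solve large coupled systems this way, linearizing every component model around the current iterate instead of a single scalar characteristic.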

  13. The Small Body Geophysical Analysis Tool

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification properties. The current status of SBGAT is as follows: The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and addition of new features. Note that SBGAT Core can be utilized independently from SBGAT Gui. SBGAT is presently being hosted on a GitHub repository owned by SBGAT's main developer. This repository is public and can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html. This code documentation is constantly updated in order to reflect new functionalities. SBGAT's user's manual is available at https://github.com/bbercovici/SBGAT/wiki. This document contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented. Future work will therefore consist of broadening SBGAT's capabilities with the spherical harmonics expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  14. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 to 1985 a lecture series entitled "Risk-benefit analysis" was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports to this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account the economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP) [de]

  15. Analysis of functionality free CASE-tools databases design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Introducing database-design CASE technologies into the educational process requires significant costs from the institution for the purchase of software. A possible solution could be the use of free software counterparts. At the same time, this kind of substitution should be based on an even-handed comparison of the functional characteristics and operational features of these programs. The purpose of the article is a review of free and non-profit CASE tools for database design, as well as their classification on the basis of an analysis of their functionality. When writing this article, materials from the official websites of the tool developers were used. The evaluation of the functional characteristics of CASE tools for database design was made exclusively empirically, through direct work with the software products. The analysis of the tools' functionality allows two categories of CASE tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers; visual tools to create and modify database objects (tables, views, triggers, procedures); the ability to enter and edit data in table mode; user and privilege management tools; an SQL-code editor; and means of exporting/importing data. CASE systems in the first category can be used to design and develop simple databases, manage data, and administer a database server. A distinctive feature of the second category of CASE tools for database design (full-featured systems) is the presence of a visual designer, allowing the construction of the database model and automatic creation of the database on the server based on this model. CASE systems in this category can be used for the design and development of databases of any structural complexity, as well as a database server administration tool. The article concludes that the

  16. Analysis and use of OLAP tools in the corporative information system of CFE; Analisis y empleo de herramientas OLAP en el sistema de informacion corporativa de la CFE

    Energy Technology Data Exchange (ETDEWEB)

    Jacome G, Norma E; Argotte R, Liliana P; Mejia L, Manuel [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2003-07-01

    The commercial tools Oracle Express and Oracle Discoverer are presented, applied and compared. Both were applied in the context of the data warehouse that the Comision Federal de Electricidad (CFE) is developing, called the Corporative Information System (SICORP), which involves the handling of large volumes of historical and current information. Given the importance and relevance that the subject of data warehousing currently has, the experiences described in the article are very useful for future developments.

  17. Tools for designing industrial vision systems

    Science.gov (United States)

    Batchelor, Bruce G.

    1991-09-01

    The cost of commissioning and installing a machine vision system is almost always dominated by that of designing it. Indeed, the cost of design and the shortage of skilled vision systems engineers are together likely to be two of the most important factors limiting the future adoption of this technology by manufacturing industry. The article describes several software tools that have been developed for making the design process easier, cheaper and faster. These include: (a) an extension of Prolog, called Prolog+, intended for prototyping intelligent image processing, as well as for programming future target systems; (b) a knowledge-based program, called the Lighting Advisor, intended to assist an engineer to select a suitable lighting and image acquisition sub-system; (c) a knowledge-based program, called the Lens Advisor, which advises an engineer on how to select a suitable lens; (d) a knowledge-based program, called the Camera Advisor, which assists an engineer to choose a suitable camera. Ideally, items (b) to (d) should be integrated with Prolog+, so that a programmer has access to all of them in one unified working environment. Prolog+ is able to accept simple natural language descriptions (i.e., in a simple sub-set of English) of the objects/scenes that are to be inspected and is able to generate a recognition program automatically. A range of inspection tasks is described in which Automated Visual Inspection has, to date, made no real impact. Amongst these is the inspection of products that are made in very small quantities. An electro-mechanical arrangement, called a Flexible Inspection Cell, is described. This is intended to provide a "general purpose" inspection facility for small-batch artifacts. Such a cell is controlled using Prolog+.

  18. Systems biology: A tool for charting the antiviral landscape.

    Science.gov (United States)

    Bowen, James R; Ferris, Martin T; Suthar, Mehul S

    2016-06-15

    The host antiviral programs that are initiated following viral infection form a dynamic and complex web of responses that we have collectively termed as "the antiviral landscape". Conventional approaches to studying antiviral responses have primarily used reductionist systems to assess the function of a single or a limited subset of molecules. Systems biology is a holistic approach that considers the entire system as a whole, rather than individual components or molecules. Systems biology based approaches facilitate an unbiased and comprehensive analysis of the antiviral landscape, while allowing for the discovery of emergent properties that are missed by conventional approaches. The antiviral landscape can be viewed as a hierarchy of complexity, beginning at the whole organism level and progressing downward to isolated tissues, populations of cells, and single cells. In this review, we will discuss how systems biology has been applied to better understand the antiviral landscape at each of these layers. At the organismal level, the Collaborative Cross is an invaluable genetic resource for assessing how genetic diversity influences the antiviral response. Whole tissue and isolated bulk cell transcriptomics serves as a critical tool for the comprehensive analysis of antiviral responses at both the tissue and cellular levels of complexity. Finally, new techniques in single cell analysis are emerging tools that will revolutionize our understanding of how individual cells within a bulk infected cell population contribute to the overall antiviral landscape. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  20. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
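
    The rule-based correlation described in this record can be sketched in miniature: patterns match incoming log messages, and a rule fires once its trigger threshold is reached. The real engine uses a domain-specific language and C++ plugins; the regex-and-threshold rule below, and all names in it, are an illustrative stand-in:

```python
import re
from collections import defaultdict

class MessageAnalyzer:
    """Toy correlation engine: each rule is a regex pattern plus a trigger
    count, and a rule fires once enough matching log messages arrive from
    any participating component."""

    def __init__(self):
        self.rules = []                    # (name, compiled regex, threshold)
        self.counts = defaultdict(int)
        self.fired = []

    def add_rule(self, name, pattern, threshold):
        self.rules.append((name, re.compile(pattern), threshold))

    def feed(self, message):
        for name, rx, threshold in self.rules:
            if rx.search(message):
                self.counts[name] += 1
                if self.counts[name] == threshold:
                    self.fired.append(name)   # triggering rule fires

engine = MessageAnalyzer()
engine.add_rule("buffer_overflow", r"buffer .* overflow", threshold=2)
for line in ["dcm-01: buffer ring overflow",
             "dcm-07: run start",
             "dcm-03: buffer ring overflow"]:
    engine.feed(line)
```

    After the loop, `engine.fired` contains the one rule whose threshold was reached; a production engine would react to this by alerting operators or pausing the run.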

  1. Bond graphs : an integrating tool for design of mechatronic systems

    International Nuclear Information System (INIS)

    Ould Bouamama, B.

    2011-01-01

    The bond graph is a powerful tool, well known for the dynamic modelling of multi-physical systems: it is the only modelling technique to generate state-space or non-linear models automatically using dedicated software tools (CAMP-G, 20-Sim, Symbols, Dymola...). Recently several fundamental theories have been developed for using a bond graph model not only for modelling but also as a real integrated tool from conceptual ideas to the optimal practical realization of a mechatronic system. This keynote presents a synthesis of those new theories, which exploit some particular properties (such as causal, structural and behavioural) of this graphical methodology. Based on a pedagogical example, it will be shown how, from a physical system (not a transfer function or state equation) and using only one representation (the bond graph), the following analyses can be performed: modelling (formal generation of state equations), control analysis (observability, controllability, structural I/O decouplability, dynamic decoupling, ...), diagnosis analysis (automatic generation of robust fault indicators, sensor placement, structural diagnosability) and finally sizing of actuators. The presentation will be illustrated by real industrial applications. Limits and perspectives of bond graph theory conclude the keynote.
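
    As a concrete example of what such automatic model generation produces, the bond graph of a mass-spring-damper (I, C and R elements on a common 1-junction) yields the state equations below, which are then immediately available for control analysis. The element values are illustrative, not from the keynote:

```python
import numpy as np

# State equations a bond graph tool would generate for a mass-spring-damper:
#   q' = p/m                (C element: spring deflection rate)
#   p' = -k*q - (b/m)*p     (I element: momentum balance)
m, k, b = 2.0, 8.0, 1.0
A = np.array([[0.0, 1.0 / m],
              [-k, -b / m]])

# The generated state-space model feeds directly into control analysis,
# e.g. checking stability from the eigenvalues of A.
eigvals = np.linalg.eigvals(A)
stable = bool(all(eigvals.real < 0))
```

    With positive damping the eigenvalues lie in the left half-plane, so the model is stable; observability and controllability tests would use the same generated matrices.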

  2. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  3. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  4. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    International Nuclear Information System (INIS)

    Ourghanlian, Alain

    2015-01-01

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown-system software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well suited to the new analysis tools.
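
    Abstract Interpretation, the formal approach named above, over-approximates all program behaviours in an abstract domain so that properties such as the absence of range errors can be proved without running the program. The classic teaching example is the interval domain; a minimal sketch, unrelated to PolySpace's actual implementation:

```python
class Interval:
    """Minimal interval domain: each variable is over-approximated by
    [lo, hi] bounds, and arithmetic is evaluated on the bounds."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of two intervals: add the bounds pairwise.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product: take the extremes over all corner products.
        prods = [a * b for a in (self.lo, self.hi)
                 for b in (other.lo, other.hi)]
        return Interval(min(prods), max(prods))

# Abstractly evaluate y = 2*x + 3 for any x in [-1, 4]: the analyzer
# concludes y is always within [1, 11], so e.g. y is provably positive.
x = Interval(-1, 4)
y = Interval(2, 2) * x + Interval(3, 3)
```

    Industrial analyzers combine many such domains and add widening to guarantee termination on loops; the precision differences the article measures come largely from those domain choices.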

  5. Design and Application of the Exploration Maintainability Analysis Tool

    Science.gov (United States)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long, and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regard to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond-LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first-order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond-LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond-LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew
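
    A probabilistic simulator of the kind described can be sketched as a simple Monte Carlo loop over missions. The failure model and all parameter values below are illustrative placeholders, not EMAT inputs:

```python
import random

def simulate_repairs(n_missions, mission_days, daily_failure_prob, seed=1):
    """Monte Carlo sketch: estimate the expected number of repair
    activities per mission when a component fails independently each
    day with a fixed probability. Repair counts drive the mass, volume
    and crew-time estimates an EMAT-style analysis produces."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_missions):
        failures = sum(rng.random() < daily_failure_prob
                       for _ in range(mission_days))
        totals.append(failures)
    return sum(totals) / n_missions

# 500-day mission, 1% daily failure probability: about 5 repairs expected.
expected = simulate_repairs(n_missions=5000, mission_days=500,
                            daily_failure_prob=0.01)
```

    A realistic tool would sample per-component failure rates, repair durations, and spares availability, then report distributions rather than a single mean.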

  6. RNAmute: RNA secondary structure mutation analysis tool

    Directory of Open Access Journals (Sweden)

    Barash Danny

    2006-04-01

    Full Text Available Abstract Background RNAMute is an interactive Java application that calculates the secondary structure of all single point mutations, given an RNA sequence, and organizes them into categories according to their similarity with respect to the wild type predicted structure. The secondary structure predictions are performed using the Vienna RNA package. Several alternatives are used for the categorization of single point mutations: Vienna's RNAdistance based on dot-bracket representation, as well as tree edit distance and second eigenvalue of the Laplacian matrix based on Shapiro's coarse grain tree graph representation. Results Selecting a category in each one of the processed tables lists all single point mutations belonging to that category. Selecting a mutation displays a graphical drawing of the single point mutation and the wild type, and includes basic information such as associated energies, representations and distances. RNAMute can be used successfully with very little previous experience and without choosing any parameter value alongside the initial RNA sequence. The package runs under the Linux operating system. Conclusion RNAMute is a user-friendly tool that can be used to predict single point mutations leading to conformational rearrangements in the secondary structure of RNAs. In several cases of substantial interest, notably in virology, a point mutation may lead to a loss of important functionality such as the RNA virus replication and translation initiation because of a conformational rearrangement in the secondary structure.
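    The mutation-enumeration step described above is simple to sketch. The fragment below (function name and labeling convention are illustrative, not RNAMute's API; in the real tool, structure prediction of each mutant is delegated to the Vienna RNA package) generates all single point mutants of an RNA sequence:

```python
BASES = "ACGU"

def point_mutations(seq):
    """Yield (label, mutant) for every single point mutation of an RNA sequence."""
    for i, wild in enumerate(seq):
        for b in BASES:
            if b != wild:
                # Label uses the common wild-position-mutant convention, e.g. "G1A".
                yield f"{wild}{i + 1}{b}", seq[:i] + b + seq[i + 1:]

muts = dict(point_mutations("GCAU"))
print(len(muts))  # 4 positions x 3 alternative bases = 12 mutants
```

    Each mutant would then be folded and its predicted structure compared against the wild type to assign it to a category.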

  7. Tools for Nonlinear Control Systems Design

    Science.gov (United States)

    Sastry, S. S.

    1997-01-01

    This is a brief statement of the research progress made on Grant NAG2-243 titled "Tools for Nonlinear Control Systems Design", which ran from 1983 till December 1996. The initial set of PIs on the grant were C. A. Desoer, E. L. Polak and myself (for 1983). From 1984 till 1991 Desoer and I were the PIs and finally I was the sole PI from 1991 till the end of 1996. The project has been an unusually longstanding and extremely fruitful partnership, with many technical exchanges, visits, workshops and new avenues of investigation begun on this grant. There were student visits, long-term visitors on the grant and many interesting joint projects. In this final report I will only give a cursory description of the technical work done on the grant, since there was a tradition of annual progress reports and a proposal for the succeeding year. These progress reports cum proposals are attached as Appendix A to this report. Appendix B consists of papers by me and my students as co-authors, sorted chronologically. When there are multiple related versions of a paper, such as a conference version and a journal version, they are listed together. Appendix C consists of papers by Desoer and his students as well as 'solo' publications by other researchers supported on this grant, similarly sorted chronologically.

  8. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  9. SOFTWARE TOOLS FOR COMPUTING EXPERIMENT AIMED AT MULTIVARIATE ANALYSIS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    A. V. Tyurin

    2015-09-01

    Full Text Available A concept for the organization and planning of a computational experiment aimed at multivariate analysis of complex multifactor models is proposed. It is based on the generation of a calculations tree. The logical and structural schemes of the tree are given, together with software tools for automating work with it: generating calculations, carrying them out and analysing the obtained results. Computer modeling systems and such special-purpose systems as RACS and PRADIS do not solve the problems connected with carrying out a computational experiment effectively, namely its organization, planning, execution and analysis of the results. Calculation data storage for organizing the computational experiment is proposed in the form of an input and output data tree. Each tree node holds a reference to the calculation of a model step performed earlier, and the calculations tree is stored in a specially organized directory structure. A software tool is proposed for creating and modifying a design scheme that stores the structure of one branch of the calculation tree, allowing effective planning of multivariate calculations. A set of special-purpose software tools enables quick generation and modification of the tree and the addition of calculations with step-by-step changes in the model factors. To perform calculations, a software environment in the form of a graphical user interface for creating and modifying calculation scripts has been developed. This environment makes it possible to traverse the calculation tree in a certain order and to initiate computational modules serially or in parallel. To analyze the results, a software tool operating on a tag tree has been developed: a special tree that stores the input and output data of the calculations as sets of changes of the corresponding model factors. The tool enables selection of the factors and responses of the model at various steps.
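    The calculations-tree idea in the abstract above can be sketched minimally as follows; the class and attribute names are illustrative and not taken from the described tools:

```python
class CalcNode:
    """One step of the computational experiment: its factors plus a reference
    to the earlier calculation it extends."""

    def __init__(self, factors, parent=None):
        self.factors = factors        # model factors changed at this step
        self.parent = parent          # earlier calculation this step builds on
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def effective_factors(self):
        """Factors accumulated along the branch, the nearest node winning."""
        merged = dict(self.parent.effective_factors()) if self.parent else {}
        merged.update(self.factors)
        return merged

root = CalcNode({"step": 0, "load": 1.0})
variant = CalcNode({"load": 2.0}, parent=root)   # step-by-step change of one factor
print(variant.effective_factors())               # {'step': 0, 'load': 2.0}
```

    Because each node stores only the factors that change, variant branches share earlier results, which is what makes the directory-backed tree an economical store for multivariate calculations.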

  10. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code closely correlates to predictions on bearing internal load distributions, stiffness, deflection and stresses.

  11. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using micro-end mills at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating the tool life, and optimizing the process. A numerical analysis and experimental method are presented to investigate the chatter stability in the micro-end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formula to predict cutting forces and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces between helical ball and square end mills through time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation for micro-end milling with straight-tooth and helical-tooth end mills is conducted based on the machine-tool system frequency response function obtained through modal experiment. The forces and displacements are predicted and the simulation results for the different cutter geometries are compared in detail. The simulation results are of practical significance for the actual milling process.

  12. Aviation Environmental Design Tool (AEDT) System Architecture

    Science.gov (United States)

    2007-01-29

    The Federal Aviation Administration's Office of Environment and Energy (FAA-AEE) is : developing a comprehensive suite of software tools that will allow for thorough assessment of the environmental effects of aviation. The main goal of the effort is ...

  13. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a media for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistleblowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools and removing any contents that may have been hidden, as well as any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and make them available for the general public. The results presented in this work can also be seen as a useful

  14. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.
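    As a minimal illustration of the bookkeeping behind net energy analysis (the function name and figures below are placeholders, not values from the studies mentioned above):

```python
def net_energy_ratio(energy_out, energy_invested):
    """Lifetime energy delivered divided by total energy invested in
    construction, the fuel cycle and operation (same units for both)."""
    return energy_out / energy_invested

# Placeholder figures: 600 PJ delivered over the plant lifetime, 40 PJ invested.
print(net_energy_ratio(600.0, 40.0))  # 15.0
```

    A ratio well above 1 indicates the option returns substantially more energy than it consumes; comparing ratios across candidate systems is the accounting exercise the abstract describes.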

  15. Hybrid-Electric Aircraft TOGW Development Tool with Empirically-Based Airframe and Physics-Based Hybrid Propulsion System Component Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Hybrid-Electric distributed propulsion (HEDP) is becoming widely accepted and new tools will be required for future development. This Phase I SBIR proposal creates a...

  16. Design package for fuel retrieval system fuel handling tool modification

    International Nuclear Information System (INIS)

    TEDESCHI, D.J.

    1998-01-01

    This is a design package that contains the details for a modification to a tool used for moving fuel elements during loading of MCO Fuel Baskets for the Fuel Retrieval System. The tool is called the fuel handling tool (or stinger). This document contains requirements, development design information, tests, and test reports

  17. Expert Systems as Tools for Technical Communicators.

    Science.gov (United States)

    Grider, Daryl A.

    1994-01-01

    Discusses expertise, what an expert system is, what an expert system shell is, what expert systems can and cannot do, knowledge engineering and technical communicators, and planning and managing expert system projects. (SR)

  18. Advanced software tools for digital loose part monitoring systems

    International Nuclear Information System (INIS)

    Ding, Y.

    1996-01-01

    The paper describes two software modules as analysis tools for digital loose part monitoring systems. The first module is called the acoustic module, which utilizes the multi-media features of modern personal computers to replay the digitally stored short-time bursts with sufficient length and in good quality. This is possible due to the so-called puzzle technique developed at ISTec. The second module is called the classification module, which calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the back propagation algorithm. (author). 7 refs, 7 figs

  19. Systems analysis with an integrated medical analysis system (IMAS)

    Science.gov (United States)

    Koelsch, John; Mabry, Susan L.; Rodriquez, Samuel; Takemura, Paul

    1998-05-01

    This paper describes the integrated medical analysis system (IMAS). The evolving system consists of an integrated suite of models and tools providing quantitative and dynamic analysis from multiple physiological function models, clinical care patient input, medical device data, and integrated medical systems. The system is being developed for requirements definition, patient assessment, control theory, training, instrumentation testing and validation. Traditionally, human models and simulations are performed on small scale, isolated problems, usually consisting of detached mathematical models or measurement studies. These systems are not capable of portraying the interactive effects of such systems and certainly are not capable of integrating multiple external entities such as device data, patient data, etc. The human body in and of itself is a complex, integrated system. External monitors, treatments, and medical conditions interact at yet another level. Hence, a highly integrated, interactive simulation system with detailed subsystem models is required for effective quantitative analysis. The current prototype emphasizes cardiovascular, respiratory and thermoregulatory functions with integration of patient device data. Unique system integration of these components is achieved through four facilitators. These facilitators include a distributed interactive computing architecture, application of fluid and structural engineering principles to the models, real-time scientific visualization, and application of strong system integration principles. The IMAS forms a complex analytical tool with emphasis on integration and interaction at multiple levels between components. This unique level of integration and interaction facilitates quantitative analysis for multiple purposes and varying levels of fidelity. An overview of the project and preliminary findings are introduced.

  20. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  1. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  2. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used, e.g., in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  3. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems in recent years have seen an increase in complexity due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining a multidimensional data analysis can be performed, fraud can be detected and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions for data mining, but all boil down to the same idea: a process that extracts new information from large data collections, information that without the aid of data mining tools would be very difficult to obtain. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  4. Buffer$--An Economic Analysis Tool

    Science.gov (United States)

    Gary Bentrup

    2007-01-01

    Buffer$ is an economic spreadsheet tool for analyzing the cost-benefits of conservation buffers by resource professionals. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...

  5. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. Due to its specific structure, the UML sequence diagram is selected for a deeper analysis of element layout. The authors' research surveys the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  6. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  7. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  8. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel is presented. The results are in the form of community noise metrics and

  9. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
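    The backtracking idea described above, reduced to a toy: given each event's alternative AND-sets of causes, enumerate the combinations of basic events that can produce a chosen top event. This is an illustrative sketch of fault-tree backtracking in general, not the LFM procedure itself, and the event names are invented:

```python
def backtrack(event, gates):
    """Return the basic-event combinations (as frozensets) leading to event.

    `gates` maps an event to a list of AND-sets (tuples of causes); the list
    itself is an OR over alternative ways the event can occur. An event absent
    from `gates` is a basic event.
    """
    if event not in gates:
        return [frozenset([event])]
    cut_sets = []
    for and_set in gates[event]:          # alternative (OR) causes of event
        combos = [frozenset()]
        for cause in and_set:             # every cause in an AND-set must occur
            combos = [c | s for c in combos for s in backtrack(cause, gates)]
        cut_sets.extend(combos)
    return cut_sets

# Hypothetical example: the top event occurs if the valve sticks, OR if the
# sensor fails AND the software times out.
gates = {"top": [("valve_stuck",), ("sensor_fail", "sw_timeout")]}
print(backtrack("top", gates))
```

    Each returned set is one combination of lower-level events that reaches the top event, which is exactly the output the generated fault trees summarize.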

  10. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
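    The comparison step can be illustrated with a simplified stand-in statistic: the maximum normalized cross-correlation between two depth profiles over a range of lags. The published likelihood-ratio machinery is not reproduced here; this only shows the kind of similarity score such tests operate on, and the profile data are invented:

```python
from statistics import mean, pstdev

def max_norm_xcorr(a, b, max_lag=5):
    """Maximum normalized cross-correlation between two equal-length depth profiles."""
    def norm(x):
        m, s = mean(x), pstdev(x)
        return [(v - m) / s for v in x]
    a, b = norm(a), norm(b)
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(a[i], b[i + lag]) for i in range(len(a)) if 0 <= i + lag < len(b)]
        if pairs:
            best = max(best, sum(x * y for x, y in pairs) / len(pairs))
    return best

mark = [0.1, 0.5, 0.9, 0.4, 0.2, 0.6]     # toy depth contour
same_tool = [v + 0.01 for v in mark]      # nearly identical profile; the offset is removed by normalization
print(round(max_norm_xcorr(mark, same_tool), 3))  # close to 1.0 for matching profiles
```

    In a likelihood-ratio setting, scores like this from field-vs-lab comparisons are weighed against scores from lab-vs-lab comparisons of marks made by the same tool.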

  11. Computer tools for systems engineering at LaRC

    Science.gov (United States)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  12. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
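    For a channel backed by a rigid wall, a one-dimensional transmission line model reduces to the standard acoustics relation Z_in = -j Z0 cot(kL). The sketch below uses that textbook form with illustrative numbers; the rigid-backing assumption and all values are ours, not taken from the patent:

```python
import math

def input_impedance(z0, k, length):
    """Input impedance of a channel of depth `length` terminated by a rigid
    wall: Z_in = -j * z0 * cot(k * length)."""
    return complex(0.0, -z0 / math.tan(k * length))

c = 343.0                      # speed of sound in air, m/s
f = 1000.0                     # frequency, Hz
k = 2.0 * math.pi * f / c      # acoustic wavenumber, 1/m
z0 = 413.0                     # characteristic impedance of air, rayl (approx.)
z = input_impedance(z0, k, 0.02)   # 2 cm deep channel
print(z.imag)                  # purely reactive; negative (stiffness-like) at this depth
```

    Re-evaluating this relation as the user redraws the channel geometry is the kind of real-time surface impedance update the abstract describes.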

  13. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  14. IIS--Integrated Interactome System: a web-based platform for the annotation, analysis and visualization of protein-metabolite-gene-drug interactions by integrating a variety of data sources and tools.

    Science.gov (United States)

    Carazzolle, Marcelo Falsarella; de Carvalho, Lucas Miguel; Slepicka, Hugo Henrique; Vidal, Ramon Oliveira; Pereira, Gonçalo Amarante Guimarães; Kobarg, Jörg; Meirelles, Gabriela Vaz

    2014-01-01

    High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives in the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and result in a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) Annotation module, which assigns annotations from several databases for the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather novel identified interactions, protein and metabolite expression/concentration levels, subcellular localization and computed topological metrics, GO biological processes and KEGG pathways enrichment. This module generates a XGMML file that can be imported into Cytoscape or be visualized directly on the web. We have developed IIS by integrating diverse databases, responding to the need for appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two

  15. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.
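
The selection and categorisation logic the review describes (include a tool if it has ≥ 50 citations or was reported in the Metabolomics Society survey, then group by instrument type and functionality) can be sketched as follows; the tool records here are made up for illustration and are not entries from the actual review.

```python
# Hypothetical sketch of the review's inclusion criteria and grouping.
TOOLS = [
    {"name": "ToolA", "citations": 120, "in_survey": False,
     "instrument": "LC-MS", "function": "pre-processing"},
    {"name": "ToolB", "citations": 12, "in_survey": True,
     "instrument": "NMR", "function": "statistical analysis"},
    {"name": "ToolC", "citations": 30, "in_survey": False,
     "instrument": "GC-MS", "function": "workflow"},
]

def selected(tool):
    # Included if >= 50 Web of Science citations OR reported in the survey.
    return tool["citations"] >= 50 or tool["in_survey"]

def categorise(tools):
    # Group the selected tools by (instrument type, functionality).
    groups = {}
    for t in filter(selected, tools):
        groups.setdefault((t["instrument"], t["function"]), []).append(t["name"])
    return groups

print(categorise(TOOLS))  # ToolC is excluded: < 50 citations and not in survey
```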

  16. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
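
A surrogate rating model of the kind described above is, at its core, a regression of log-transformed SSC on the acoustic parameter. The sketch below fits an ordinary least-squares line of log10(SSC) on backscatter; the data points are synthetic and purely illustrative, not output from the SAID tool itself.

```python
# Minimal surrogate-rating sketch: OLS of log10(SSC) on acoustic backscatter.
import math

backscatter = [60.0, 65.0, 70.0, 75.0, 80.0]    # surrogate parameter, dB (synthetic)
ssc         = [20.0, 45.0, 95.0, 210.0, 430.0]  # suspended-sediment conc., mg/L (synthetic)

x = backscatter
y = [math.log10(c) for c in ssc]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar

def predict_ssc(db):
    """Real-time SSC estimate (mg/L) from a backscatter reading."""
    return 10 ** (intercept + slope * db)
```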

  17. A Lexical Analysis Tool with Ambiguity Support

    OpenAIRE

    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
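
The idea of a lexical analysis graph can be illustrated with a toy enumerator (this is not Lamb itself, and the token definitions are hypothetical): it produces every token sequence that covers the input, which is exactly the set of sequences such a graph represents compactly for the parser to filter.

```python
# Toy lexical ambiguity: 'for' is both a keyword and a valid identifier.
TOKEN_DEFS = {
    "FOR": ["for"],
    "ID": ["for", "form", "f", "o", "r", "m"],
}

def tokenizations(s):
    """Return all sequences of (token, lexeme) pairs covering s."""
    if not s:
        return [[]]
    results = []
    for tok, lexemes in TOKEN_DEFS.items():
        for lex in lexemes:
            if s.startswith(lex):
                for rest in tokenizations(s[len(lex):]):
                    results.append([(tok, lex)] + rest)
    return results

seqs = tokenizations("form")
# A parser consuming the graph would keep only the sequences that form a
# valid syntactic sentence and discard the rest.
```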

  18. System for exchanging tools and end effectors on a robot

    International Nuclear Information System (INIS)

    Burry, D.B.; Williams, P.M.

    1991-01-01

    A system and method for exchanging tools and end effectors on a robot permits exchange during a programmed task. The exchange mechanism is located off the robot, thus reducing the mass of the robot arm and permitting smaller robots to perform designated tasks. A simple spring/collet mechanism mounted on the robot is used, which permits engagement and disengagement of the tool or end effector without requiring rotational orientation of the tool to the end effector/collet interface. Because the tool-changing system is not located on the robot arm, no umbilical cords are needed on the robot. 12 figures

  19. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    Science.gov (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we're looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  20. ATENA – A tool for engineering analysis of fracture in concrete

    Indian Academy of Sciences (India)

    Cervenka, Vladimir; Cervenka, Jan; Pukl, Radomir

    ATENA – A tool for engineering analysis of fracture in concrete. VLADIMIR CERVENKA, JAN CERVENKA and RADOMIR PUKL. Cervenka Consulting, Prague, Czech Republic e-mail: cervenka@cervenka.cz. Abstract. Advanced constitutive models implemented in the finite element system ATENA serve as rational tools to ...

  1. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, it can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
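
The core of any such power simulation is an energy-balance time step over the battery. The sketch below is a deliberately simplified, hypothetical stand-in (a real tool like MMPAT also models bus-voltage control, heater thermostats, charge efficiency, and so on): each step charges or discharges the battery by the net of source power and load power.

```python
# Simplified spacecraft power-subsystem time stepping (illustrative only).
def simulate(capacity_wh, soc0, steps):
    """steps: list of (solar_w, load_w, dt_h). Returns state of charge after each step."""
    energy = soc0 * capacity_wh            # stored energy, Wh
    history = []
    for solar_w, load_w, dt_h in steps:
        net = (solar_w - load_w) * dt_h    # surplus charges, deficit discharges
        energy = min(capacity_wh, max(0.0, energy + net))
        history.append(energy / capacity_wh)
    return history

# One simple orbit profile: two sunlit charging steps, then an eclipse step.
profile = [(300.0, 150.0, 0.5), (300.0, 150.0, 0.5), (0.0, 150.0, 0.5)]
print(simulate(1000.0, 0.8, profile))
```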

  2. Geographical information systems (GIS), a great tool for urban silviculture

    International Nuclear Information System (INIS)

    Otaya Burbano, Leodan Andres; Sanchez Zapata, Robinson de Jesus; Morales Soto, Leon; Botero Fernandez, Veronica

    2006-01-01

    As a pilot phase to acquire experience, define methodologies and determine the advantages of Geographical Information Systems (GIS) applied to urban silviculture, an inventory, diagnosis, management plan and economic appraisal were made for the urban forest in the Magnolia neighborhood of the Envigado municipality, department of Antioquia, Colombia. For the management and analysis of the data collected in the field, a database was designed using Microsoft Access. The species inventoried were mapped digitally, and their conditions (characteristics, required silviculture practices, and environmental conflicts) were analyzed using tools and extensions of the ArcGIS 8.3 architecture. It was determined that a GIS analysis of the urban forest conducted for a specific neighborhood can be a tool that permits environmental authorities and interested researchers to have agile and easy access to the information stored in it; it permits programming of required silviculture activities; it also permits having a general vision of the urban forest according to the infrastructure of the neighborhood, complemented by photographs of the area for improved illustration; and it permits the inclusion or elimination of information in a rapid and simple manner, thus facilitating decision making in relation to management of the urban woodland and comparison with other similar studies

  3. A model of tool wear monitoring system for turning

    OpenAIRE

    Šimunović, Goran; Ficko, Mirko; Šarić, Tomislav; Milošević, Mijodrag; Antić, Aco

    2015-01-01

    Acquiring high-quality and timely information on the tool wear condition in real time, presents a necessary prerequisite for identification of tool wear degree, which significantly improves the stability and quality of the machining process. Defined in this paper is a model of tool wear monitoring system with special emphasis on the module for acquisition and processing of vibration acceleration signal by applying discrete wavelet transformations (DWT) in signal decomposition. The paper prese...

  4. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  5. A Tool for the Concise Analysis of Patient Safety Incidents.

    Science.gov (United States)

    Pham, Julius Cuong; Hoffman, Carolyn; Popescu, Ioana; Ijagbemi, O Mayowa; Carson, Kathryn A

    2016-01-01

    Patient safety incidents, sometimes referred to as adverse events, incidents, or patient safety events, are too common an occurrence in health care. Most methods for incident analysis are time and labor intensive. Given the significant resource requirements of a root cause analysis, for example, there is a need for a more targeted and efficient method of analyzing a larger number of incidents. Although several concise incident analysis tools are in existence, there are no published studies regarding their usability or effectiveness. Building on previous efforts, a Concise Incident Analysis (CIA) methodology and tool were developed to facilitate analysis of no- or low-harm incidents. Staff from 11 hospitals in five countries (Australia, Canada, Hong Kong, India, and the United States) pilot tested the tool in two phases. The tool was evaluated and refined after each phase on the basis of user perceptions of usability and effectiveness. From September 2013 through January 2014, 52 patient safety incidents were analyzed. A broad variety of incident types were investigated, the most frequent being patient falls (25%). Incidents came from a variety of hospital work areas, the most frequent being from the medical ward (37%). Most incidents investigated resulted in temporary harm or no harm (94%). All or most sites found the tool "understandable" (100%), "easy to use" (89%), and "effective" (89%). Some 95% of participants planned to continue to use all or some parts of the tool after the pilot. Qualitative feedback suggested that the tool allowed analysis of incidents that were not currently being analyzed because of insufficient resources. The tool was described as simple to use, easy to document, and aligned with the flow of the incident analysis. A concise tool for the investigation of patient safety incidents with low or no harm was well accepted across a select group of hospitals from five countries.

  6. Active ultrasound pattern injection system (AUSPIS) for interventional tool guidance.

    Directory of Open Access Journals (Sweden)

    Xiaoyu Guo

    Full Text Available Accurate tool tracking is a crucial task that directly affects the safety and effectiveness of many interventional medical procedures. Compared to CT and MRI, ultrasound-based tool tracking has many advantages, including low cost, safety, mobility and ease of use. However, surgical tools are poorly visualized in conventional ultrasound images, thus preventing effective tool tracking and guidance. Existing tracking methods have not yet provided a solution that effectively solves the tool visualization and mid-plane localization accuracy problem and fully meets the clinical requirements. In this paper, we present an active ultrasound tracking and guiding system for interventional tools. The main principle of this system is to establish a bi-directional ultrasound communication between the interventional tool and US imaging machine within the tissue. This method enables the interventional tool to generate an active ultrasound field over the original imaging ultrasound signals. By controlling the timing and amplitude of the active ultrasound field, a virtual pattern can be directly injected into the US machine B mode display. In this work, we introduce the time and frequency modulation, mid-plane detection, and arbitrary pattern injection methods. The implementation of these methods further improves the target visualization and guiding accuracy, and expands the system application beyond simple tool tracking. We performed ex vitro and in vivo experiments, showing significant improvements of tool visualization and accurate localization using different US imaging platforms. An ultrasound image mid-plane detection accuracy of ±0.3 mm and a detectable tissue depth over 8.5 cm was achieved in the experiment. The system performance is tested under different configurations and system parameters. We also report the first experiment of arbitrary pattern injection to the B mode image and its application in accurate tool tracking.
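
The timing principle behind pattern injection can be illustrated with a back-of-the-envelope calculation, assuming the usual pulse-echo convention (apparent depth = c·t/2) and a nominal soft-tissue sound speed of 1540 m/s. This is an illustration of the physics only, not the actual AUSPIS control logic.

```python
# Pulse-echo timing: when must an active element fire so its signal
# appears at a chosen depth on the B-mode display?
C_TISSUE = 1540.0  # nominal speed of sound in soft tissue, m/s

def injection_delay(depth_m):
    """Delay (s) after the imaging pulse for an injected echo at depth_m."""
    return 2.0 * depth_m / C_TISSUE

def apparent_depth(delay_s):
    """Depth (m) at which a signal fired delay_s after the pulse appears."""
    return C_TISSUE * delay_s / 2.0

delay = injection_delay(0.085)  # the 8.5 cm depth reported in the experiment
```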

  7. Bayesian data analysis tools for atomic physics

    Science.gov (United States)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during the last years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
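
The satellite-line question above (is a line present or not?) is a model-comparison problem: compute the evidence of each model and take their ratio. The sketch below uses brute-force grid integration over uniform priors rather than the nested sampling Nested_fit actually uses, and the counts, line shape, and prior ranges are all synthetic choices for illustration.

```python
# Bayesian model comparison by grid-integrated evidence (illustrative only).
import math

data = [4, 5, 3, 9, 12, 8, 4, 5]               # synthetic counts per channel
line_shape = [0, 0, 0, 0.5, 1.0, 0.5, 0, 0]    # normalised satellite profile

def log_likelihood(bkg, amp):
    ll = 0.0
    for n, s in zip(data, line_shape):
        mu = bkg + amp * s
        ll += n * math.log(mu) - mu - math.lgamma(n + 1)  # Poisson log-pmf
    return ll

def evidence(amps):
    """Average the likelihood over a uniform prior grid: bkg in [1, 15] x amps."""
    bkgs = [1 + 14 * i / 99 for i in range(100)]
    vals = [math.exp(log_likelihood(b, a)) for b in bkgs for a in amps]
    return sum(vals) / len(vals)

z1 = evidence([0.0])                             # model 1: background only
z2 = evidence([10 * i / 49 for i in range(50)])  # model 2: background + line
bayes_factor = z2 / z1  # > 1 favours the presence of the satellite line
```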

  8. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K
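
The hierarchical decomposition TARGET captures (tasks broken into subtasks down to basic action kernels, viewable as a text hierarchy) can be sketched with a small tree structure; the data model and the example task here are hypothetical, not TARGET's internal representation.

```python
# Sketch of a hierarchical task decomposition rendered as an indented report.
class Step:
    def __init__(self, label, substeps=None):
        self.label = label
        self.substeps = substeps or []

    def report(self, depth=0):
        """Return the task hierarchy as indented text lines."""
        lines = ["  " * depth + self.label]
        for s in self.substeps:
            lines.extend(s.report(depth + 1))
        return lines

task = Step("Replace pump seal", [
    Step("Isolate pump", [Step("Close inlet valve"), Step("Lock out power")]),
    Step("Remove housing"),
])
print("\n".join(task.report()))
```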

  9. Theoretical analysis tools in building business competitiveness

    Directory of Open Access Journals (Sweden)

    Yeisson Diego Tamayo

    2015-12-01

    Full Text Available Due to the internationalization of markets through free trade agreements and the standardization process in Colombia, companies increasingly seek to satisfy the needs of their customers, so business management systems carry considerable weight in the phases of business development. In Colombia, management systems have boomed in the financial sector, so much so that there is a quality manual for financial supervision in Colombia, but at the microeconomic level firms have not developed this topic, or at least there is no evidence of such development. It is therefore necessary to analyze business management models at the international level in order to identify which elements or strategies can be applied at each stage of business development, based on the measurement of indicator variables of management compliance in the Colombian context.

  10. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  11. Conception of a PWR simulator as a tool for safety analysis

    International Nuclear Information System (INIS)

    Lanore, J.M.; Bernard, P.; Romeyer Dherbey, J.; Bonnet, C.; Quilchini, P.

    1982-09-01

    A simulator can be a very useful tool for safety analysis to study accident sequences involving malfunctions of the systems and operator interventions. The main characteristics of the simulator SALAMANDRE (description of the systems, physical models, programming organization, control desk) have then been selected according to the objectives of safety analysis

  12. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  13. Constructing Social Networks From Secondary Storage With Bulk Analysis Tools

    Science.gov (United States)

    2016-06-01

    Naval Postgraduate School thesis, Monterey, California, by Janina L. Green: "Constructing Social Networks from Secondary Storage with Bulk Analysis Tools." Recovered excerpt: "...displays more than one social network that needs to be separated. The drive owner of Component d9c1 was an administrator for an email server, and would..."

  14. EXPERT SYSTEMS - DEVELOPMENT OF AGRICULTURAL INSURANCE TOOL

    Directory of Open Access Journals (Sweden)

    NAN Anca-Petruţa

    2013-07-01

    Full Text Available Because specialty agricultural assistance is not always available when farmers need it, we identified expert systems as a strong instrument with extended potential in agriculture. Their use has recently grown in scale, covering all socio-economic fields of activity; they collect data on different aspects from human experts in order to assist the user in the steps needed to solve problems at the performance level of the expert, making the expert's acquired knowledge and experience available. We opted for a general presentation of expert systems and of their necessity because a solution for developing the agricultural system can come from artificial intelligence: implementing expert systems in the field of agricultural insurance, promoting existing insurance products, and helping farmers find options according to their needs and possibilities. The objective of this article is to collect data about specific areas of interest in agricultural insurance, prepare the database, and present the concept of a pilot version that will grow constantly richer depending on the answers received from agricultural producers, with the clearest possible exposure of the knowledge base. The choice of this theme is justified by the fact that, even though agricultural insurance plays a very important role in agricultural development, the results registered so far are modest, which is why solutions need to be found for developing the agricultural sector. The importance of this work consists in proposing an immediately viable solution corresponding to the current needs of agricultural producers, and an innovative one, namely the implementation of expert systems in agricultural insurance as a way of promoting insurance products. Our research, even though it treats the subject at a conceptual level, it wants to undertake an

  15. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems....

  16. Comparison of emerging diagnostic tools for large commercial HVAC systems

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, Hannah; Piette, Mary Ann

    2001-04-06

    Diagnostic software tools for large commercial buildings are being developed to help detect and diagnose energy and other performance problems with building operations. These software applications utilize energy management control system (EMCS) trend log data. Due to the recent development of diagnostic tools, there has been little detailed comparison among the tools and a limited awareness of tool capabilities by potential users. Today, these diagnostic tools focus mainly on air handlers, but the opportunity exists for broadening the scope of the tools to include all major parts of heating, cooling, and ventilation systems in more detail. This paper compares several tools in the following areas: (1) Scope, intent, and background; (2) Data acquisition, pre-processing, and management; (3) Problems detected; (4) Raw data visualization; (5) Manual and automated diagnostic methods and (6) Level of automation. This comparison is intended to provide practitioners and researchers with a picture of the current state of diagnostic tools. There is tremendous potential for these tools to help improve commercial building energy and non-energy performance.
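
One of the simplest automated diagnostics such tools run against EMCS trend logs is a rule check for simultaneous heating and cooling in an air handler. The sketch below shows the idea; the record format, field names, and threshold are invented for illustration and do not come from any of the tools compared in the paper.

```python
# Rule-based fault detection on (hypothetical) air-handler trend-log records.
trend_log = [
    {"time": "08:00", "heat_pct": 0,  "cool_pct": 40},
    {"time": "08:15", "heat_pct": 25, "cool_pct": 35},  # simultaneous heat/cool
    {"time": "08:30", "heat_pct": 0,  "cool_pct": 0},
]

def simultaneous_heat_cool(log, threshold=5):
    """Flag timestamps where heating and cooling valves are both open."""
    return [r["time"] for r in log
            if r["heat_pct"] > threshold and r["cool_pct"] > threshold]

print(simultaneous_heat_cool(trend_log))
```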

  17. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  18. Integrating decision support tools and environmental information systems: a case study on the Province of Milan

    International Nuclear Information System (INIS)

    Bagli, S.; Pistocchi, A.; Mazzoli, P.; Valentini, P.

    2006-01-01

    The paper demonstrates an application of advanced decision support tools within the framework of the environmental information system of the Province of Milan. These tools include environmental simulation models, multi-criteria analysis, risk analysis and environmental accounting for marketable emission permits. After describing the general structure of the system, three demonstrational case studies are introduced concerning: groundwater pollution management; atmospheric pollution management; and urban environmental quality perception and management. In the conclusion, the potential use of tools like the ones implemented by the Province of Milan within the framework of Local Agenda 21 processes is recalled [it

  19. Tool for Insider Threat Detection in Corporative Information Systems

    Directory of Open Access Journals (Sweden)

    Victor Sergeevich Vedeneev

    2014-02-01

    Full Text Available Systems and tools for insider threat detection are described. The different meanings of the term “insider”, types of insiders, examples of insider motivation, and typical insider actions are presented.

  20. Recent advances in systems metabolic engineering tools and strategies.

    Science.gov (United States)

    Chae, Tong Un; Choi, So Young; Kim, Je Woong; Ko, Yoo-Sung; Lee, Sang Yup

    2017-10-01

    Metabolic engineering has been playing increasingly important roles in developing microbial cell factories for the production of various chemicals and materials to achieve sustainable chemical industry. Nowadays, many tools and strategies are available for performing systems metabolic engineering that allows systems-level metabolic engineering in more sophisticated and diverse ways by adopting rapidly advancing methodologies and tools of systems biology, synthetic biology and evolutionary engineering. As an outcome, development of more efficient microbial cell factories has become possible. Here, we review recent advances in systems metabolic engineering tools and strategies together with accompanying application examples. In addition, we describe how these tools and strategies work together in simultaneous and synergistic ways to develop novel microbial cell factories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Analyzing Real-Time Systems: Theory and Tools

    DEFF Research Database (Denmark)

    Hune, Thomas Seidelin

    The main topic of this dissertation is the development and use of methods for formal reasoning about the correctness of real-time systems, in particular methods and tools to handle new classes of problems. In real-time systems the correctness of the system does not only depend on the order in which...... actions take place, but also the timing of the actions. The formal reasoning presented here is based on (extensions of) the model of timed automata and tools supporting this model, mainly UPPAAL. Real-time systems are often part of safety critical systems e.g. control systems for planes, trains......, or factories, though also everyday electronics as audio/video equipment and (mobile) phones are considered real-time systems. Often these systems are concurrent systems with a number of components interacting, and reasoning about such systems is notoriously difficult. However, since most of the systems...

  2. Toolkit for a systems analysis framework of the EU bioeconomy : overview of WP2 in the EU FP 7 SAT-BBE project: systems analysis tools framework for the EU Bio-Based Economy Strategy

    NARCIS (Netherlands)

    Leeuwen, van M.G.A.; Meijl, van J.C.M.; Smeets, E.M.W.

    2014-01-01

    The objective of the SAT-BBE project is to describe, monitor and model the bioeconomy part of the economic system, by the development of an appropriate conceptual toolkit. In WP1, the concepts of bioeconomy and non-bioeconomy sectors have been defined, the major interactions and feedback effects

  3. Enterprise KM System: IT based Tool for Nuclear Malaysia

    International Nuclear Information System (INIS)

    Mohamad Safuan Sulaiman; Siti Nurbahyah Hamdan; Mohd Dzul Aiman Aslan

    2014-01-01

    Implementing the right and suitable tool for an enterprise Knowledge Management (KM) system in an organization is not an easy task. Everything needs to be taken into account before the implementation becomes reality. One requirement is ensuring that the entire organization cooperates fully so that the knowledge-sharing culture built on the tool succeeds. From the selection of potential tools to the implementation and deployment strategies, everything must be thoroughly and carefully organized. A study on choosing suitable tools and strategies has been done in Nuclear Malaysia as a result of the Process Oriented Knowledge Management (POKM) project. As far as enterprise KM systems are concerned, Microsoft SharePoint technology is one of the potential tools in this context. This paper articulates the approach and methodology of choosing the technology, including its planning, deployment and implementation strategies. (author)

  4. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks, such as, refactoring or code navigation, have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language...... is an obstacle for the development of these. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor....... In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type...
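
The flavor of pointer analysis involved can be conveyed with a tiny Andersen-style (flow-insensitive, inclusion-based) points-to computation over two constraint forms, allocations and copies; this is a generic textbook sketch, not the analysis developed in the dissertation, and it omits field and call constraints entirely.

```python
# Minimal Andersen-style points-to analysis: propagate allocation sites
# along copy edges until a fixed point is reached.
def points_to(news, copies):
    """news: list of (var, obj) allocations; copies: list of (dst, src)."""
    pts = {}
    for var, obj in news:
        pts.setdefault(var, set()).add(obj)
    changed = True
    while changed:                     # iterate to a fixed point
        changed = False
        for dst, src in copies:
            extra = pts.get(src, set()) - pts.get(dst, set())
            if extra:
                pts.setdefault(dst, set()).update(extra)
                changed = True
    return pts

# a = {}; b = a; c = b;  => all three variables may point to object o1
pts = points_to(news=[("a", "o1")], copies=[("b", "a"), ("c", "b")])
```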

  5. DEVELOPMENT OF A WIRELINE CPT SYSTEM FOR MULTIPLE TOOL USAGE

    Energy Technology Data Exchange (ETDEWEB)

    Stephen P. Farrington; Martin L. Gildea; J. Christopher Bianchi

    1999-08-01

    The first phase of development of a wireline cone penetrometer system for multiple tool usage was completed under DOE award number DE-AR26-98FT40366. Cone penetrometer technology (CPT) has received widespread interest and is becoming more commonplace as a tool for environmental site characterization activities at several Department of Energy (DOE) facilities. Although CPT already offers many benefits for site characterization, the wireline system can improve CPT technology by offering greater utility and increased cost savings. Currently the use of multiple CPT tools during a site characterization (i.e. piezometric cone, chemical sensors, core sampler, grouting tool) must be accomplished by withdrawing the entire penetrometer rod string to change tools. This results in multiple penetrations being required to collect the data and samples that may be required during characterization of a site, and to subsequently seal the resulting holes with grout. The wireline CPT system allows multiple CPT tools to be interchanged during a single penetration, without withdrawing the CPT rod string from the ground. The goal of the project is to develop and demonstrate a system by which various tools can be placed at the tip of the rod string depending on the type of information or sample desired. Under the base contract, an interchangeable piezocone and grouting tool was designed, fabricated, and evaluated. The results of the evaluation indicate that success criteria for the base contract were achieved. In addition, the wireline piezocone tool was validated against ASTM standard cones, the depth capability of the system was found to compare favorably with that of conventional CPT, and the reliability and survivability of the system were demonstrated.

  6. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  7. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: rittner@iqm.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Lab. de Fisico-Quimica Organica; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Qumica

    2011-07-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature in this research field imposed some limitations, which will be detailed in the Introduction; it should be noted in advance that these limitations mostly concern the period when these results were published. (author)

  8. NMR spectroscopy: a tool for conformational analysis

    International Nuclear Information System (INIS)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto; Freitas, Matheus P.

    2011-01-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature in this research field imposed some limitations, which will be detailed in the Introduction; it should be noted in advance that these limitations mostly concern the period when these results were published. (author)

  9. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  10. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
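
The kind of discounted-cash-flow computation underlying such a station finance tool can be sketched as follows. The cash-flow figures and the 8% discount rate below are purely hypothetical illustrations; this is not H2FAST's actual code or data.

```python
# Generic discounted-cash-flow metrics of the kind a station financial
# analysis tool computes. All numbers are illustrative, not H2FAST values.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 (investment) flow."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def simple_payback(cash_flows):
    """First year in which the cumulative undiscounted flow turns positive."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical station: $1M up-front build cost, $180k/yr net revenue for 10 yr
flows = [-1_000_000] + [180_000] * 10
print(round(npv(0.08, flows), 2))   # positive NPV at an 8% discount rate
print(simple_payback(flows))        # year in which cumulative flow breaks even
```

A real tool layers depreciation, taxes, and financing terms on top of this core, but the discounting step is the same.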

  11. Modeling Terrorism Risk to the Air Transportation System: An Independent Assessment of TSA’s Risk Management Analysis Tool and Associated Methods

    Science.gov (United States)

    2012-01-01

    clarifying misconceptions that the experts might have. Methods of incorporating group interaction into assessment include mathematically merging... physics or chemistry underlying some of the processes. This would increase the likelihood that RMAT...

  12. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
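
The concordancer mentioned above can be illustrated with a toy keyword-in-context (KWIC) sketch. The interface, window size, and sample corpus are made up for illustration; TranslatorBank's actual query system is more sophisticated.

```python
# Toy keyword-in-context (KWIC) concordancer: for each occurrence of a term,
# return a few words of left and right context, as a concordance display would.

def concordance(corpus, term, window=3):
    """Return (left-context, term, right-context) tuples for each hit."""
    hits = []
    for line in corpus:
        tokens = line.split()
        for i, tok in enumerate(tokens):
            if tok.lower() == term.lower():
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                hits.append((left, tok, right))
    return hits

corpus = ["The turbine blade rotates at high speed",
          "Each blade is inspected for cracks"]
for left, term, right in concordance(corpus, "blade"):
    print(f"{left:>30} | {term} | {right}")
```

Aligning the hits in columns, as the loop above does, is what lets a translator scan collocations of a term at a glance.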

  13. The database management system: A topic and a tool

    Science.gov (United States)

    Plummer, O. R.

    1984-01-01

    Data structures and data base management systems are common tools employed to deal with the administrative information of a university. An understanding of these topics is needed by a much wider audience, ranging from those interested in computer aided design and manufacturing to those using microcomputers. These tools are becoming increasingly valuable to academic programs as they develop comprehensive computer support systems. The wide use of these tools relies upon the relational data model as a foundation. Experience with the use of the IPAD RIM5.0 program is described.

  14. Communication and control tools, systems, and new dimensions

    CERN Document Server

    MacDougall, Robert; Cummings, Kevin

    2015-01-01

    Communication and Control: Tools, Systems, and New Dimensions advocates a systems view of human communication in a time of intelligent, learning machines. This edited collection sheds new light on things as mundane yet still profoundly consequential (and seemingly "low-tech") today as push buttons, pagers and telemarketing systems. Contributors also investigate aspects of "remote control" related to education, organizational design, artificial intelligence, and cyberwarfare.

  15. Choice and complexation of techniques and tools for assessment of NPP I and C systems safety

    International Nuclear Information System (INIS)

    Illiashenko, Oleg; Babeshko, Eugene

    2011-01-01

    There are many techniques for analyzing and assessing the reliability and safety of NPP Instrumentation and Control (I and C) systems (e.g. FMEA - Failure Modes and Effects Analysis and its modifications, FTA - Fault Tree Analysis, HAZOP - Hazard and Operability Analysis, RBD - Reliability Block Diagram, Markov models, etc.), and the number of tools based on these techniques is constantly increasing. Known approaches to safety assessment, as well as the problems of choosing and combining them, are analyzed. The objective of the paper is the development of a general 'technique for choosing techniques' and a tool to support it. The following criteria are used for analysis and comparison, and their features are described: compliance with normative documents; experience of application in industry; methods used for assessment of NPP I and C system safety; tool architecture/framework; reporting; vendor support, etc. Comparative analysis results for existing T and T - Tools and Techniques for safety analysis - are presented in matrix form ('Tools-Criterion') with an example. Features of combining different safety assessment techniques (FMECA, FTA, RBD, Markov models) are described. The proposed technique is implemented as a special decision-making tool. It was used in the development of the RPC Radiy company standard CS 66, a guide containing requirements and procedures for FMECA analysis of NPP I and C systems developed and produced on the RADIY platform. (author)
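
A 'Tools-Criterion' matrix of the sort described reduces, in its simplest form, to a weighted scoring table. The criteria weights and tool scores below are invented for illustration and do not come from the paper.

```python
# Sketch of a weighted "Tools-Criterion" selection matrix: each candidate tool
# is scored 1..5 per criterion, and a weighted total ranks the alternatives.
# Weights and scores here are hypothetical.

criteria = {"standards_compliance": 0.4,   # weights sum to 1.0
            "industry_experience": 0.3,
            "vendor_support": 0.3}

scores = {
    "FMECA tool A": {"standards_compliance": 5, "industry_experience": 4,
                     "vendor_support": 3},
    "FTA tool B":   {"standards_compliance": 4, "industry_experience": 5,
                     "vendor_support": 4},
}

def rank(scores, criteria):
    totals = {tool: sum(s[c] * w for c, w in criteria.items())
              for tool, s in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for tool, total in rank(scores, criteria):
    print(f"{tool}: {total:.2f}")
```

The value of formalizing the matrix is less the arithmetic than forcing the criteria and their relative weights to be stated explicitly.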

  16. Film analysis systems and applications

    Energy Technology Data Exchange (ETDEWEB)

    Yonekura, Y.; Brill, A.B.

    1981-01-01

    The different components that can be used in modern film analysis systems are reviewed. TV camera and charge-coupled device sensors coupled to computers provide low-cost systems for applications such as those described. The autoradiography (ARG) method provides an important tool for medical research and is especially useful for the development of new radiopharmaceutical compounds. Biodistribution information is needed for estimation of radiation dose, and for interpretation of the significance of observed patterns. The need for such precise information is heightened when one seeks to elucidate physiological principles/factors in normal and experimental models of disease. The poor spatial resolution achieved with current PET-imaging systems limits the information on radioreceptor mapping, neurotransmitter, and neuroleptic drug distribution that can be achieved from patient studies. The artful use of ARG in carefully-controlled animal studies will be required to provide the additional information needed to fully understand results obtained with this new important research tool. (ERB)

  17. Film analysis systems and applications

    International Nuclear Information System (INIS)

    Yonekura, Y.; Brill, A.B.

    1981-01-01

    The different components that can be used in modern film analysis systems are reviewed. TV camera and charge-coupled device sensors coupled to computers provide low-cost systems for applications such as those described. The autoradiography (ARG) method provides an important tool for medical research and is especially useful for the development of new radiopharmaceutical compounds. Biodistribution information is needed for estimation of radiation dose, and for interpretation of the significance of observed patterns. The need for such precise information is heightened when one seeks to elucidate physiological principles/factors in normal and experimental models of disease. The poor spatial resolution achieved with current PET-imaging systems limits the information on radioreceptor mapping, neurotransmitter, and neuroleptic drug distribution that can be achieved from patient studies. The artful use of ARG in carefully-controlled animal studies will be required to provide the additional information needed to fully understand results obtained with this new important research tool

  18. Raman microspectroscopy: a noninvasive analysis tool for monitoring of collagen-containing extracellular matrix formation in a medium-throughput culture system.

    Science.gov (United States)

    Kunstar, Aliz; Otto, Cees; Karperien, Marcel; van Blitterswijk, Clemens; van Apeldoorn, Aart

    2011-07-01

    The three-dimensional environment is known to play an important role in promoting cell-matrix interactions. We have investigated the possibility of using Raman microspectroscopy--which has the great advantage of noninvasive sensing--for in vitro monitoring of extracellular matrix (ECM) formation in a medium-throughput pellet (3D) culture system with soft-lithography agarose-microwell arrays. Chondrocytes were seeded in the agarose microwells in basic or chondrocyte medium. After 3, 7, and 14 days of culture, samples were analyzed for ECM formation by Raman microspectroscopy, histology, and immunofluorescence. ECM formation in the chondrocyte medium-cultured samples was detected by histology and immunofluorescence, and also noninvasively by Raman microspectroscopy. The Raman band of collagen found at 937 cm(-1) can be used as a Raman marker for collagen-containing ECM formation over time in the chondrocyte pellets. This culture system can be implemented as a medium-throughput platform for Raman applications and screening microtissue formation, since with these agarose-microwell arrays relatively large numbers of cell pellets could be screened in a short time in situ, without having to transfer the pellets onto microscopic slides. Moreover, in this manner the culture system is suitable for long-term, real-time live-cell measurements.
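
One simple way to track a marker band like the 937 cm⁻¹ collagen band across time points is to integrate intensity in a narrow window around it after subtracting a linear baseline. The sketch below uses synthetic spectra and is not the study's actual processing pipeline, which would also need cosmic-ray removal and normalization.

```python
# Quantify a Raman marker band by baseline-corrected integration over a
# window centered on the band. Spectra below are synthetic toy data.

def band_area(wavenumbers, intensities, center=937.0, half_width=10.0):
    window = [(w, i) for w, i in zip(wavenumbers, intensities)
              if abs(w - center) <= half_width]
    if len(window) < 2:
        return 0.0
    # linear baseline drawn between the window's endpoints
    (w0, i0), (wn, yn) = window[0], window[-1]
    slope = (yn - i0) / (wn - w0)
    return sum(i - (i0 + slope * (w - w0)) for w, i in window)

wn = list(range(900, 981))                 # 900..980 cm^-1 grid
flat = [100.0] * len(wn)                   # early time point: no band yet
peak = [100.0 + (30.0 if abs(w - 937) <= 5 else 0.0) for w in wn]  # later
print(band_area(wn, flat) < band_area(wn, peak))  # band area grows over time
```

Plotting the band area per pellet against culture day gives the noninvasive ECM-formation trend the abstract describes.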

  19. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also found to be invaluable at the design stage to influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effects on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermalhydraulic analysis and field data, fouling margins are calculated. Individual effects of primary- and secondary-side fouling are separated through analyses, which allow station operators to decide what type of maintenance activity to perform and when to perform the maintenance activity. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports. The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance

  20. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
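
The coupling the paper proposes, a continuous "system dynamics" state feeding back into discrete task events, can be illustrated with a toy model. The feedback rule and all parameters below are invented for illustration and are not the paper's model.

```python
# Toy coupling of a continuous state (staff productivity eroded by schedule
# pressure) with discrete task-completion events: each task's duration is a
# discrete event, but later tasks take longer because the continuous
# productivity state declines as elapsed time grows.

def simulate(task_sizes, base_rate=10.0, pressure_decay=0.02):
    """Run tasks in sequence; return (finish-week, productivity) per task."""
    week, productivity, log = 0.0, 1.0, []
    for size in task_sizes:
        duration = size / (base_rate * productivity)   # discrete event span
        week += duration
        # continuous feedback: longer elapsed time lowers productivity
        productivity = max(0.5, 1.0 - pressure_decay * week)
        log.append((round(week, 2), round(productivity, 3)))
    return log

schedule = simulate([50, 50, 50])   # identical tasks take longer and longer
print(schedule)
```

Even this tiny model shows the qualitative effect the paper exploits: schedule slip compounds, so a purely discrete model with fixed durations would underestimate total cost.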

  1. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  2. A computer aided tolerancing tool, II: tolerance analysis

    NARCIS (Netherlands)

    Salomons, O.W.; Haalboom, F.J.; Jonge poerink, H.J.; van Slooten, F.; van Slooten, F.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1996-01-01

    A computer aided tolerance analysis tool is presented that assists the designer in evaluating worst case quality of assembly after tolerances have been specified. In tolerance analysis calculations, sets of equations are generated. The number of equations can be restricted by using a minimum number
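
The worst-case evaluation the abstract refers to can be illustrated with a standard linear tolerance chain; this is a generic textbook sketch (worst-case versus root-sum-square stack-up), not the paper's constraint-equation method, and the chain values are made up.

```python
# Worst-case vs. root-sum-square (RSS) stack-up for a linear tolerance chain.
import math

def worst_case(tolerances):
    """Worst-case stack-up: all part tolerances add at their extremes."""
    return sum(abs(t) for t in tolerances)

def rss(tolerances):
    """Statistical (RSS) stack-up: assumes independent, centered variation."""
    return math.sqrt(sum(t * t for t in tolerances))

chain = [0.05, 0.10, 0.02, 0.08]                  # mm, hypothetical chain
print(f"worst case: +/-{worst_case(chain):.3f} mm")
print(f"RSS:        +/-{rss(chain):.3f} mm")       # always <= worst case
```

The gap between the two numbers is why tolerance analysis tools let the designer choose the assumption: worst-case guarantees assembly, RSS avoids over-tightening part tolerances.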

  3. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396). Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...

  4. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
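
One of the checks listed above, detection of unreachable code, amounts to a reachability walk over the control-flow graph. The sketch below uses a hand-made toy CFG; it is the general technique, not the output or internals of the actual 8086/68000 analysers.

```python
# Find unreachable basic blocks by a depth-first reachability walk over the
# control-flow graph (CFG). Blocks never reached from the entry are dead code.

def unreachable_blocks(cfg, entry):
    """cfg maps block -> list of successor blocks; returns unreached blocks."""
    seen, stack = set(), [entry]
    while stack:
        block = stack.pop()
        if block in seen:
            continue
        seen.add(block)
        stack.extend(cfg.get(block, []))
    return set(cfg) - seen

cfg = {"start": ["loop"],
       "loop": ["loop", "end"],    # self-edge: a (structured) loop
       "end": [],
       "dead": ["end"]}            # no incoming edge: unreachable
print(unreachable_blocks(cfg, "start"))
```

The same walk, run on the reversed graph from the exit block, finds blocks that can never reach program exit, one way to flag syntactically infinite loops.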

  5. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  6. Faculty Recommendations for Web Tools: Implications for Course Management Systems

    Science.gov (United States)

    Oliver, Kevin; Moore, John

    2008-01-01

    A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…

  7. ASOURCE: Source Term Analysis Tool for Advanced Fuel Cycle

    International Nuclear Information System (INIS)

    Cho, Dong Keun; Kook, Dong Hak; Choi, Jong Won; Choi, Heui Joo; Jeong, Jong Tae

    2012-01-01

    In 2007, the 3rd Comprehensive Nuclear Energy Promotion Plan, passed at the 254th meeting of the Atomic Energy Commission, was announced as an R and D action plan for the development of an advanced fuel cycle adopting a sodium-cooled fast reactor (SFR) in connection with a pyroprocess, for a sustainable and stable energy supply and a reduction in the amount of spent fuel (SF). It is expected that this fuel cycle can greatly reduce the SF inventory through a recycling process in which transuranics (TRU) and long-lived nuclides are burned in the SFR and cesium and strontium are disposed of after sufficient interim storage. For the success of the R and D plan, there are several issues related to source term analysis: (a) generation of inflow and outflow source terms of mixed SF in each process for the design of the pyroprocess facility, (b) source terms of mixed radwaste in a canister for the design of storage and disposal systems, (c) overall inventory estimation for TRU and long-lived nuclides for the design of the SFR, and (d) best-estimate source terms for the practical design of the interim storage facility for SFs. A source term evaluation for a SF or radwaste with a single irradiation profile can be easily accomplished with a conventional computation tool. However, source term assessment for a batch of SFs or a mixture of radwastes generated from SFs with different irradiation profiles, a task essential to support the aforementioned activities, is not possible with the conventional tool. Therefore, a hybrid computing program for source term analysis to support the advanced fuel cycle was developed.

  8. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    The purpose of this paper is to develop a simplified ship collision analysis tool in order to rapidly estimate the structural damage and energy absorption of both striking and struck ships, as well as to predict rupture of the cargo oil tanks of struck tankers. The present tool calculates external... to the collision scenario where a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large-scale FEA, and fairly good agreement is achieved. The applicability...

  9. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study was carried out on their academic achievement with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies; the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences); the mark obtained in the university entrance examination; and the sitting, of the two offered per year, at which that mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal addresses. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by its average geometric point in order to be correlated to their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  10. Hair Analysis as a Diagnostic and Forensic Tool in a Social Support System for Families with Underage Children and Drug Abusing Parents: Four Year Experience

    Directory of Open Access Journals (Sweden)

    Fritz Pragst

    2015-12-01

    Between 2011 and 2014, 388 hair samples from children (age 1-14 years) and 594 hair samples from parents/caregivers were analyzed for methadone, opiates, cocaine, amphetamines, ecstasy, cannabinoids and benzodiazepines by LC-MS/MS and GC-MS. Hair testing was preferentially performed on parents/caregivers. Children were included in case of positive parent/caregiver results or, in urgent cases, after a court decision if the parents/caregivers declined the testing. A follow-up hair test was performed after 6 months (positive results) or 12 months (negative results). Results: The children's hair appeared to be a sensitive indicator of the handling of drugs in their environment and often showed a similar or the same drug profile as their parents'. The annual statistics for the initial test of children were: no drugs 22-33%, methadone 9-25%, heroin 7.5-28%, cocaine 46-57%, amphetamines and ecstasy 1.6-7.0%, cannabinoids 21-49% and benzodiazepines 0-5%. As a rule, concentrations in the hair of younger children were higher than in that of their elder siblings. In severe cases, and based on the entire information about the case, the children were immediately taken into custody, the family court was appealed to, the children were admitted to children's homes or foster families, or the drug-consuming adult was separated from the family and a withdrawal treatment was initiated. Follow-up tests showed an improvement in the situation of the children, particularly in severe cases. Conclusion: Hair analysis proved to be a very efficient working instrument for social authorities in the systematic improvement of child welfare in drug-abusing environments.

  11. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  12. Information systems project management: methods, tools, and techniques

    OpenAIRE

    Mcmanus, John; Wood-Harper, Trevor

    2004-01-01

    Information Systems Project Management offers a clear and logical exposition of how to plan, organise and monitor projects effectively in order to deliver quality information systems within time, to budget and quality. This new book by John McManus and Trevor Wood-Harper is suitable for upper level undergraduates and postgraduates studying project management and Information Systems. Practising managers will also find it to be a valuable tool in their work. Managing information systems pro...

  13. Topics in expert system design methodologies and tools

    CERN Document Server

    Tasso, C

    1989-01-01

    Expert Systems are so far the most promising achievement of artificial intelligence research. Decision making, planning, design, control, supervision and diagnosis are areas where they are showing great potential. However, the establishment of expert system technology and its actual industrial impact are still limited by the lack of a sound, general and reliable design and construction methodology. This book has a dual purpose: to offer concrete guidelines and tools to the designers of expert systems, and to promote basic and applied research on methodologies and tools. It is a coordinated coll

  14. The ALICE analysis train system

    CERN Document Server

    Zimmermann, Markus

    2015-01-01

    In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.

  15. Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR) : An Open-source Tool to Compute Climate Statistic GIS Layers from Argonne Climate Modeling Results

    Science.gov (United States)

    2017-08-01

    DRAFT Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR) An Open-source Tool to Compute Climate Statistic GIS...Unidata 2017). This report includes: 1. User documentation for GATOR.py, an open-source Python tool that uses Environmental Systems Research...of climate change. GATOR can compute many additional statistics by using new combinations of the existing input parameters. The code is open source

  16. Tools and technologies for expert systems: A human factors perspective

    Science.gov (United States)

    Rajaram, Navaratna S.

    1987-01-01

    It is widely recognized that technologies based on artificial intelligence (AI), especially expert systems, can make significant contributions to the productivity and effectiveness of operations of information- and knowledge-intensive organizations such as NASA. At the same time, these being relatively new technologies, there is the problem of transferring technology to key personnel of such organizations. The problems of examining the potential of expert systems and of technology transfer are addressed in the context of human factors applications. One of the topics of interest was the investigation of the potential use of expert system building tools, particularly NEXPERT, as a technology transfer medium. Two basic conclusions were reached in this regard. First, NEXPERT is an excellent tool for rapid prototyping of experimental expert systems, but not ideal as a delivery vehicle. Therefore, it is not a substitute for general purpose system implementation languages such as LISP or C. This assertion probably holds for nearly all such tools on the market today. Second, an effective technology transfer mechanism is to formulate and implement expert systems for problems which members of the organization in question can relate to. For this purpose, the LIghting EnGineering Expert (LIEGE) was implemented using NEXPERT as the tool for technology transfer and to illustrate the value of expert systems to the activities of the Man-System Division.

  17. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  18. SYSTEMIC ASSESSMENT [SA] AS A TOOL TO ASSESS STUDENT ...

    African Journals Online (AJOL)

    Temechegn

    evaluate, measures the students' ability to correlate between concepts with reduced working memory load. Systemic Assessment Questions (SAQs) are the building units of systemic assessment. In this issue we use SA as a tool to assess student achievement in inorganic chemistry by taking sodium chemistry as a ...

  19. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  20. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    Science.gov (United States)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.
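The historical-performance statistics described above reduce, at their core, to aggregating per-event timing records. A minimal sketch of a latency summary by magnitude range follows; the event records and field names are illustrative assumptions, not ShakeAlert's actual database schema:

```python
import statistics

# Hypothetical per-event records: magnitude and alert latency in seconds
# (origin time to alert issue time).  The real system reads these from
# its PostgreSQL database.
events = [
    {"mag": 3.2, "latency": 5.1},
    {"mag": 4.7, "latency": 6.8},
    {"mag": 3.9, "latency": 4.4},
    {"mag": 5.5, "latency": 8.2},
]

def latency_summary(events, min_mag=0.0):
    """Median and maximum alert latency for events above a magnitude cut."""
    lats = [e["latency"] for e in events if e["mag"] >= min_mag]
    return statistics.median(lats), max(lats)

med_all, max_all = latency_summary(events)               # all events
med_big, max_big = latency_summary(events, min_mag=4.0)  # larger events only
```

The same filter-then-aggregate pattern extends to time periods and geographic regions by adding further predicates to the list comprehension.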

  1. Fiscal system analysis - contractual systems

    International Nuclear Information System (INIS)

    Kaiser, M.J.

    2006-01-01

    Production sharing contracts are one of the most popular forms of contractual system used in petroleum agreements around the world, but the manner in which the fiscal terms and contract parameters impact system measures is complicated and not well understood. The purpose of this paper is to quantify the influence of private and market uncertainty in contractual fiscal systems. A meta-modelling approach is employed that couples the results of a simulation model with regression analysis to construct numerical functionals that quantify the fiscal regime. Relationships are derived that specify how the present value, rate of return, and take statistics vary as a function of the system parameters. The deepwater Girassol field development in Angola is taken as a case study. (author)
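The meta-modelling step, coupling simulation output with regression to obtain numerical functionals for the fiscal regime, can be sketched as follows. The response surface, parameter ranges and coefficients below are invented for illustration and are not the paper's actual Girassol model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical "simulation runs": contractor present value (PV, $MM) as a
# function of oil price and the government profit-oil share.  The linear
# response surface below stands in for a full production-sharing
# cash-flow simulation.
n_runs = 200
price = rng.uniform(40, 100, n_runs)   # oil price, $/bbl
share = rng.uniform(0.3, 0.7, n_runs)  # government profit-oil share
pv = 12.0 + 0.85 * price - 45.0 * share + rng.normal(0, 0.5, n_runs)

# Meta-model: ordinary least-squares regression of simulated PV on the
# system parameters, yielding a numerical functional for the regime.
X = np.column_stack([np.ones(n_runs), price, share])
coef, *_ = np.linalg.lstsq(X, pv, rcond=None)
intercept, b_price, b_share = coef
```

Once fitted, the regression coefficients quantify how the take statistics respond to each contract parameter without re-running the simulator.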

  2. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the necessary understanding of application behavior at this scale, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of

  3. IBZM tool: a fully automated expert system for the evaluation of IBZM SPECT studies

    International Nuclear Information System (INIS)

    Buchert, Ralph; Wilke, Florian; Martin, Brigitte; Borczyskowski, Daniel von; Mester, Janos; Brenner, Winfried; Clausen, Malte; Berding, Georg

    2006-01-01

    Visual reading of [123I]IBZM SPECT scans depends on the experience of the interpreter. Therefore, semi-quantification of striatal IBZM uptake is commonly considered mandatory. However, semi-quantification is time consuming and prone to error, particularly if the volumes of interest (VOIs) are positioned manually. Therefore, the present paper proposes a new software tool ("IBZM tool") for fully automated and standardised processing, evaluation and documentation of [123I]IBZM SPECT scans. The IBZM tool is an easy-to-use SPM toolbox. It includes automated procedures for realignment and summation of multiple frames (motion correction), stereotactic normalisation, scaling, VOI analysis of the striatum-to-reference ratio R, classification of R and standardised display. In order to evaluate the tool, which was developed at the University of Hamburg, it was transferred to the University of Hannover. There it was applied to 27 well-documented subjects: eight patients with multi-system atrophy (MSA), 12 patients with Parkinson's disease (PD) and seven controls. The IBZM tool was compared with manual VOI analysis. The sensitivity and specificity of the IBZM tool for the differentiation of the MSA subjects from the controls were 100% and 86%, respectively. The IBZM tool provided improved statistical power compared with manual VOI analysis. The IBZM tool is an expert system for the detection of reduced striatal D2 availability on [123I]IBZM SPECT scans. The standardised documentation supports visual and semi-quantitative evaluation, and it is useful for presenting the findings to the referring physician. The IBZM tool has the potential for widespread use, since it appears to be fairly independent of the performance characteristics of the particular SPECT system used. The tool is available free of charge. (orig.)

  4. IBZM tool: a fully automated expert system for the evaluation of IBZM SPECT studies

    Energy Technology Data Exchange (ETDEWEB)

    Buchert, Ralph; Wilke, Florian; Martin, Brigitte; Borczyskowski, Daniel von; Mester, Janos; Brenner, Winfried; Clausen, Malte [University Medical Center Hamburg-Eppendorf, Department of Nuclear Medicine, Hamburg (Germany); Berding, Georg [University School of Medicine Hannover, Department of Nuclear Medicine, Hannover (Germany)

    2006-09-15

    Visual reading of [{sup 123}I]IBZM SPECT scans depends on the experience of the interpreter. Therefore, semi-quantification of striatal IBZM uptake is commonly considered mandatory. However, semi-quantification is time consuming and prone to error, particularly if the volumes of interest (VOIs) are positioned manually. Therefore, the present paper proposes a new software tool (''IBZM tool'') for fully automated and standardised processing, evaluation and documentation of [{sup 123}I]IBZM SPECT scans. The IBZM tool is an easy-to-use SPM toolbox. It includes automated procedures for realignment and summation of multiple frames (motion correction), stereotactic normalisation, scaling, VOI analysis of striatum-to-reference ratio R, classification of R and standardised display. In order to evaluate the tool, which was developed at the University of Hamburg, the tool was transferred to the University of Hannover. There it was applied to 27 well-documented subjects: eight patients with multi-system atrophy (MSA), 12 patients with Parkinson's disease (PD) and seven controls. The IBZM tool was compared with manual VOI analysis. The sensitivity and specificity of the IBZM tool for the differentiation of the MSA subjects from the controls were 100% and 86%, respectively. The IBZM tool provided improved statistical power compared with manual VOI analysis. The IBZM tool is an expert system for the detection of reduced striatal D{sub 2} availability on [{sup 123}I]IBZM SPECT scans. The standardised documentation supports visual and semi-quantitative evaluation, and it is useful for presenting the findings to the referring physician. The IBZM tool has the potential for widespread use, since it appears to be fairly independent of the performance characteristics of the particular SPECT system used. The tool is available free of charge. (orig.)

  5. Remote controlled tool systems for nuclear sites have subsea applications

    International Nuclear Information System (INIS)

    Bath, B.; Yemington, C.; Kuhta, B.

    1995-10-01

    Remotely operated underwater tool systems designed to operate in Nuclear Fuel Storage Basins can be applied to deep water, subsea oilfield applications. Spent nuclear fuel rods are stored underwater in large indoor swimming pool-like facilities where the water cover shields the workers from the radiation. This paper describes three specialized tooling systems that were designed and built by Sonsub for work at the Department of Energy's Hanford site. The Door Seal Tool removed an existing seal system, cleaned a 20 ft. tall, carbon steel, underwater hatch and installed a new stainless steel gasket surface with underwater epoxy. The Concrete Sampling Tool was built to take core samples from the vertical, concrete walls of the basins. The tool has three hydraulic drills with proprietary hollow core drill bits to cut and retrieve the concrete samples. The Rack Saw remotely attached itself to a structure, cut a variety of steel shapes and pipes, and retained the cut pieces for retrieval. All of these systems are remotely operated with onboard video cameras and debris collection systems. The methods and equipment proven in this application are available to refurbish sealing surfaces and to drill or sample concrete in offshore oil field applications

  6. Eating tools in hand activate the brain systems for eating action: a transcranial magnetic stimulation study.

    Science.gov (United States)

    Yamaguchi, Kaori; Nakamura, Kimihiro; Oga, Tatsuhide; Nakajima, Yasoichi

    2014-07-01

    There is increasing neuroimaging evidence suggesting that visually presented tools automatically activate the human sensorimotor system coding learned motor actions relevant to the visual stimuli. Such crossmodal activation may reflect a general functional property of human motor memory and thus can be operating in other, non-limb effector organs, such as the orofacial system involved in eating. In the present study, we predicted that somatosensory signals produced by eating tools in hand covertly activate the neuromuscular systems involved in eating action. In Experiments 1 and 2, we measured the motor evoked potential (MEP) of the masseter muscle in normal humans to examine the possible impact of tools in hand (chopsticks and scissors) on the neuromuscular systems during the observation of food stimuli. We found that eating tools (chopsticks) enhanced the masseter MEPs more than other tools (scissors) during the visual recognition of food, although this covert change in motor excitability was not detectable at the behavioral level. In Experiment 3, we further observed that chopsticks overall increased MEPs more than scissors and that this tool-driven increase of MEPs was greater when participants viewed food stimuli than when they viewed non-food stimuli. A joint analysis of the three experiments confirmed a significant impact of eating tools on the masseter MEPs during food recognition. Taken together, these results suggest that eating tools in hand exert a category-specific impact on the neuromuscular system for eating. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    Science.gov (United States)

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended, or is supposed to be performed with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  8. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended, or is supposed to be performed with other command line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  9. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    Science.gov (United States)

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 (www.comparativego.com). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
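The core of any GO analysis is tallying annotation terms over a gene list so that category frequencies can be compared between conditions. A minimal sketch follows; the gene names and annotations are invented for illustration and are not drawn from the Comparative GO tool or the Gene Ontology Consortium files:

```python
from collections import Counter

# Hypothetical gene -> GO "Biological Process" annotations.  A real
# analysis would load these from Gene Ontology Consortium annotation files.
annotations = {
    "gene1": ["apoptotic process", "cell differentiation"],
    "gene2": ["apoptotic process"],
    "gene3": ["lipid metabolic process"],
    "gene4": ["cell differentiation"],
}

def go_profile(genes):
    """Tally GO terms over a gene list and return relative frequencies."""
    counts = Counter(term for g in genes for term in annotations.get(g, []))
    total = sum(counts.values())
    return {term: n / total for term, n in counts.items()}

# "apoptotic process" accounts for 2 of the 4 annotations in this list.
profile = go_profile(["gene1", "gene2", "gene4"])
```

Comparing such profiles between two gene lists (e.g. up- versus down-regulated genes) is the "comparative" step; an enrichment test would then assess which differences are significant.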

  10. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quanti...

  11. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environment dimensions, has changed the ways products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regards to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but they are unbalanced and not holistic. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  12. Free and open source simulation tools for the design of power processing units for photovoltaic systems

    Directory of Open Access Journals (Sweden)

    Sergio Morales-Hernández

    2015-06-01

    Full Text Available Renewable energy sources, including solar photovoltaic, require electronic circuits that serve as an interface between the transducer device and the device or system that uses the energy. Moreover, the energy efficiency and the cost of the system can be compromised if such an electronic circuit is not designed properly. Given that the electrical characteristics of photovoltaic devices are nonlinear and that the most efficient electronic circuits for power processing are naturally discontinuous, a detailed dynamic analysis to optimize the design is required. This analysis should be supported by computer simulation tools. In this paper a comparison between two software tools for dynamic system simulation is performed to determine their usefulness in the design process of photovoltaic systems, mainly in what corresponds to the power processing units. Using a photovoltaic system for battery charging as a case study, it was determined that the Scicoslab tool was the most suitable.

  13. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
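The calibration step described above, adjusting layer hydraulic conductivities until simulated flows match measured flows in a least-squares sense, can be illustrated with a toy linear forward model. The geometry factor, layer thicknesses and conductivities below are assumptions of this sketch, not AnalyzeHOLE's actual MODFLOW/PEST formulation:

```python
import numpy as np

# Toy forward model standing in for the axisymmetric MODFLOW run: the flow
# gained across each screened interval is taken as proportional to that
# layer's hydraulic conductivity (thickness b and a lumped geometry/gradient
# factor c are illustrative assumptions).
b = np.array([3.0, 5.0, 2.0])   # layer thicknesses, m
c = 0.04                        # lumped geometry/gradient factor

def simulate_flows(K):
    """Cumulative borehole flow measured above each layer."""
    return np.cumsum(c * b * K)

K_true = np.array([8.0, 1.5, 20.0])                        # m/day
measured = simulate_flows(K_true) + np.array([0.01, -0.02, 0.01])  # + noise

# Calibration (the role PEST plays): minimize squared differences between
# simulated and measured flows.  For this linear model that is just least
# squares on the cumulative-sum design matrix.
A = np.tril(np.ones((3, 3))) * (c * b)   # row i sums contributions of layers 0..i
K_est, *_ = np.linalg.lstsq(A, measured, rcond=None)
```

In the real tool the forward model is a nonlinear MODFLOW simulation, so PEST iterates this fit rather than solving it in one step, and regularization limits how much conductivity may vary within a lithology.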

  14. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. Such a link could provide the possibility for the designer to work both with the aesthetics as well as the technical aspects of architectural design.

  15. An Automated Tool for Supporting FMEAs of Digital Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.; Lehner, J.

    2008-09-07

    Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was proposed for this purpose by decomposing the system to the level of the generic digital components and propagating failure modes to the system level, which generally is time-consuming and difficult to implement. To overcome the associated issues of implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is in essence a simulation platform developed by using or recreating the original source code of the different software modules, interfaced by input and output variables that represent physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on associated signals are determined first, and the variables that correspond to these signals are then modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
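The failure-mode injection described above, overriding a signal variable during simulation and checking a system-failure criterion, can be sketched in a few lines. The proportional controller, signal names and 20% failure criterion are illustrative assumptions, not the DFWCS internals:

```python
# Minimal sketch of failure-mode injection in a control-loop simulation.

def run_loop(setpoint, steps, inject=None):
    """Simulate a proportional level controller.  `inject`, when given,
    replaces the sensor reading each step (a stuck-signal failure mode)."""
    level = 0.0
    for _ in range(steps):
        sensed = level if inject is None else inject
        valve = 0.5 * (setpoint - sensed)   # proportional control action
        level += 0.1 * valve                # simple first-order plant response
    return level

nominal = run_loop(setpoint=10.0, steps=200)              # converges near 10
stuck = run_loop(setpoint=10.0, steps=200, inject=0.0)    # sensor stuck low

# Failure criterion: loss of automatic control if the final level error
# exceeds 20% of the setpoint.
failed = abs(stuck - 10.0) / 10.0 > 0.2
```

Iterating such a loop over every failure mode of every generic component, and recording which runs trip the criterion, yields exactly the mode-to-system-failure mapping an FMEA table needs.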

  16. Tools for T-RFLP data analysis using Excel.

    Science.gov (United States)

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
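
    Two of the steps the template automates, applying a noise baseline threshold and normalising a T-RF profile to relative abundance, can be sketched as follows. The function names, the simple fixed-threshold rule, and the example peak heights are illustrative assumptions, not the template's actual macros.

```python
def apply_noise_threshold(profile, threshold):
    """Drop fragments whose peak height falls below the noise baseline."""
    return {trf: h for trf, h in profile.items() if h >= threshold}

def normalise(profile):
    """Convert peak heights to relative abundance (fractions sum to 1)."""
    total = sum(profile.values())
    return {trf: h / total for trf, h in profile.items()}

# profile: terminal restriction fragment length (bp) -> peak height
raw = {72: 1500.0, 95: 40.0, 143: 3000.0, 210: 500.0}
clean = apply_noise_threshold(raw, threshold=50.0)   # the 95 bp peak is noise
rel = normalise(clean)
print(rel[143])
```

    The same two functions could equally be applied to peak area data, mirroring the template's option to base all calculations on either heights or areas.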

  17. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species-specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide a better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  18. System Performance and Layered Analysis Tool

    Science.gov (United States)

    2011-12-01

    end points are determined by the gate positions. The path between each start and end is modeled using a Brownian bridge (Karlin and Taylor, 1981). ... Karlin, S. and H.M. Taylor. 1981. A Second Course in Stochastic Processes. Piscataway, New Jersey: Institute of Electrical and Electronics Engineers, Inc.
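
    A Brownian bridge, as cited above, is a Wiener path conditioned to hit a fixed endpoint. A minimal sketch of sampling one between given start and end points (step count, variance, and seed are arbitrary illustrative choices):

```python
import random

def brownian_bridge(start, end, n_steps, sigma=1.0, seed=0):
    rng = random.Random(seed)
    # Unconditioned random walk W sampled at n_steps + 1 points.
    w = [0.0]
    for _ in range(n_steps):
        w.append(w[-1] + rng.gauss(0.0, sigma))
    # Bridge transform: B(t) = start + (W(t) - (t/T) W(T)) + (t/T)(end - start)
    T = n_steps
    return [start + (w[t] - (t / T) * w[T]) + (t / T) * (end - start)
            for t in range(T + 1)]

path = brownian_bridge(start=0.0, end=5.0, n_steps=100)
print(path[0], path[-1])   # pinned exactly at the start and end points
```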

  19. Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool

    Science.gov (United States)

    Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.

    2011-01-01

    This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…

  20. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  1. Living Color Frame System: PC graphics tool for data visualization

    Science.gov (United States)

    Truong, Long V.

    1993-01-01

    Living Color Frame System (LCFS) is a personal computer software tool for generating real-time graphics applications. It is highly applicable for a wide range of data visualization in virtual environment applications. Engineers often use computer graphics to enhance the interpretation of data under observation. These graphics become more complicated when 'run time' animations are required, such as found in many typical modern artificial intelligence and expert systems. Living Color Frame System solves many of these real-time graphics problems.

  2. DYNAMIC PROGRAMMING – EFFICIENT TOOL FOR POWER SYSTEM EXPANSION PLANNING

    Directory of Open Access Journals (Sweden)

    SIMO A.

    2015-03-01

    Full Text Available The paper focuses on the use of dynamic programming for power system expansion planning (EP) – transmission network (TNEP) and distribution network (DNEP). The EP problem has been approached from the retrospective and prospective point of view. To achieve this goal, the authors develop two software tools in the Matlab environment. Two techniques have been tackled: particle swarm optimization (PSO) and genetic algorithms (GA). The case study refers to the 25-bus test power system developed within the Power Systems Department.
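
    The PSO technique mentioned above can be sketched on a toy cost function in place of a real network investment model. The inertia and acceleration parameters (w, c1, c2) are common textbook defaults, not values from the paper, and the quadratic cost is purely illustrative.

```python
import random

def pso(cost, dim, n_particles=20, iters=200, lo=-10.0, hi=10.0, seed=1):
    """Minimise `cost` over `dim` dimensions with a basic particle swarm."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Toy "expansion cost" with its minimum at (3, -2).
best, c = pso(lambda x: (x[0] - 3) ** 2 + (x[1] + 2) ** 2, dim=2)
```

    In an actual TNEP/DNEP formulation the cost function would evaluate investment plus operating cost of a candidate line/feeder configuration rather than a closed-form quadratic.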

  3. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    Science.gov (United States)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30° sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15° of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
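
    The arc geometry described above can be sketched directly: each arc lies upwind at a radius of wind speed times lead time and spans plus or minus 15° of the average upper-level wind direction. The flat-earth nautical-mile offsets and the function name are simplifying assumptions for illustration, not the MIDDS/AWIPS implementation.

```python
import math

def anvil_arc_points(wind_dir_deg, wind_speed_kt, hours, half_width_deg=15, n=7):
    """Points (x, y) in n mi on the arc `hours` upwind of the site at (0, 0).

    wind_dir_deg is the meteorological direction the wind blows FROM, so
    the upwind bearing from the site equals the wind direction itself.
    """
    radius = wind_speed_kt * hours          # n mi travelled per hour of lead time
    pts = []
    for k in range(n):
        bearing = wind_dir_deg - half_width_deg + 2 * half_width_deg * k / (n - 1)
        rad = math.radians(bearing)
        # Bearing convention: 0° = north (+y), 90° = east (+x).
        pts.append((radius * math.sin(rad), radius * math.cos(rad)))
    return pts

# 250° wind at 30 kt: the one-hour arc lies 30 n mi upwind of the site.
arc1 = anvil_arc_points(wind_dir_deg=250, wind_speed_kt=30, hours=1)
```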

  4. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted....

  5. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, geochemistry and so on. Given the advantages of NAA, however, a sample of small amount or a precious sample is the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crustal silicates and the like in rural/remote areas, while carbonaceous materials and heavy metals are concentrated in PM in urban areas because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air change with season. For example, crustal elements increase in the dry season, and sea salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during the winter season, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of some environmental samples, and source apportionment techniques are useful. (author)

  6. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit - SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large-scale, high-complexity applications.

  7. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit – SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large-scale, high-complexity applications.
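
    The two ideas SMI++ combines, finite state machines per sub-system and hierarchical state derivation by parent objects, can be sketched as follows. The class names, states, and the "all children equal" rule are illustrative assumptions, not SMI++ syntax or semantics.

```python
class Device:
    """Leaf object: a finite state machine reacting to commands."""
    TRANSITIONS = {
        "OFF": {"switch_on": "READY"},
        "READY": {"start": "RUNNING", "switch_off": "OFF"},
        "RUNNING": {"stop": "READY"},
    }

    def __init__(self, name):
        self.name, self.state = name, "OFF"

    def handle(self, action):
        # Unknown actions in the current state are ignored (state unchanged).
        self.state = self.TRANSITIONS[self.state].get(action, self.state)

class SubSystem:
    """Parent node: its state is derived from its children, and commands
    propagate down the hierarchy."""
    def __init__(self, children):
        self.children = children

    @property
    def state(self):
        states = {c.state for c in self.children}
        return states.pop() if len(states) == 1 else "MIXED"

    def command(self, action):
        for c in self.children:
            c.handle(action)

dcs = SubSystem([Device("hv"), Device("daq")])
dcs.command("switch_on")
dcs.command("start")
print(dcs.state)
```

    Rule-based recovery, the other half of the toolkit, would amount to the parent watching for states like "MIXED" or an error state and issuing corrective commands automatically.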

  8. An Online Image Analysis Tool for Science Education

    Science.gov (United States)

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  9. Tools of Audience Analysis in Contemporary Political Campaigns.

    Science.gov (United States)

    Friedenberg, Robert V.

    This paper examines two basic tools of audience analysis as they are used in contemporary political campaigning: public opinion polls and interpretations of voter statistics. The raw data used in the statistical analyses reported in this investigation come from national polls and voter statistics provided to Republican candidates running in local…

  10. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    ABSTRACT: Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a ...

  11. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations by predicting where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  12. Rapid Benefit Indicators (RBI) Spatial Analysis Tools - Manual

    Science.gov (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  13. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides the possibility to relate histopathological data with neuropsychological and clinical variables. The aid of this interactive visualization tool makes it possible to find unexpected conclusions beyond the insight provided by simple statistical analysis, as well as to improve neuroscientists’ productivity.

  14. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  15. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  16. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    Science.gov (United States)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  17. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  18. Urban solid waste collection system using mathematical modelling and tools of geographic information systems.

    Science.gov (United States)

    Arribas, Claudia Andrea; Blazquez, Carola Alejandra; Lamas, Alejandro

    2010-04-01

    A poorly designed urban solid waste collection system has an enormous impact on labour, operational and transport costs, and on society in general due to road contamination and negative effects on public health and the environment. This study proposes a methodology for designing an urban solid waste collection system. This methodology uses combinatorial optimisation and integer programming, and GIS tools, to minimise collection time and operational and transport costs while enhancing the current solid waste collection system. This methodology establishes feasible collection routes, determines an adequate vehicle fleet size and presents a comparative cost and sensitivity analysis of the results. The implementation of this methodology in a study case of a zone in Santiago yields significant cost savings in the total collection system.
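
    One building block of such route design can be sketched with a nearest-neighbour heuristic that orders collection points to shorten the route. The real study uses combinatorial optimisation and integer programming over GIS data; this greedy pass and the example coordinates are only illustrative.

```python
import math

def route_length(route, points):
    """Total Euclidean length of a route given as an index sequence."""
    return sum(math.dist(points[a], points[b])
               for a, b in zip(route, route[1:]))

def nearest_neighbour_route(points, start=0):
    """Visit all points, always moving to the closest unvisited one."""
    unvisited = set(range(len(points))) - {start}
    route = [start]
    while unvisited:
        last = points[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

# Collection points (e.g. block centroids, km coordinates), depot first.
pts = [(0, 0), (5, 5), (0, 1), (5, 4), (0, 2)]
route = nearest_neighbour_route(pts)
print(route, round(route_length(route, pts), 2))
```

    A heuristic like this gives only a feasible starting route; an integer-programming formulation would then enforce vehicle capacities and shift-length constraints while minimising cost exactly.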

  19. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    Eldridge, C.; Gagne, D.; Wilson, B.; Murray, J.; Gazze, C.; Feldman, Y.; Rorif, F.

    2015-01-01

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  20. A study of an intelligent FME system for SFCR tools

    International Nuclear Information System (INIS)

    Hassan, H.A.

    2008-01-01

    In the nuclear field, the accurate identification, tracking and history documentation of every nuclear tool, equipment or component is a key to safety, operational and maintenance excellence, and security of the nuclear reactor. This paper offers a study of the possible development of the present Foreign Material Exclusion (FME) system using an Intelligent Nuclear Tools Identification System (INTIS), which was created and customized for the Single Fuel Channel Replacement (SFCR) Tools. The conceptual design of the INTIS was presented, comparing the current and the proposed systems in terms of the time, the cost and the radiation doses received by the employees during the SFCR maintenance jobs. A model was created to help better understand and analyze the effects of deployment of the INTIS on the time, performance, accuracy, received dose and, finally, the total cost. The model may also be extended to solve other nuclear application problems. The INTIS is based on Radio Frequency Identification (RFID) smart tags which are networked with readers and service computers. The system software was designed to communicate with the network to provide the coordinate information for any component at any time. It also allows digital signatures for use and/or approval to use the components and automatically updates their Data Base Management System (DBMS) history in terms of the person performing the job, the time period and date of use. This feature, together with information on each part's life span, could be used in the planning process for predictive and preventive maintenance. As a case study, the model was applied to a pilot project for SFCR Tools FME. The INTIS automatically records all the tools to be used inside the vault and performs real-time tracking of any misplaced tool. It also automatically performs a continuous check of all tools, sending an alarm if any of the tools was left inside the vault after the job is done. Finally, a discussion of the results of the system
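
    The vault check described above, reconciling tags registered on entry against tags read on exit and alarming on anything left behind, can be sketched as a small registry. The class, tag IDs, and tool names are invented for illustration and are not from the INTIS design.

```python
class VaultFME:
    """Foreign Material Exclusion ledger for RFID-tagged tools."""
    def __init__(self):
        self.inside = {}                    # tag id -> tool description

    def check_in(self, tag, tool):
        """Tag read at the vault entrance: tool is now inside."""
        self.inside[tag] = tool

    def check_out(self, tag):
        """Tag read at the exit: tool has left the vault."""
        self.inside.pop(tag, None)

    def job_done_alarm(self):
        """Tools still inside when the job ends; non-empty means alarm."""
        return dict(self.inside)

vault = VaultFME()
vault.check_in("TAG-001", "torque wrench")
vault.check_in("TAG-002", "channel gauge")
vault.check_out("TAG-001")
print(vault.job_done_alarm())   # the channel gauge was left behind
```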

  1. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    Science.gov (United States)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
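
    Why correlated failures erode the value of redundancy can be shown with a beta-factor common-cause model; this model choice is an assumption for illustration, the paper does not state SMART's internal formulation. Each unit fails with probability p, and a fraction beta of that probability is a common cause that takes out both units at once.

```python
def dual_redundant_failure(p, beta):
    """First-order failure probability of a two-unit redundant system
    under a beta-factor common-cause model."""
    p_indep = (1 - beta) * p      # independent portion of unit failure
    p_cc = beta * p               # common-cause portion (fails both units)
    # System fails if both units fail independently, or the common cause hits.
    return p_indep ** 2 + p_cc

p = 0.01
print(dual_redundant_failure(p, beta=0.0))   # fully independent: p squared
print(dual_redundant_failure(p, beta=0.1))   # most of the benefit is lost
```

    With beta = 0 the redundant pair fails with probability p², a hundredfold improvement at p = 0.01; with just 10% common cause the system failure probability climbs back to roughly 0.1 p, which is the kind of effect a trade tool must capture before crediting a backup.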

  2. VisIt: Interactive Parallel Visualization and Graphical Analysis Tool

    Science.gov (United States)

    Department Of Energy (DOE) Advanced Simulation; Computing Initiative (ASCI)

    2011-03-01

    VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. See the table below for more details about the tool’s features. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

  3. IT-tools for Mechatronic System Engineering and Design

    DEFF Research Database (Denmark)

    Conrad, Finn; Sørensen, Torben; Andersen, T. O.

    2003-01-01

    and manufacturing IT-concept feasible and commercially attractive. An IT-tool concept for modelling, simulation and design of mechatronic products and systems is proposed in this paper. It builds on results from a Danish mechatronic research program focusing on intelligent motion control as well as results from...... for real-time experimentation, evaluation and validation of control laws and algorithms....

  4. The Predictive Validity of the Early Warning System Tool

    Science.gov (United States)

    Johnson, Evelyn; Semmelroth, Carrie

    2010-01-01

    The Early Warning System is a tool developed by the National High School Center to collect data on indicators including attendance, grade point average, course failures, and credits earned. These indicators have been found to be highly predictive of a student's likelihood of dropping out of high school in large, urban areas. The Early Warning…

  5. Systemic Assessment as a new tool for assessing students learning ...

    African Journals Online (AJOL)

    Systemic Assessment [SA] has been shown to be a highly effective new tool in raising the level of students' academic achievement, improving their ability to learn by enhancing the process of teaching and learning, and converting students from surface to deep learning. It also allows teachers to monitor students' learning ...

  6. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e.

  7. Faculty Usage of Library Tools in a Learning Management System

    Science.gov (United States)

    Leeder, Chris; Lonn, Steven

    2014-01-01

    To better understand faculty attitudes and practices regarding usage of library-specific tools and roles in a university learning management system, log data for a period of three semesters was analyzed. Academic departments with highest rates of usage were identified, and faculty users and nonusers within those departments were surveyed regarding…

  8. Analysis for Non-Traditional Security Challenges: Methods and Tools

    Science.gov (United States)

    2006-11-20

Course of Action; COCOM Combatant Commander; COI Community of Interest; CPB Cultural Preparation of the Battlefield; CPM Critical Path Method; DARPA Defense ... Secretary of Defense (Program Analysis and Evaluation); PACOM United States Pacific Command; PERT Program Evaluation Review Technique; PMESII Political ... (remainder of the scanned acronym list and figure text is illegible)

  9. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
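
The performance bounds mentioned in the abstract can be illustrated with a minimal sketch: a longest-path (critical-path) computation over a dataflow DAG gives a lower bound on schedule length regardless of how many processors are available. The node names and compute times below are illustrative assumptions, not data from the paper.

```python
# Critical-path lower bound on schedule length for a dataflow DAG.
from collections import defaultdict, deque

def critical_path_length(times, edges):
    """Longest weighted path through a DAG of task nodes (times: node -> cost)."""
    succ = defaultdict(list)
    indeg = {n: 0 for n in times}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    start = {n: 0 for n in times}   # earliest start time per node
    ready = deque(n for n in times if indeg[n] == 0)
    done, longest = 0, 0
    while ready:
        u = ready.popleft()
        done += 1
        finish = start[u] + times[u]
        longest = max(longest, finish)
        for v in succ[u]:
            start[v] = max(start[v], finish)  # v cannot start before u finishes
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    assert done == len(times), "graph has a cycle"
    return longest

# Hypothetical 4-task graph: A feeds B and C, both feed D.
times = {"A": 2, "B": 3, "C": 1, "D": 2}
edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
print(critical_path_length(times, edges))  # longest chain A->B->D = 7
```

No schedule on any number of processors can finish faster than this bound, which is why such graph analysis "effectively determines performance bounds" before any scheduler runs.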

  10. Modulated Tool-Path (MTP) Chip Breaking System

    Energy Technology Data Exchange (ETDEWEB)

    Graham, K. B.

    2010-04-01

    The Modulated Tool-Path (MTP) Chip Breaking System produces user-selectable chip lengths and workpiece finishes and is compatible with any material, workpiece shape, and depth of cut. The MTP chip breaking system consistently creates the desired size of chips regardless of workpiece size, shape, or material, and the machine operator does not need to make any adjustments during the machining operation. The system's programmer configures the part program that commands the machine tool to move in a specific fashion to deliver the desired part size, shape, chip length, and workpiece surface finish. The MTP chip breaking system helps manufacturers avoid the detrimental effects of continuous chips, including expensive repair costs, delivery delays, and hazards to personnel.
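
As a rough illustration of the principle behind modulated tool paths (not the MTP system's actual part-program logic), the sketch below computes the minimum instantaneous uncut chip thickness when an axial oscillation is superimposed on the feed: the chip breaks wherever the thickness reaches zero. The feed, amplitude, and oscillation rate are assumed values, and the formula is the textbook modulation-assisted machining condition.

```python
# Uncut chip thickness in modulated turning (textbook model, assumed parameters).
import math

def min_chip_thickness(feed, amp, cycles_per_rev=2.5, samples=3600):
    """h(theta) = feed + z(theta) - z(theta - 2*pi), where z is the axial
    oscillation; the chip breaks wherever h drops to zero."""
    z = lambda theta: amp * math.sin(cycles_per_rev * theta)
    h_min = min(feed + z(t) - z(t - 2 * math.pi)
                for t in (2 * math.pi * k / samples for k in range(samples)))
    return max(h_min, 0.0)

# With a half-integer cycles/rev, successive passes are out of phase and the
# thickness variation doubles, so chips break whenever 2*amp >= feed.
print(min_chip_thickness(0.1, 0.06))  # 0.0 -> chips break
print(min_chip_thickness(0.1, 0.04))  # positive -> continuous chip
```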

  11. The Development of a Humanitarian Health Ethics Analysis Tool.

    Science.gov (United States)

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools increasingly have become prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  12. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
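
The path search from hazard sources to vulnerable entities described in step 3 can be sketched as a plain reachability query over a directed graph; the component names below are hypothetical stand-ins, not drawn from the Orion models.

```python
# Hazard-to-target path finding over a directed system graph (BFS).
from collections import defaultdict, deque

def hazard_paths(edges, sources, targets):
    """Return each reachable target with one example path from a hazard source."""
    succ = defaultdict(list)
    for u, v in edges:
        succ[u].append(v)
    found = {}
    for s in sources:
        parent, seen = {s: None}, {s}
        q = deque([s])
        while q:  # breadth-first search from the hazard source
            u = q.popleft()
            for v in succ[u]:
                if v not in seen:
                    seen.add(v)
                    parent[v] = u
                    q.append(v)
        for t in targets:
            if t in seen and t not in found:
                path, n = [], t
                while n is not None:  # walk parents back to the source
                    path.append(n)
                    n = parent[n]
                found[t] = path[::-1]
    return found

# Hypothetical example: a software command fault propagating to the crew cabin.
edges = [("software_cmd", "valve_fault"),
         ("valve_fault", "hydrazine_leak"),
         ("hydrazine_leak", "crew_cabin")]
print(hazard_paths(edges, ["software_cmd"], ["crew_cabin"]))
```

Each returned path is a candidate scenario for the software integration testing mentioned in step 4.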

  13. Methods and tools for profiling and control of distributed systems

    Science.gov (United States)

    Sukharev, R.; Lukyanchikov, O.; Nikulchev, E.; Biryukov, D.; Ryadchikov, I.

    2018-02-01

This article is devoted to the topic of profiling and control of distributed systems. Distributed systems have a complex architecture: applications are distributed among various computing nodes, and many network operations are performed. It is therefore important to develop methods and tools for profiling distributed systems. The article analyzes and standardizes methods for profiling distributed systems that rely on simulation to conduct experiments and build a graph model of the system. The theory of queueing networks is used for simulation modeling of distributed systems receiving and processing user requests. To automate the above method of profiling distributed systems, a software application with a modular structure, similar to a SCADA system, was developed.
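
A single node of such a queueing-network model can be sketched with the standard M/M/1 formulas; this is a generic illustration of the modeling approach, not the authors' application.

```python
# Steady-state M/M/1 metrics for one node of a queueing network.
def mm1_metrics(arrival_rate, service_rate):
    """arrival_rate and service_rate in requests/second; requires rho < 1."""
    rho = arrival_rate / service_rate          # node utilization
    assert rho < 1, "node is unstable (arrivals outpace service)"
    L = rho / (1 - rho)                        # mean number in system
    W = 1 / (service_rate - arrival_rate)      # mean response time (Little's law: L = lambda * W)
    return {"utilization": rho, "mean_in_system": L, "mean_response": W}

# A node serving 10 req/s that receives 8 req/s runs at 80% utilization.
print(mm1_metrics(8.0, 10.0))
```

Chaining such nodes along the edges of the system's graph model is what the simulation experiments described above amount to.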

  14. SMART: Statistical Metabolomics Analysis-An R Tool.

    Science.gov (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10⁻⁴ in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10⁻⁴ in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm.
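
The ANCOVA-style association test quoted in the abstract can be sketched as a nested linear-model F-test: compare the residual sum of squares with and without the drug term, adjusting for covariates. This is a generic illustration, not SMART's R code, and the simulated data below are invented.

```python
# ANCOVA as a nested-model F-test: drug effect on a metabolite, adjusted for covariates.
import numpy as np
from scipy import stats

def ancova_pvalue(y, drug, covariates):
    """p-value for the drug term, given metabolite levels y and covariates."""
    n = len(y)
    X_red = np.column_stack([np.ones(n), covariates])   # covariates only
    X_full = np.column_stack([X_red, drug])             # covariates + drug
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    rss_red, rss_full = rss(X_red), rss(X_full)
    df1 = X_full.shape[1] - X_red.shape[1]              # extra parameters
    df2 = n - X_full.shape[1]                           # residual df
    F = (rss_red - rss_full) / df1 / (rss_full / df2)
    return stats.f.sf(F, df1, df2)

# Simulated cohort with a genuine drug effect on the metabolite.
rng = np.random.default_rng(0)
n = 100
cov = rng.normal(size=n)                  # e.g. age, standardized
drug = rng.integers(0, 2, size=n)         # 0/1 exposure indicator
y = 2.0 * drug + cov + rng.normal(scale=0.5, size=n)
print(ancova_pvalue(y, drug, cov))        # very small p: effect detected
```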

  15. SNAP: a tool for nuclear physical protection system modeling

    International Nuclear Information System (INIS)

    Engi, D.; Grant, F.H. III.

    1979-10-01

Nuclear safeguards systems are concerned, in part, with the physical protection of nuclear materials. The function of a physical protection system is to defend the facility against adversary activities which could lead to theft of nuclear material or to sabotage resulting in a radiological release. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard methodology for the evaluation of physical protection systems. This paper describes a detailed application of SNAP to a hypothetical nuclear facility.

  16. A Comparative Analysis of Cockpit Display Development Tools

    National Research Council Canada - National Science Library

    Gebhardt, Matthew

    2000-01-01

..., Virtual Application Prototyping System (VAPS), and Display Editor. The comparison exploits the analysis framework, establishing the advantages and disadvantages of the three software development suites...

  17. Formal Analysis Tools for the Synchronous Aspect Language Larissa

    Directory of Open Access Journals (Sweden)

    Stauch David

    2008-01-01

Full Text Available We present two tools for the formal analysis of the aspect language Larissa, which extends the simple synchronous language Argos. The first tool concerns the combination of design-by-contract with Larissa aspects, and shows how we can apply an aspect not only to a program, but to a specification of programs in form of a contract, and obtain a new contract. The second concerns aspect interferences, that is, aspects that influence each other in an unintended way if they are applied to the same program. We present a way to weave aspects in a less conflict-prone manner, and a means to detect remaining conflicts statically. These tools are quite powerful, compared to those available for other aspect languages.

  19. Folding System for the Clothes by a Robot and Tools

    OpenAIRE

    大澤, 文明; 関, 啓明; 神谷, 好承

    2004-01-01

The work of a home robot includes laundry. The purpose of this study is to find a means of folding the clothes and storing them in a drawer by a home robot. Because the shape of cloth tends to change in various ways depending on the situation, it is difficult for robot hands to fold the clothes. In this paper, we propose a realistic folding system for the clothes by a robot and tools. The function of a tool is folding the clothes in half by inserting the clothes using two plates. T...

  20. AUTOMATIC TOOL-CHANGING WITHIN THE RECONFIGURABLE MANUFACTURING SYSTEMS PARADIGM

    Directory of Open Access Journals (Sweden)

    J.E.T. Collins

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer requirements present in today’s global market. The systems are designed to offer adaptability in machining functions and processes. This adaptive capability requires access to a selection of tools. The development of reconfigurable manufacturing systems has mainly been focused on the machine tools themselves. Methods of supplying tools to these machines need to be researched. This paper does so, presenting a tool-changing unit that offers a solution to this need. It then discusses the enabling technologies that would allow for automatic integration and diagnostic abilities of the unit.

AFRIKAANSE OPSOMMING (translated): Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer requirements of today's global market. The systems were developed to offer adaptability with respect to machining functions and processes. These adaptive capabilities, however, require access to a variety of tools. The development of reconfigurable manufacturing systems has, however, focused mainly on the tools themselves. The manner in which these tools are made available to the machinery still needs to be researched. This article does exactly that and proposes a unit for the exchange of tools. The technologies that enable automatic integration and provide diagnostic capabilities are also discussed.

  1. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST's capabilities have been expanded to allow use in a large variety of problems, including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and mesoscopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  2. Is motion analysis a valid tool for assessing laparoscopic skill?

    Science.gov (United States)

    Mason, John D; Ansell, James; Warren, Neil; Torkington, Jared

    2013-05-01

    The use of simulation for laparoscopic training has led to the development of objective tools for skills assessment. Motion analysis represents one area of focus. This study was designed to assess the evidence for the use of motion analysis as a valid tool for laparoscopic skills assessment. Embase, MEDLINE and PubMed were searched using the following domains: (1) motion analysis, (2) validation and (3) laparoscopy. Studies investigating motion analysis as a tool for assessment of laparoscopic skill in general surgery were included. Common endpoints in motion analysis metrics were compared between studies according to a modified form of the Oxford Centre for Evidence-Based Medicine levels of evidence and recommendation. Thirteen studies were included from 2,039 initial papers. Twelve (92.3 %) reported the construct validity of motion analysis across a range of laparoscopic tasks. Of these 12, 5 (41.7 %) evaluated the ProMIS Augmented Reality Simulator, 3 (25 %) the Imperial College Surgical Assessment Device (ICSAD), 2 (16.7 %) the Hiroshima University Endoscopic Surgical Assessment Device (HUESAD), 1 (8.33 %) the Advanced Dundee Endoscopic Psychomotor Tester (ADEPT) and 1 (8.33 %) the Robotic and Video Motion Analysis Software (ROVIMAS). Face validity was reported by 1 (7.7 %) study each for ADEPT and ICSAD. Concurrent validity was reported by 1 (7.7 %) study each for ADEPT, ICSAD and ProMIS. There was no evidence for predictive validity. Evidence exists to validate motion analysis for use in laparoscopic skills assessment. Valid parameters are time taken, path length and number of hand movements. Future work should concentrate on the conversion of motion data into competency-based scores for trainee feedback.
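
The three metrics the review found valid (time taken, path length, number of hand movements) can be computed directly from tracked instrument positions; the movement-count threshold below is an assumption, since each commercial device defines a "movement" differently.

```python
# Motion-analysis metrics from tracked 3-D instrument positions.
import numpy as np

def motion_metrics(positions, timestamps, move_threshold=5.0):
    """positions: sequence of (x, y, z) in mm; timestamps in seconds.
    move_threshold (mm per sample) is an assumed cutoff for counting a movement."""
    positions = np.asarray(positions, dtype=float)
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)  # per-sample displacement
    return {
        "time_s": timestamps[-1] - timestamps[0],
        "path_length_mm": float(steps.sum()),
        "movements": int(np.sum(steps > move_threshold)),
    }

# Toy trace: one 10 mm move, a pause, then a small 5 mm adjustment.
print(motion_metrics([[0, 0, 0], [0, 0, 10], [0, 0, 10], [3, 4, 10]], [0, 1, 2, 3]))
```

Converting such raw metrics into competency-based scores is exactly the future work the review calls for.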

  3. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
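
The resonance condition described above can be sketched with the Strouhal relation f = St·U/D. The value St ≈ 0.2 for circular cylinders in the subcritical Reynolds range is a textbook approximation, and the lock-in band is an assumed tolerance, not NASA's criterion.

```python
# Vortex-shedding frequency and a simple lock-in check (textbook approximation).
def shedding_frequency(wind_speed, diameter, strouhal=0.2):
    """f = St * U / D; wind_speed in m/s, diameter in m, result in Hz."""
    return strouhal * wind_speed / diameter

def lock_in_risk(wind_speed, diameter, mode_hz, band=0.15):
    """Flag resonance risk when shedding falls within +/-band of a structural mode."""
    f = shedding_frequency(wind_speed, diameter)
    return abs(f - mode_hz) <= band * mode_hz

# A 4 m diameter vehicle in a 10 m/s wind sheds at 0.5 Hz: dangerous if the
# first bending mode sits near 0.5 Hz, harmless if it sits at 2 Hz.
print(shedding_frequency(10.0, 4.0))
print(lock_in_risk(10.0, 4.0, mode_hz=0.5), lock_in_risk(10.0, 4.0, mode_hz=2.0))
```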

  4. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    Science.gov (United States)

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-02

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time consuming and subject to bias. The present work describes the use of Procrustes analysis as diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification. Copyright © 2015 Elsevier B.V. All rights reserved.
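
The "Procrustes map" idea rests on orthogonal Procrustes rotation: rotate one projection onto another and measure the residual, which is near zero when the two PP projections agree up to rotation/reflection. A minimal sketch, not the authors' implementation:

```python
# Orthogonal Procrustes similarity between two low-dimensional projections.
import numpy as np

def procrustes_similarity(A, B):
    """Relative residual of A vs. B after centering and optimal rotation of B.
    Near 0 means the projections agree up to rotation/reflection."""
    A = A - A.mean(axis=0)
    B = B - B.mean(axis=0)
    U, _, Vt = np.linalg.svd(B.T @ A)   # solve min_R ||A - B R||, R orthogonal
    R = U @ Vt
    return np.linalg.norm(A - B @ R) / np.linalg.norm(A)

# A rotated copy of a projection is recognized as the same projection.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 2))
t = 0.7
Rot = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
print(procrustes_similarity(A, A @ Rot))            # ~0: same projection
print(procrustes_similarity(A, rng.normal(size=(50, 2))))  # large: unrelated
```

Evaluating this residual across a grid of regularization parameters is what produces the stability regions the paper calls Procrustes maps.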

  5. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  6. National Ignition Facility Control and Information System Operational Tools

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, C D; Beeler, R G; Bowers, G A; Carey, R W; Fisher, J M; Foxworthy, C B; Frazier, T M; Mathisen, D G; Lagin, L J; Rhodes, J J; Shaw, M J

    2009-10-08

The National Ignition Facility (NIF) in Livermore, California, is the world's highest-energy laser fusion system and one of the premier large scale scientific projects in the United States. The system is designed to set up and fire a laser shot to a fusion ignition or high energy density target at rates up to a shot every 4 hours. NIF has 192 laser beams delivering up to 1.8 MJ of energy to a ≈2 mm target that is planned to produce >100 billion atm of pressure and temperatures of >100 million degrees centigrade. NIF is housed in a ten-story building footprint the size of three football fields as shown in Fig. 1. Commissioning was recently completed and NIF will be formally dedicated at Lawrence Livermore National Laboratory on May 29, 2009. The control system has 60,000 hardware control points and employs 2 million lines of control system code. The control room has highly automated equipment setup prior to firing laser system shots. This automation has a data driven implementation that is conducive to dynamic modification and optimization depending on the shot goals defined by the end user experimenters. NIF has extensive facility machine history and infrastructure maintenance workflow tools both under development and deployed. An extensive operational tools suite has been developed to support facility operations including experimental shot setup, machine readiness, machine health and safety, and machine history. The following paragraphs discuss the current state and future upgrades to these four categories of operational tools.

  7. High Accuracy Nonlinear Control and Estimation for Machine Tool Systems

    DEFF Research Database (Denmark)

    Papageorgiou, Dimitrios

Component mass production has been the backbone of industry since the second industrial revolution, and machine tools are producing parts of widely varying size and design complexity. The ever-increasing level of automation in modern manufacturing processes necessitates the use of more sophisticated machine tool systems that are adaptable to different workspace conditions, while at the same time being able to maintain very narrow workpiece tolerances. The main topic of this thesis is to suggest control methods that can maintain required manufacturing tolerances despite moderate wear and tear. The purpose is to ensure that full accuracy is maintained between service intervals and to advise when overhaul is needed. The thesis argues that the quality of manufactured components is directly related to the positioning accuracy of the machine tool axes, and it shows which low-level control architectures...

  8. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  9. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  10. Modern tools to evaluate and optimize fire protection systems

    International Nuclear Information System (INIS)

    Alvares, N.J.; Hasegawa, H.K.

    1980-01-01

    Modern techniques, such as fault tree analysis, can be used to obtain engineering descriptions of specific fire protection systems. The analysis allows establishment of an optimum level of fire protection, and evaluates the level of protection provided by various systems. A prime example: the application to fusion energy experiments

  11. Systems Engineering-Based Tool for Identifying Critical Research Systems

    Science.gov (United States)

    Abbott, Rodman P.; Stracener, Jerrell

    2016-01-01

    This study investigates the relationship between the designated research project system independent variables of Labor, Travel, Equipment, and Contract total annual costs and the dependent variables of both the associated matching research project total annual academic publication output and thesis/dissertation number output. The Mahalanobis…

  12. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  13. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
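
The essence of the reduced-form approach, fitting one regression equation to the output of many simulation runs, can be sketched as follows; the explanatory variables and data here are mock stand-ins for E-CAT's threat characteristics and CGE outputs.

```python
# Fitting a reduced-form regression to synthetic simulation output (mock data).
import numpy as np

def fit_reduced_form(X, y):
    """OLS fit: X holds threat characteristics per run, y the simulated loss."""
    X1 = np.column_stack([np.ones(len(y)), X])      # prepend an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta                                     # [intercept, coefficients...]

def predict(beta, x):
    """Rapid-turnaround estimate for a new scenario x."""
    return beta[0] + np.dot(beta[1:], x)

# 200 mock "CGE runs" with three scenario variables and a known linear response.
rng = np.random.default_rng(2)
X = rng.uniform(size=(200, 3))
y = 1.0 + X @ np.array([2.0, -0.5, 3.0])
beta = fit_reduced_form(X, y)
print(beta)                     # recovers [1.0, 2.0, -0.5, 3.0]
print(predict(beta, X[0]))      # matches the simulated outcome for run 0
```

The fitted equation is the "readily comprehensible" surrogate: evaluating it takes microseconds, while rerunning the CGE model would not.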

  14. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and as part of a community's resilience to disasters. Risk perception, where present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in an emergency caused by natural hazards. Despite the profuse literature on risk perception, studies that portray this feature spatially are scarce. The spatial relationship to danger or hazard is increasingly recognised as an important factor in the risk equation; it can be used as a powerful tool either for better knowledge or for operational purposes (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better target educational activities to increase the preparedness of particularly vulnerable groups or clusters of households within a community. It can also help emergency personnel direct their actions optimally during an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been repeatedly exposed to mass movements and floods; in particular, a large event in 1987 affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is built around a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel, for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  15. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams could predict chatter by providing graphical representations of the stable combinations of the axial depth of the cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the bearing preload influence on the system stability with uncertainty taken into account.
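
The Monte Carlo treatment of preload uncertainty can be sketched roughly as follows. The stiffness and depth-limit relations below are simplified placeholders (a Hertzian-style preload-stiffness law and a linear depth scaling), not the article's Timoshenko-beam bearing model.

```python
# Monte Carlo sketch of how an uncertain bearing preload spreads into the
# predicted stability limit. The stiffness and depth relations are
# simplified placeholders, not the article's full spindle model.
import random, math

random.seed(1)

def critical_depth(preload_n):
    # Placeholder: contact stiffness grows roughly with preload^(1/3),
    # and the chatter-free depth limit scales with dynamic stiffness.
    stiffness = 1.0e7 * (preload_n / 500.0) ** (1 / 3)
    return stiffness / 2.0e7  # mm, illustrative scaling

# Sample an uncertain preload (mean 500 N, std 50 N) many times.
samples = [critical_depth(random.gauss(500.0, 50.0)) for _ in range(10_000)]
mean = sum(samples) / len(samples)
std = math.sqrt(sum((x - mean) ** 2 for x in samples) / len(samples))
print(f"depth limit: {mean:.3f} mm, spread: {std:.3f} mm")
```

The spread of the sampled depth limit is what a deterministic stability lobe diagram hides: with preload uncertainty, the safe depth of cut is a distribution, not a single curve.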

  16. Advances in geospatial analysis platforms and tools: Creating space for differentiated policy and investment responses

    CSIR Research Space (South Africa)

    Maritz, Johan

    2010-09-01

    Full Text Available Over the last 5 years a set of incremental advances within geospatial analysis platforms and tools developed by the CSIR's Planning Support Systems in collaboration with key stakeholders such as The Presidency, enabled a more nuanced regional level...

  17. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs arise. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  18. Overview and Demonstration of USEPA’s Risk-Informed Materials Management (RIMM) Tool System

    Science.gov (United States)

    The Risk-Informed Materials Management (RIMM) Tool System is a data gathering and analysis platform for conducting material disposal and beneficial use assessments. Users can evaluate risks to human and ecological receptors associated with exposures to organic and inorganic chemi...

  19. Controlling open quantum systems: Tools, achievements, and limitations

    OpenAIRE

    Koch, Christiane P.

    2016-01-01

    The advent of quantum devices, which exploit the two essential elements of quantum physics, coherence and entanglement, has sparked renewed interest in the control of open quantum systems. Successful implementations face the challenge to preserve the relevant nonclassical features at the level of device operation. A major obstacle is decoherence which is caused by interaction with the environment. Optimal control theory is a tool that can be used to identify control strategies in the presence...

  20. Retroviruses as tools to study the immune system.

    Science.gov (United States)

    Lois, C; Refaeli, Y; Qin, X F; Van Parijs, L

    2001-08-01

    Retrovirus-based vectors provide an efficient means to introduce and express genes in cells of the immune system and have become a popular tool to study immune function. They are easy to manipulate and provide stable, long-term gene expression because they integrate into the genome. Current retroviral vectors do have limitations that affect their usefulness in certain applications. However, recent advances suggest a number of ways in which these vectors might be improved to extend their utility in immunological research.

  1. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools...... will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis...... attacks, and attacks launched by insiders. Finally, the perspectives for the application of the analysis techniques are discussed, thereby coming a small step closer to providing developers with easy-to-use tools for validating the security of networking applications....

  2. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  3. Knowledge-based systems and interactive graphics for reactor control using the Automated Reasoning Tool(ART) system

    International Nuclear Information System (INIS)

    Ragheb, M.; Clayton, B.; Davies, P.

    1987-01-01

    The use of Knowledge-Based systems and advanced graphics concepts is described using the Automated Reasoning Tool (ART) for a model nuclear plant system. Through the use of asynchronous graphic input/output, the user communicates through a graphical display with a Production-Rule Analysis System modelling the plant while its rules are actively being fired. The user changes the status of system components by pointing at them on the system configuration display with a mouse cursor and clicking one of the mouse buttons. The Production-Rule Analysis System accepts the new input and immediately displays its diagnosis of the system state and any associated recommendations on the appropriate course of action. This approach offers a distinct advantage over typing component statuses in response to queries from a conventional Production-Rule Analysis system. Moreover, it combines two effective channels of communication between man and machine

  4. Digital Systems Analysis

    Science.gov (United States)

    Martin, Vance S.

    2009-01-01

    There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…

  5. VENTILATION TECHNOLOGY SYSTEMS ANALYSIS

    Science.gov (United States)

    The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...

  6. Database tools for enhanced analysis of TMX-U data. Revision 1

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  7. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  8. Risk analysis tools for force protection and infrastructure/asset protection

    International Nuclear Information System (INIS)

    Jaeger, C.D.; Duggan, R.A.; Paulus, W.K.

    1998-01-01

    The Security Systems and Technology Center at Sandia National Laboratories has for many years been involved in the development and use of vulnerability assessment and risk analysis tools. In particular, two of these tools, ASSESS and JTS, have been used extensively for Department of Energy facilities. Increasingly, Sandia has been called upon to evaluate critical assets and infrastructures, support DoD force protection activities and assist in the protection of facilities from terrorist attacks using weapons of mass destruction. Sandia is involved in many different activities related to security and force protection and is expanding its capabilities by developing new risk analysis tools to support a variety of users. One tool, in the very early stages of development, is EnSURE, Engineered Surety Using the Risk Equation. EnSURE addresses all of the risk equation and integrates the many components into a single, tool-supported process to help determine the most cost-effective ways to reduce risk. This paper will briefly discuss some of these risk analysis tools within the EnSURE framework
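
The risk equation that EnSURE integrates is commonly written in the physical-security literature as R = P_A x (1 - P_E) x C, where P_A is the likelihood of attack, P_E the probability that the protection system is effective, and C the consequence value. A minimal sketch for comparing candidate upgrades, with all numbers illustrative:

```python
# Sketch of the risk equation R = P_A * (1 - P_E) * C as commonly cited
# in physical-security analysis. All input values are illustrative.

def risk(p_attack, p_effective, consequence):
    return p_attack * (1.0 - p_effective) * consequence

baseline = risk(0.1, 0.60, 100.0)   # current protection system
upgraded = risk(0.1, 0.90, 100.0)   # candidate upgrade
print(round(baseline, 2), round(upgraded, 2))  # 4.0 1.0
```

Framing upgrades this way is what lets a tool rank them by risk reduction per dollar, i.e. find the most cost-effective ways to reduce risk.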

  9. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
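
At its core, statistical model checking estimates the probability that a property holds by sampling many independent runs of the system. A toy sketch follows; the stochastic "system" here is a trivial stand-in, not a Maude rewrite theory, and the runs are exactly the kind of independent work that PVeStA parallelizes.

```python
# Core of statistical model checking: estimate P(property) by Monte Carlo
# sampling of independent runs, with a 95% confidence half-width.
import random, math

random.seed(42)

def run_satisfies_property():
    # Stand-in stochastic system: the property holds ~70% of the time.
    return random.random() < 0.7

def estimate(n_runs):
    hits = sum(run_satisfies_property() for _ in range(n_runs))
    p = hits / n_runs
    half_width = 1.96 * math.sqrt(p * (1 - p) / n_runs)  # normal approx.
    return p, half_width

p, hw = estimate(10_000)
print(f"P(property) is about {p:.3f} within {hw:.3f}")
```

Because each run is independent, distributing them across workers gives near-linear speedup, which is the scalability result the paper reports.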

  10. Technique for Increasing Accuracy of Positioning System of Machine Tools

    Directory of Open Access Journals (Sweden)

    Sh. Ji

    2014-01-01

    Full Text Available The aim of this research is to improve the accuracy of the positioning and processing system by optimizing the pressure diagrams of the guides in machine tools. Machining quality is directly related to accuracy, which characterizes the combined impact of a machine's various errors. The accuracy of the positioning system is one of the most significant machining characteristics and allows the accuracy of processed parts to be evaluated. The literature notes that the working area of the machine layout is informative enough to characterize the effect of the positioning system on the macro-geometry of the part surfaces to be processed. To enhance the static accuracy of the studied machine, two groups of measures are possible in principle. One aims at decreasing the cutting-force component that produces the overturning moments on the slider. The other is related to changing the sizes of the guide facets, which may change their pressure profile. The study was based on mathematical modeling and optimization of the cutting-zone coordinates, yielding a formula for the surface pressure on the guides. The selected optimization parameters are the cutting-force vector and the dimensions of the slides and guides. The results show that optimizing the cutting-zone coordinates is necessary to increase processing accuracy. The research established that, to define the optimal coordinates of the cutting zone, the sizes of the slides and the values and coordinates of the applied forces must be varied until the pressure is equalized, improving the accuracy of the machine tool's positioning system. A force vector is applied at different points of the workspace, and pressure diagrams that account for changes in the positioning-system parameters are found; equalizing these pressure diagrams provides the greatest machine-tool accuracy.

  11. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze performance from terabytes or petabytes of workflow data, or from measurement data of executions of complex workflows spanning a large number of nodes and many parallel tasks. To help identify performance bottlenecks and debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods to the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and on the job logs from a genome-analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and big data workflows.
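
The ingest-and-extract step can be sketched in miniature: parse timing records out of raw log lines and reduce them to per-task summary features. The log format below is invented for illustration; the framework's real inputs are system logs and application measurements.

```python
# Sketch of log ingestion and feature extraction: parse timing records
# from raw log lines and compute a mean-elapsed feature per task.
# The log format is invented for illustration.
import re
from collections import defaultdict

LOG = """\
2016-09-17T10:00:01 task=ingest elapsed=1.20
2016-09-17T10:00:02 task=ingest elapsed=1.40
2016-09-17T10:00:03 task=reduce elapsed=3.10
2016-09-17T10:00:04 task=reduce elapsed=2.90
"""

pattern = re.compile(r"task=(\w+) elapsed=([\d.]+)")
timings = defaultdict(list)
for line in LOG.splitlines():
    m = pattern.search(line)
    if m:
        timings[m.group(1)].append(float(m.group(2)))

# Reduce raw records to one feature per task (mean elapsed time).
features = {task: round(sum(v) / len(v), 2) for task, v in timings.items()}
print(features)  # {'ingest': 1.3, 'reduce': 3.0}
```

At cluster scale the same map-then-reduce shape runs on a distributed engine rather than a single loop, but the feature-extraction logic is the same.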

  12. TTCScope - An analysis tool for the Trigger, Timing and Control signals of the LHC experiments

    OpenAIRE

    Moosavi, P; Ohm, C; Pauly, T

    2013-01-01

    This document describes a scope-based signal analysis tool for the Trigger, Timing and Control system of the LHC experiments. The TTC signal is digitized and read out using a digital sampling oscilloscope, and then analyzed in software. From the sampled signal, the BC clock is recovered along with the signal contents: level-1 accepts, commands, and trigger types. Two common use-cases in the ATLAS experiment are addressed: analysis of TTC signals and calibration of TTC crates. The latter inclu...

  13. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.

    Science.gov (United States)

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights into complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. We also highlight likely future trends in methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical relationships underlying in-silico systems biology models.

  14. A Data Management System for International Space Station Simulation Tools

    Science.gov (United States)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.
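
The kind of schema such a data management component might use can be sketched as a station-object table linked to a document table, so that a simulation user can attach and retrieve information per object. Table and column names below are invented, not the system's actual schema.

```python
# Sketch of a station-object/document schema for a simulation data
# management layer. Table and column names are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE station_object (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE document (id INTEGER PRIMARY KEY,
        object_id INTEGER REFERENCES station_object(id),
        title TEXT, body TEXT);
""")
db.execute("INSERT INTO station_object VALUES (1, 'S0 truss')")
db.execute("INSERT INTO document VALUES (1, 1, 'wiring diagram', '...')")

# Retrieve every document attached to a station object.
rows = db.execute("""
    SELECT o.name, d.title FROM document d
    JOIN station_object o ON o.id = d.object_id
""").fetchall()
print(rows)  # [('S0 truss', 'wiring diagram')]
```

Keeping the association in one join table is what lets users attach new information to an object without changing the schema.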

  15. Reliability review of the remote tool delivery system locomotor

    Energy Technology Data Exchange (ETDEWEB)

    Chesser, J.B.

    1999-04-01

    The locomotor being built by RedZone Robotics is designed to serve as a remote tool delivery (RTD) system for waste retrieval, tank cleaning, viewing, and inspection inside the high-level waste tanks 8D-1 and 8D-2 at West Valley Nuclear Services (WVNS). The RTD system is to be deployed through a tank riser. The locomotor portion of the RTD system is designed to be inserted into the tank and to be capable of moving around it by supporting itself on, and moving along, the tank's internal structural columns. The locomotor will serve as a mounting platform for a dexterous manipulator arm. The complete RTD system consists of the locomotor, dexterous manipulator arm, cameras, lights, cables, hoses, cable/hose management system, power supply, and operator control station.

  16. Ancillary services analysis of an offshore wind farm cluster-technical integration steps of a simulation tool

    OpenAIRE

    Hennig, Tobias; Löwer, Lothar; Faiella, Luis Mariano; Stock, Sebastian; Jansen, Malte; Hofmann, Lutz; Rohrig, Kurt

    2014-01-01

    In this publication, the authors present the methodology and example results for the analysis of ancillary services of an offshore wind farm cluster and its electrical power system. The operational tool Wind Cluster Management System (WCMS) is used as the simulation tool to evaluate certain planning scenarios. Emphasis is placed on two topics: 1) the integration of high-voltage direct-current (HVDC) technology into the WCMS, and 2) the ancillary-service analysis. As examples, voltage source converter b...

  17. Actinide isotopic analysis systems

    International Nuclear Information System (INIS)

    Koenig, Z.M.; Ruhter, W.D.; Gunnink, R.

    1990-01-01

    This manual provides instructions and procedures for using the Lawrence Livermore National Laboratory's two-detector actinide isotope analysis system to measure plutonium samples with other possible actinides (including uranium, americium, and neptunium) by gamma-ray spectrometry. The computer program that controls the system and analyzes the gamma-ray spectral data is driven by a menu of one-, two-, or three-letter options chosen by the operator. Provided in this manual are descriptions of these options and their functions, plus detailed instructions (operator dialog) for choosing among the options. Also provided are general instructions for calibrating the actinide isotopic analysis system and for monitoring its performance. The inventory measurement of a sample's total plutonium and other actinides content is determined by two nondestructive measurements. One is a calorimetry measurement of the sample's heat or power output, and the other is a gamma-ray spectrometry measurement of its relative isotopic abundances. The isotopic measurements needed to interpret the observed calorimetric power measurement are the relative abundances of various plutonium and uranium isotopes and americium-241. The actinide analysis system carries out these measurements. 8 figs
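
The way the two nondestructive measurements combine can be sketched numerically: the gamma-derived isotopic fractions give an effective specific power Peff, and the total mass follows as measured power divided by Peff. The specific-power constants below are representative published values (W/g); the abundances and measured power are illustrative.

```python
# Sketch of calorimetry + gamma spectrometry: total mass = power / Peff,
# where Peff is the abundance-weighted effective specific power.
# Specific powers are representative published constants (W/g); the
# abundances and measured power are illustrative only.
SPECIFIC_POWER = {
    "Pu238": 0.5674, "Pu239": 0.0019288, "Pu240": 0.0070824,
    "Pu241": 0.003412, "Pu242": 0.0001159, "Am241": 0.1142,
}

def effective_specific_power(mass_fractions):
    return sum(f * SPECIFIC_POWER[iso] for iso, f in mass_fractions.items())

# Gamma-spectrometry result (illustrative mass fractions):
fractions = {"Pu238": 0.0002, "Pu239": 0.935, "Pu240": 0.060,
             "Pu241": 0.004, "Pu242": 0.0008, "Am241": 0.0}
peff = effective_specific_power(fractions)
grams_pu = 0.00253 / peff  # measured calorimeter power of 2.53 mW
print(f"Peff = {peff:.6f} W/g -> {grams_pu:.2f} g Pu")
```

This is why the gamma measurement only needs relative abundances: the absolute scale comes entirely from the calorimeter.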

  18. Tabletop exercise as a tool of evaluating physical protection system

    International Nuclear Information System (INIS)

    Matsuzawa, Reina

    2014-01-01

    Evaluation of a designed and implemented physical protection system (PPS) is essential for ensuring its effectiveness. In Japan, nuclear facility operators, which are required to assess PPS performance, have conducted performance tests of PPS elements and periodic training, as well as annual PPS exercises with relevant agencies. In addition to these practical, field-based efforts, non-field tools for evaluating PPS effectiveness, such as the tabletop exercise (TTX), can be utilized, as they are in the USA and other countries. This paper discusses the potential advantages of TTX as a tool for evaluating PPS effectiveness, comparing the characteristics of TTX with field evaluations and identifying cases where operators would especially benefit from TTX. (author)

  19. Network systems security analysis

    Science.gov (United States)

    Yilmaz, İsmail

    2015-05-01

    Network systems security analysis is of utmost importance in today's world. Many companies, such as banks that give priority to data management, test their data security systems with penetration tests from time to time. In this context, companies must also test their network and server systems and take precautions, as data security draws increasing attention. Based on this idea, this study researches cyber-attacks thoroughly and examines penetration-testing techniques. With this information, cyber-attacks are classified and network systems' security is then tested systematically. After the testing period, all data are reported and filed for future reference. It was found that human beings are the weakest link in the chain, and that simple mistakes may unintentionally cause huge problems. Thus, it is clear that precautions, such as keeping security software up to date, must be taken to avoid such threats.

  20. Architecture for the Next Generation System Management Tools

    Energy Technology Data Exchange (ETDEWEB)

    Gallard, Jerome [INRIA/IRISA; Lebre, I Adrien [INRIA/IRISA; Morin, Christine [INRIA/IRISA; Naughton, III, Thomas J [ORNL; Scott, Stephen L [ORNL; Vallee, Geoffroy R [ORNL

    2011-01-01

    To get more results or greater accuracy, computational scientists execute their applications on distributed computing platforms such as Clusters, Grids and Clouds. These platforms differ in terms of hardware and software resources as well as locality: some span multiple sites and multiple administrative domains whereas others are limited to a single site/domain. As a consequence, in order to scale their applications up, the scientists have to manage technical details for each target platform. From our point of view, this complexity should be hidden from the scientists who, in most cases, would prefer to focus on their research rather than spending time dealing with platform configuration concerns. In this article, we advocate for a system management framework that aims to automatically set up the whole run-time environment according to the applications' needs. The main difference with regard to usual approaches is that they generally focus only on the software layer, whereas we address both the hardware and the software expectations through a unique system. For each application, scientists describe their requirements through the definition of a Virtual Platform (VP) and a Virtual System Environment (VSE). Relying on the VP/VSE definitions, the framework is in charge of: (i) the configuration of the physical infrastructure to satisfy the VP requirements, (ii) the setup of the VP, and (iii) the customization of the execution environment (VSE) upon the former VP. We propose a new formalism that the system can rely upon to successfully perform each of these three steps without burdening the user with the specifics of the configuration for the physical resources and system management tools. This formalism leverages Goldberg's theory for recursive virtual machines by introducing new concepts based on system virtualization (identity, partitioning, aggregation) and emulation (simple, abstraction). This enables the definition of complex VP/VSE configurations
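
A toy rendering of the VP/VSE idea, with invented field names: the user declares hardware and software expectations, and the framework is responsible for satisfying them.

```python
# Toy sketch of declarative VP/VSE definitions. Field names are invented
# for illustration; the paper's formalism is far richer.
from dataclasses import dataclass, field

@dataclass
class VirtualPlatform:            # hardware expectations
    nodes: int
    cores_per_node: int
    memory_gb: int

@dataclass
class VirtualSystemEnvironment:   # software expectations
    os_image: str
    packages: list = field(default_factory=list)

@dataclass
class Application:
    name: str
    vp: VirtualPlatform
    vse: VirtualSystemEnvironment

app = Application(
    name="climate-sim",
    vp=VirtualPlatform(nodes=16, cores_per_node=8, memory_gb=32),
    vse=VirtualSystemEnvironment(os_image="linux-hpc", packages=["mpi"]),
)
print(app.vp.nodes * app.vp.cores_per_node)  # 128 total cores requested
```

The point of the declarative form is that the same application description can be mapped onto a cluster, a grid, or a cloud without the scientist touching platform-specific configuration.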

  1. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  2. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, turboexpander and compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction in the plant, considering constraints on other parameters. The analysis results give a clear idea of the parameter values to decide before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
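The liquefaction fraction that such an analysis optimizes follows from a simple energy balance around the heat exchanger, expansion valve and separator of an ideal Linde-Hampson cycle. A minimal sketch, where the enthalpy values are illustrative placeholders rather than Aspen HYSYS output:

```python
# Ideal liquid yield of a Linde-Hampson cycle from an energy balance
# around the heat exchanger, JT valve and phase separator:
#   y = (h1 - h2) / (h1 - hf)
# h1: low-pressure gas leaving the cycle at ambient temperature
# h2: high-pressure gas entering after compression and aftercooling
# hf: saturated liquid drawn off the separator
# The numbers below are illustrative placeholders, not simulator output.

def linde_yield(h1, h2, hf):
    """Mass fraction of feed gas liquefied per pass (between 0 and 1)."""
    return (h1 - h2) / (h1 - hf)

y = linde_yield(h1=460.0, h2=430.0, hf=92.0)  # kJ/kg, arbitrary reference
print(f"liquid yield per pass: {y:.3f}")
```

Raising the compressor discharge pressure lowers h2 and so raises the yield, which is exactly the kind of trade-off a flowsheet simulator explores against compressor work.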

  3. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  4. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: a.halog@uq.edu.au [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)

    2017-12-20

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of system thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  5. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.

    2017-01-01

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of system thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  6. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available This article analyses the reliability and verification problems shared by widespread electric power system (EPS) simulation tools, all of which rely on numerical methods for ordinary differential equations. It describes the concept of guaranteed verification of EPS simulation tools and a structure for its realization, based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The Simulator's output can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can then be applied as the reference model for verifying any EPS simulation tool.
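Why the choice of numerical method matters for verification can be seen on a toy undamped oscillation, a crude stand-in for electromechanical swing dynamics: forward Euler injects spurious energy every step, while a fourth-order Runge-Kutta scheme tracks the exact solution closely. This is a generic illustration, not the Simulator described in the article:

```python
import math

# Integrate x'' = -w^2 x (exact solution: cos(w t)) with two schemes
# and compare the error after ten full periods.
def step_euler(x, v, w, dt):
    return x + dt * v, v - dt * w * w * x

def step_rk4(x, v, w, dt):
    def f(state):
        x_, v_ = state
        return v_, -w * w * x_
    k1 = f((x, v))
    k2 = f((x + dt / 2 * k1[0], v + dt / 2 * k1[1]))
    k3 = f((x + dt / 2 * k2[0], v + dt / 2 * k2[1]))
    k4 = f((x + dt * k3[0], v + dt * k3[1]))
    return (x + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            v + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

w, dt, steps = 2 * math.pi, 0.01, 1000   # ten periods of a 1 Hz swing
xe, ve = 1.0, 0.0
xr, vr = 1.0, 0.0
for _ in range(steps):
    xe, ve = step_euler(xe, ve, w, dt)
    xr, vr = step_rk4(xr, vr, w, dt)
exact = math.cos(w * dt * steps)          # cos(20*pi) = 1.0
print(abs(xe - exact), abs(xr - exact))   # Euler error dwarfs RK4 error
```

Both runs use the same step size; only the scheme differs, which is precisely the kind of tool-dependent discrepancy a guaranteed-accuracy reference model is meant to expose.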

  7. Disposal systems evaluations and tool development : Engineered Barrier System (EBS) evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Rutqvist, Jonny (LBNL); Liu, Hui-Hai (LBNL); Steefel, Carl I. (LBNL); Serrano de Caro, M. A. (LLNL); Caporuscio, Florie Andre (LANL); Birkholzer, Jens T. (LBNL); Blink, James A. (LLNL); Sutton, Mark A. (LLNL); Xu, Hongwu (LANL); Buscheck, Thomas A. (LLNL); Levy, Schon S. (LANL); Tsang, Chin-Fu (LBNL); Sonnenthal, Eric (LBNL); Halsey, William G. (LLNL); Jove-Colon, Carlos F.; Wolery, Thomas J. (LLNL)

    2011-01-01

    Key components of the nuclear fuel cycle are short-term storage and long-term disposal of nuclear waste. The latter encompasses the immobilization of used nuclear fuel (UNF) and radioactive waste streams generated by various phases of the nuclear fuel cycle, and the safe and permanent disposition of these waste forms in geological repository environments. The engineered barrier system (EBS) plays a very important role in the long-term isolation of nuclear waste in geological repository environments. EBS concepts and their interactions with the natural barrier are inherently important to the long-term performance assessment of the safety case where nuclear waste disposition needs to be evaluated for time periods of up to one million years. Making the safety case needed in the decision-making process for the recommendation and the eventual embracement of a disposal system concept requires a multi-faceted integration of knowledge and evidence-gathering to demonstrate the required confidence level in a deep geological disposal site and to evaluate long-term repository performance. The focus of this report is the following: (1) Evaluation of EBS in long-term disposal systems in deep geologic environments with emphasis on the multi-barrier concept; (2) Evaluation of key parameters in the characterization of EBS performance; (3) Identification of key knowledge gaps and uncertainties; and (4) Evaluation of tools and modeling approaches for EBS processes and performance. The above topics will be evaluated through the analysis of the following: (1) Overview of EBS concepts for various NW disposal systems; (2) Natural and man-made analogs, room chemistry, hydrochemistry of deep subsurface environments, and EBS material stability in near-field environments; (3) Reactive Transport and Coupled Thermal-Hydrological-Mechanical-Chemical (THMC) processes in EBS; and (4) Thermal analysis toolkit, metallic barrier degradation mode survey, and development of a Disposal Systems

  8. Disposal systems evaluations and tool development: Engineered Barrier System (EBS) evaluation

    International Nuclear Information System (INIS)

    Rutqvist, Jonny; Liu, Hui-Hai; Steefel, Carl I.; Serrano de Caro, M.A.; Caporuscio, Florie Andre; Birkholzer, Jens T.; Blink, James A.; Sutton, Mark A.; Xu, Hongwu; Buscheck, Thomas A.; Levy, Schon S.; Tsang, Chin-Fu; Sonnenthal, Eric; Halsey, William G.; Jove-Colon, Carlos F.; Wolery, Thomas J.

    2011-01-01

    Key components of the nuclear fuel cycle are short-term storage and long-term disposal of nuclear waste. The latter encompasses the immobilization of used nuclear fuel (UNF) and radioactive waste streams generated by various phases of the nuclear fuel cycle, and the safe and permanent disposition of these waste forms in geological repository environments. The engineered barrier system (EBS) plays a very important role in the long-term isolation of nuclear waste in geological repository environments. EBS concepts and their interactions with the natural barrier are inherently important to the long-term performance assessment of the safety case where nuclear waste disposition needs to be evaluated for time periods of up to one million years. Making the safety case needed in the decision-making process for the recommendation and the eventual embracement of a disposal system concept requires a multi-faceted integration of knowledge and evidence-gathering to demonstrate the required confidence level in a deep geological disposal site and to evaluate long-term repository performance. The focus of this report is the following: (1) Evaluation of EBS in long-term disposal systems in deep geologic environments with emphasis on the multi-barrier concept; (2) Evaluation of key parameters in the characterization of EBS performance; (3) Identification of key knowledge gaps and uncertainties; and (4) Evaluation of tools and modeling approaches for EBS processes and performance. The above topics will be evaluated through the analysis of the following: (1) Overview of EBS concepts for various NW disposal systems; (2) Natural and man-made analogs, room chemistry, hydrochemistry of deep subsurface environments, and EBS material stability in near-field environments; (3) Reactive Transport and Coupled Thermal-Hydrological-Mechanical-Chemical (THMC) processes in EBS; and (4) Thermal analysis toolkit, metallic barrier degradation mode survey, and development of a Disposal Systems

  9. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.

  10. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.
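The flavor of such a schema extension can be sketched with an in-memory SQLite table. The table and column names below are hypothetical stand-ins in the CSS3.0 style, not the actual infrapy tables defined in the document:

```python
import sqlite3

# Hypothetical infrasound detection-results table in a CSS3.0-like style;
# illustrative only, not the schema this document defines.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE fd_results (
        net     TEXT NOT NULL,   -- network code
        sta     TEXT NOT NULL,   -- reference station
        timeini REAL NOT NULL,   -- detection window start (epoch seconds)
        timeend REAL NOT NULL,   -- detection window end (epoch seconds)
        fval    REAL,            -- detector F-statistic
        bazim   REAL,            -- back-azimuth estimate (degrees)
        trvel   REAL             -- trace velocity estimate (m/s)
    )
""")
conn.execute(
    "INSERT INTO fd_results VALUES ('IM', 'I57US', 0.0, 60.0, 12.5, 231.0, 340.0)"
)
row = conn.execute("SELECT sta, bazim FROM fd_results").fetchone()
print(row)  # ('I57US', 231.0)
```

In a production deployment the same DDL would target ORACLE, as the document's internal-format definitions indicate, with foreign keys back to the core CSS3.0/KB tables.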

  11. Software Tools for the Analysis of Functional Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Mehdi Behroozi

    2012-09-01

    Full Text Available Functional magnetic resonance imaging (fMRI) has become the most popular method for imaging of brain functions. Currently, there is a large variety of software packages for the analysis of fMRI data, each providing many features for users. Since there is no single package that can provide all the necessary analyses for the fMRI data, it is helpful to know the features of each software package. In this paper, several software tools have been introduced and they have been evaluated for comparison of their functionality and their features. The description of each program has been discussed and summarized.

  12. Software Tools for the Analysis of functional Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Mehdi Behroozi

    2012-12-01

    Full Text Available Functional magnetic resonance imaging (fMRI) has become the most popular method for imaging of brain functions. Currently, there is a large variety of software packages for the analysis of fMRI data, each providing many features for users. Since there is no single package that can provide all the necessary analyses for the fMRI data, it is helpful to know the features of each software package. In this paper, several software tools have been introduced and they have been evaluated for comparison of their functionality and their features. The description of each program has been discussed and summarized.

  13. Management tools for distributed control system in KSTAR

    International Nuclear Information System (INIS)

    Sangil Lee; Jinseop Park; Jaesic Hong; Mikyung Park; Sangwon Yun

    2012-01-01

    The integrated control system of the Korea Superconducting Tokamak Advanced Research (KSTAR) has been developed with distributed control systems based on Experimental Physics and Industrial Control System (EPICS) middle-ware. It plays the essential role of enabling remote operation, supervision of the tokamak device, and conduct of plasma experiments without any interruption. Therefore, the availability of the control system directly impacts the entire device performance. For the non-interrupted operation of the KSTAR control system, we have developed a tool named Control System Monitoring (CSM) to monitor the resources of EPICS Input/Output Controller (IOC) servers (utilization of memory, cpu, disk, network, user-defined processes and system-defined processes), the soundness of storage systems (storage utilization, storage status), the status of network switches using Simple Network Management Protocol (SNMP), the network connection status of every local control server using Internet Control Message Protocol (ICMP), and the operating environment of the main control room and the computer room (temperature, humidity, electricity) in real time. When abnormal conditions or faults are detected, the CSM raises abnormal or fault alarms to operators. In particular, if a critical fault related to data storage occurs, the CSM sends a short message to the operator's mobile phone. The operators can then quickly resolve the problem according to the emergency procedure. As a result of this process, KSTAR was able to perform continuous operation and experiments without interruption for 4 months.
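The per-resource threshold checks at the heart of such a monitor can be sketched with the standard library alone. The thresholds, alarm levels, and escalation actions below are assumptions for illustration, not KSTAR's actual CSM configuration:

```python
import shutil

# Minimal sketch of a monitor's per-resource threshold check; the
# thresholds and alarm levels here are illustrative assumptions.
def disk_alarm(used_fraction, warn=0.80, fault=0.95):
    """Map a disk utilisation fraction to an alarm level."""
    if used_fraction >= fault:
        return "FAULT"    # e.g. escalate via SMS to the operator
    if used_fraction >= warn:
        return "WARNING"  # e.g. raise an alarm on the operator console
    return "OK"

total, used, free = shutil.disk_usage("/")
print(f"root volume: {disk_alarm(used / total)}")
```

A real monitor would run checks like this on a timer for every IOC server and feed the results into the alarm display; SNMP and ICMP probes would follow the same pattern with different measurement sources.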

  14. EMGTools, an adaptive and versatile tool for detailed EMG analysis

    DEFF Research Database (Denmark)

    Nikolic, M; Krarup, C

    2010-01-01

    We have developed an EMG decomposition system called EMGTools that can extract the constituent MUAPs and firing patterns for quantitative analysis from the EMG signal recorded at slight effort for clinical evaluation. The aim was to implement a robust system able to handle the challenges...

  15. Smart roadside initiative macro benefit analysis : user's guide for the benefit-cost analysis tool.

    Science.gov (United States)

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of : various new transportation technologies at a State level and to provide results that could support technology adoption : by a State ...

  16. System Dynamics in Medical Education: A Tool for Life

    Science.gov (United States)

    Rubin, David M.; Richards, Christopher L.; Keene, Penelope A. C.; Paiker, Janice E.; Gray, A. Rosemary T.; Herron, Robyn F. R.; Russell, Megan J.; Wigdorowitz, Brian

    2012-01-01

    A course in system dynamics has been included in the first year of our university's six-year medical curriculum. System Dynamics is a discipline that facilitates the modelling, simulation and analysis of a wide range of problems in terms of two fundamental concepts viz. rates and levels. Many topics encountered in the medical school curriculum,…
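The "rates and levels" pairing can be shown in miniature with a one-compartment drug-elimination model, where the level (drug amount in plasma) changes at a rate proportional to itself. This is a generic illustration of the modelling style, not material from the course; the parameter values are invented:

```python
# System dynamics in miniature: a level (plasma drug amount) driven by a
# rate (first-order elimination). Parameter values are illustrative only.
def simulate(level=100.0, k_elim=0.1, dt=0.5, hours=24):
    """Euler-integrate dLevel/dt = -k_elim * Level; return the trajectory."""
    levels = [level]
    for _ in range(int(hours / dt)):
        rate = -k_elim * levels[-1]   # outflow rate depends on the level
        levels.append(levels[-1] + dt * rate)
    return levels

traj = simulate()
print(f"level after 24 h: {traj[-1]:.1f} units")
```

Swapping the single compartment for several coupled levels (gut, plasma, tissue) turns the same five lines into the kind of physiological model a medical student can build and interrogate.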

  17. [Scientific information systems: tools for measures of biomedical research impact].

    Science.gov (United States)

    Navarrete Cortés, José; Banqueri Ozáez, Jesús

    2008-12-01

    The present article provides an analysis and description of the use of scientific information systems as instruments to measure and monitor results and investigative activity in biomedicine. Based on the current situation concerning the use and implementation of these systems, we offer a detailed description of the actors of these systems and propose a functional architecture for this class of software. In addition, the instruments that these types of systems offer for the measurement of the impact of the results of research are described in depth, as these instruments can help in decision making. Finally, a selection of national and international scientific information systems are listed and reviewed.

  18. Applicability of Performance Assessment Tools to Marine Corps Air Ground Task Force C4 System of Systems Performance Assessment

    Science.gov (United States)

    2010-09-01

    applications, SAVE conducts static analysis of software systems written in a variety of languages (C/C++, Java, Delphi, Simulink, and others) to...commercial software tools such as SolarWinds' Orion Network Performance Monitor associated with the Joint Network Management System (JNMS) program of...from http://www.incose.org/hra/past_events/IEEE2009_SoSEandFoSE_JOC_090223.pdf Cloninger, J. (2009). Marine corps cyberspace in support of MAGTF C2

  19. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use.
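The "sequence identity" that the superposition method de-emphasises is itself a simple quantity. A minimal, self-contained sketch (not Chimera code) of computing percent identity over an existing gapped alignment:

```python
# Percent identity over two pre-aligned, gapped sequences of equal
# length; gap-paired columns are excluded from the denominator.
def percent_identity(seq_a, seq_b):
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    matches = sum(a == b for a, b in pairs)
    return 100.0 * matches / len(pairs)

# Toy alignment: 6 identities over 7 aligned residue pairs.
print(f"{percent_identity('MKT-AYIAK', 'MKVQAY-AK'):.1f}% identity")
```

Structure-based matching becomes valuable exactly when this number drops into the "twilight zone" where sequence alone no longer determines homology.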

  20. DART system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Boggs, Paul T.; Altshuler, Alan (Exagrid Engineering); Larzelere, Alex R. (Exagrid Engineering); Walsh, Edward J.; Clay, Robert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
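The trade-off among once-through time, iteration probability, and rework fraction can be made concrete with a toy expected-time model. The formula and the numbers below are illustrative assumptions, not the DART community model or its data:

```python
# Toy model: a step with once-through time t is redone with probability
# p; each redo costs a fraction r of t. With geometric iterations the
# expected number of redos is p / (1 - p), so
#   E[T] = t * (1 + r * p / (1 - p))
# All numbers are invented for illustration.
def expected_time(t_once, p_iter, rework):
    return t_once * (1.0 + rework * p_iter / (1.0 - p_iter))

base = expected_time(10.0, 0.4, 0.6)           # baseline: 14.0 hours
print(base)
print(expected_time(8.0, 0.4, 0.6))            # cut once-through time 20%
print(expected_time(10.0, 0.32, 0.6))          # cut iteration prob. 20%
print(expected_time(10.0, 0.4, 0.48))          # cut rework fraction 20%
```

Comparing the three 20% improvements shows the rework-fraction lever delivering a smaller (though still substantial) share of the benefit than the other two, in the spirit of the report's 40-80% finding.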

  1. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
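One of the "risk calculational engines" named above, the attack tree, reduces to a short recursion: an OR node succeeds if any child attack does, an AND node only if every child does, and risk is the success probability times the loss. The tree, probabilities, and loss figure below are invented for illustration, not taken from the prototype:

```python
# Attack-tree risk sketch. Nodes are ("leaf", p), ("AND", [children]),
# or ("OR", [children]). All probabilities and the loss are invented.
def p_success(node):
    kind, payload = node
    if kind == "leaf":
        return payload                      # payload is the probability
    probs = [p_success(child) for child in payload]
    if kind == "AND":                       # every child must succeed
        prod = 1.0
        for p in probs:
            prod *= p
        return prod
    fail = 1.0                              # OR: fails only if all fail
    for p in probs:
        fail *= (1.0 - p)
    return 1.0 - fail

tree = ("OR", [
    ("AND", [("leaf", 0.3), ("leaf", 0.5)]),  # phish operator AND pivot
    ("leaf", 0.1),                            # direct exploit of the HMI
])
p = p_success(tree)
risk = p * 2_000_000                          # hypothetical loss, dollars
print(f"P(success) = {p:.3f}, risk = ${risk:,.0f}")
```

Rerunning the same tree with a mitigation (say, the phishing leaf dropped to 0.05) and differencing the two risk figures is exactly the risk *reduction* estimate the tool is meant to produce.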

  2. CAE tools for verifying high-performance digital systems

    Science.gov (United States)

    Rubin, Lawrence M.

    1991-04-01

    The high signal density, clock rates, and output slew rates associated with state-of-the-art digital logic render design verification of modern high-performance digital systems increasingly difficult and critical. The electrical and mechanical design of these circuits must be analyzed in conjunction with the analysis and simulation of the underlying logic design to result in operational and manufacturable systems. This paper will describe several useful techniques relevant to verification of such systems, including static timing verification, transmission line simulation, and crosstalk analysis. The principles underlying these techniques will be discussed and results for sample algorithms will be given.

  3. Energy Signal Tool for Decision Support in Building Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Henze, G. P.; Pavlak, G. S.; Florita, A. R.; Dodier, R. H.; Hirsch, A. I.

    2014-12-01

    A prototype energy signal tool is demonstrated for operational whole-building and system-level energy use evaluation. The purpose of the tool is to give a summary of building energy use that allows a building operator to quickly distinguish normal from abnormal energy use. Toward that end, energy use status is displayed as a traffic light, which is a visual metaphor for energy use that is either substantially different from expected (red and yellow lights) or approximately the same as expected (green light). Which light to display for a given energy end use is determined by comparing expected to actual energy use. Because expected energy use is necessarily uncertain, we cannot choose the appropriate light with certainty. Instead, the energy signal tool chooses the light by minimizing the expected cost of displaying the wrong light. The expected energy use is represented by a probability distribution. Energy use is modeled by a low-order lumped-parameter model. Uncertainty in energy use is quantified by a Monte Carlo exploration of the influence of model parameters on energy use. Distributions over model parameters are updated over time via Bayes' theorem. The simulation study was devised to assess whole-building energy signal accuracy in the presence of uncertainty and faults at the submetered level, which may lead to tradeoffs at the whole-building level that are not detectable without submetering.
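The expected-cost decision rule can be sketched in a few lines: given a probability distribution over the true energy-use state, display the light whose expected penalty for being wrong is smallest. The states, probabilities, and cost matrix below are illustrative assumptions, not the tool's actual configuration:

```python
# Choose the traffic light minimizing expected display cost.
# COST[light][state] is the penalty for showing `light` when `state`
# is the true condition; all numbers are invented for illustration.
LIGHTS = ["green", "yellow", "red"]
COST = {
    "green":  {"normal": 0, "mild": 4, "severe": 10},
    "yellow": {"normal": 1, "mild": 0, "severe": 5},
    "red":    {"normal": 5, "mild": 2, "severe": 0},
}

def choose_light(p_state):
    expected = {
        light: sum(p * COST[light][s] for s, p in p_state.items())
        for light in LIGHTS
    }
    return min(expected, key=expected.get)

print(choose_light({"normal": 0.9, "mild": 0.08, "severe": 0.02}))  # green
print(choose_light({"normal": 0.2, "mild": 0.3, "severe": 0.5}))    # red
```

The asymmetric costs encode the operator's preference, for example that missing a severe fault is worse than a false alarm, which is why a broad posterior can flip the display to yellow even when "normal" is the single most likely state.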

  4. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    Science.gov (United States)

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
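The core query, weighing expert criticality scores against mission-specific event probabilities, can be sketched as a simple ranking. The resources, scores, and probabilities below are invented stand-ins, not repository data from the study:

```python
# Rank candidate medical resources by criticality weighted by the
# probability the condition they address arises during the mission.
# All names and numbers are invented for illustration.
resources = [
    # (resource, criticality 1-5, P(condition arises during mission))
    ("IV fluid kit",        5, 0.30),
    ("portable ultrasound", 4, 0.10),
    ("suture set",          3, 0.45),
    ("dental kit",          2, 0.05),
]

ranked = sorted(resources, key=lambda r: r[1] * r[2], reverse=True)
for name, crit, prob in ranked:
    print(f"{name:20s} weighted utility = {crit * prob:.2f}")
```

Under real mass and volume constraints the same ranking would feed a knapsack-style selection, trading weighted utility against each resource's launch cost.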

  5. Nanocoatings for High-Efficiency Industrial Hydraulic and Tooling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Clifton B. Higdon III

    2011-01-07

    Industrial manufacturing in the U.S. accounts for roughly one third of the 98 quadrillion Btu total energy consumption. Motor system losses amount to 1.3 quadrillion Btu, which represents the largest proportional loss of any end-use category, while pumps alone represent over 574 trillion BTU (TBTU) of energy loss each year. The efficiency of machines with moving components is a function of the amount of energy lost to heat because of friction between contacting surfaces. The friction between these interfaces also contributes to downtime and the loss of productivity through component wear and subsequent repair. The production of new replacement parts requires additional energy. Among efforts to reduce energy losses, wear-resistant, low-friction coatings on rotating and sliding components offer a promising approach that is fully compatible with existing equipment and processes. In addition to lubrication, one of the most desirable solutions is to apply a protective coating or surface treatment to rotating or sliding components to reduce their friction coefficients, thereby leading to reduced wear. Historically, a number of materials such as diamond-like carbon (DLC), titanium nitride (TiN), titanium aluminum nitride (TiAlN), and tungsten carbide (WC) have been examined as tribological coatings. The primary objective of this project was the development of a variety of thin film nanocoatings, derived from the AlMgB14 system, with a focus on reducing wear and friction in both industrial hydraulics and cutting tool applications. Proof-of-concept studies leading up to this project had shown that the constituent phases, AlMgB14 and TiB2, were capable of producing low-friction coatings by pulsed laser deposition. These coatings combine high hardness with a low friction coefficient, and were shown to substantially reduce wear in laboratory tribology tests. Selection of the two applications was based largely on the concept of improved mechanical interface efficiencies for

  6. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding sites (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli and, on the other hand, the capability to improve TFBS prediction in microbes. Finally, we highlight, by visualisation of multivariate techniques, the interplay between position and sequence information for effective transcription regulation.
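The periodicity analysis the abstract describes can be sketched minimally (this is an illustration of the idea, not GREAT:SCAN's actual algorithm): score a candidate period by the mean resultant length of gene positions folded onto a circle of that period, so perfectly periodic positions score 1.

```python
import math

def period_score(positions, period):
    """Mean resultant length of positions folded onto a circle of the given
    period: 1.0 for perfectly periodic placement, near 0 for random spread."""
    angles = [2 * math.pi * (x % period) / period for x in positions]
    c = sum(math.cos(a) for a in angles) / len(angles)
    s = sum(math.sin(a) for a in angles) / len(angles)
    return math.hypot(c, s)

def best_period(positions, candidate_periods):
    """Pick the candidate period that best explains the gene positions."""
    return max(candidate_periods, key=lambda p: period_score(positions, p))
```

For gene starts at 0, 100, 200, 300 and 400 kb, `best_period` over candidates 70, 100 and 130 returns 100, since every position folds onto the same angle.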

  7. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone Java application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
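As a toy illustration of the dosage-estimation step described above (a sketch of the common approach, not CGHPRO's code), raw test/reference intensities can be converted to median-centred log2 ratios and thresholded into loss/neutral/gain calls:

```python
import math

def center_log_ratios(test, ref):
    """Median-centre the log2 test/reference intensity ratios (one common
    global normalisation; CGHPRO lets the user choose among several)."""
    ratios = [math.log2(t / r) for t, r in zip(test, ref)]
    ordered = sorted(ratios)
    n = len(ordered)
    median = ordered[n // 2] if n % 2 else (ordered[n // 2 - 1] + ordered[n // 2]) / 2
    return [r - median for r in ratios]

def call_dosage(ratios, threshold=0.3):
    """Classify each clone as loss/neutral/gain by a fixed log2-ratio cutoff
    (the 0.3 cutoff is illustrative, not a CGHPRO default)."""
    return ["gain" if r > threshold else "loss" if r < -threshold else "neutral"
            for r in ratios]
```

Real pipelines then segment these centred ratios, e.g. with Circular Binary Segmentation, rather than thresholding clone by clone.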

  8. The Planetary Data System (PDS) Data Dictionary Tool (LDDTool)

    Science.gov (United States)

    Raugh, Anne C.; Hughes, John S.

    2017-10-01

    One of the major design goals of the PDS4 development effort was to provide an avenue for discipline specialists and large data preparers such as mission archivists to extend the core PDS4 Information Model (IM) to include metadata definitions specific to their own contexts. This capability is critical for the Planetary Data System - an archive that deals with a data collection that is diverse along virtually every conceivable axis. Amid such diversity, it is in the best interests of the PDS archive and its users that all extensions to the core IM follow the same design techniques, conventions, and restrictions as the core implementation itself. Notwithstanding, expecting every mission or discipline archivist seeking to define metadata for a new context to acquire expertise in information modeling, model-driven design, ontology, schema formulation, and PDS4 design conventions and philosophy is unrealistic, to say the least. To bridge that expertise gap, the PDS Engineering Node has developed the data dictionary creation tool known as “LDDTool”. This tool incorporates the same software used to maintain and extend the core IM, packaged with an interface that enables a developer to create a contextual information model using the same, open standards-based metadata framework PDS itself uses. Through this interface, the novice dictionary developer has immediate access to the common set of data types and unit classes for defining attributes, and a straightforward method for constructing classes. The more experienced developer, using the same tool, has access to more sophisticated modeling methods like abstraction and extension, and can define very sophisticated validation rules. We present the key features of the PDS Local Data Dictionary Tool, which both supports the development of extensions to the PDS4 IM and ensures their compatibility with the IM.

  9. Bibliological analysis: security tool in collections of rare books

    Directory of Open Access Journals (Sweden)

    Raphael Diego Greenhalgh

    2015-04-01

    Full Text Available The historical, cultural and economic value of rare books makes them frequent targets of robbery and theft. The institutions that guard this type of collection must therefore strengthen their security in order to deter such criminal practice. Through a literature review, this paper analyses the possibility of using bibliological analysis as a security tool: a detailed description of each copy increases an institution's knowledge of its collection and attributes unequivocal ownership to rare specimens, which do not always receive stamps or other ownership marks, a circumstance that hampers the return and eventual recovery of books in cases of robbery and theft. Bibliological analysis thus individualizes each copy, while also allowing its handling and extensive knowledge of it, factors essential to the security of the books.

  10. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention preserving stochastic branching and parameterised reward annotations. SBOAT allows the optimisation of these processes by specifying optimisation goals by means of probabilistic control tree logic (PCTL). Optimisation is performed by means of an evolutionary algorithm where stochastic model checking, in the form of the PRISM model checker, is used to compute the fitness, the performance of a candidate in terms of the specified goals, of variants of a process. Our evolutionary algorithm approach uses a matrix representation of process models to efficiently allow mutation and crossover of a process model to be performed, allowing broad...

  11. metaSNV: A tool for metagenomic strain level analysis.

    Directory of Open Access Journals (Sweden)

    Paul Igor Costea

    Full Text Available We present metaSNV, a tool for single nucleotide variant (SNV analysis in metagenomic samples, capable of comparing populations of thousands of bacterial and archaeal species. The tool uses as input nucleotide sequence alignments to reference genomes in standard SAM/BAM format, performs SNV calling for individual samples and across the whole data set, and generates various statistics for individual species including allele frequencies and nucleotide diversity per sample as well as distances and fixation indices across samples. Using published data from 676 metagenomic samples of different sites in the oral cavity, we show that the results of metaSNV are comparable to those of MIDAS, an alternative implementation for metagenomic SNV analysis, while data processing is faster and has a smaller storage footprint. Moreover, we implement a set of distance measures that allow the comparison of genomic variation across metagenomic samples and delineate sample-specific variants to enable the tracking of specific strain populations over time. The implementation of metaSNV is available at: http://metasnv.embl.de/.
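Two of the per-sample statistics mentioned above can be sketched from per-position allele counts (an illustration of the quantities themselves, not of metaSNV's implementation):

```python
def allele_frequencies(counts):
    """Allele frequencies at one genomic position from raw read counts."""
    total = sum(counts.values())
    return {allele: c / total for allele, c in counts.items()}

def nucleotide_diversity(counts):
    """Unbiased per-site heterozygosity: (n / (n - 1)) * (1 - sum(p_i^2))."""
    n = sum(counts.values())
    heterozygosity = 1.0 - sum((c / n) ** 2 for c in counts.values())
    return heterozygosity * n / (n - 1)
```

For a position covered by 5 `A` reads and 5 `G` reads, the frequencies are 0.5 each and the per-site diversity is 5/9.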

  12. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in both single- and multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels of WMSs from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging needs for developing intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in systematic evaluation of omics informatics tools. We conclude by providing future perspectives of emerging fields and new frontiers in omics informatics.

  13. Substitution dynamical systems spectral analysis

    CERN Document Server

    Queffélec, Martine

    2010-01-01

    This volume mainly deals with the dynamics of finitely valued sequences, and more specifically, of sequences generated by substitutions and automata. Those sequences demonstrate fairly simple combinatorial and arithmetical properties and naturally appear in various domains. As the title suggests, the aim of the initial version of this book was the spectral study of the associated dynamical systems: the first chapters consisted of a detailed introduction to the mathematical notions involved, and the description of the spectral invariants followed in the closing chapters. This approach, combined with new material added to the new edition, results in a nearly self-contained book on the subject. New tools - which have also proven helpful in other contexts - had to be developed for this study. Moreover, its findings can be concretely applied, the method providing an algorithm to exhibit the spectral measures and the spectral multiplicity, as is demonstrated in several examples. Beyond this advanced analysis, many...

  14. Financial tools to induce cooperation in power asymmetrical water systems

    Science.gov (United States)

    Denaro, Simona; Castelletti, Andrea; Giuliani, Matteo; Characklis, Gregory W.

    2017-04-01

    In multi-purpose water systems, power asymmetry is often responsible for inefficient and inequitable water allocations. Climate change and anthropogenic pressure are expected to exacerbate such disparities at the expense of already disadvantaged groups. The intervention of a third party, charged with redefining water sharing policies to give greater consideration to equity and social justice, may be desirable. Nevertheless, to be accepted by private actors, this interposition should be coupled with some form of compensation. For a public agency, compensation measures may be burdensome, especially when the allowance is triggered by natural events whose timing and magnitude are subject to uncertainty. In this context, index-based insurance contracts may represent a viable alternative option and reduce the cost of achieving socially desirable outcomes. In this study we explore soft measures to achieve global change mitigation by designing a hybrid coordination mechanism composed of i) a direct normative constraint and ii) an indirect financial compensatory tool. The performance of an index-based insurance (i.e. hedging) contract to be used as a compensation tool is evaluated relative to more traditional alternatives. First, the performance of the status quo system, or baseline (BL), is contrasted to an idealized scenario in which a central planner (CP) maximizes global efficiency. Then, the CP management is analyzed in order to identify an efficient water rights redistribution to be legally imposed on the advantaged stakeholders in the BL scenario. Finally, a hedging contract is designed to compensate those stakeholders more negatively affected by the legal constraint. The approach is demonstrated on a multi-purpose water system in Italy, where different decision makers individually manage the same resource. The system is characterized by a manifest power asymmetry: the upstream users, i.e., hydropower companies, are free to release their stored water in time

  15. A Tool for Verification and Validation of Neural Network Based Adaptive Controllers for High Assurance Systems

    Science.gov (United States)

    Gupta, Pramod; Schumann, Johann

    2004-01-01

    High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as monitoring the network performance during operation. The tool has been implemented in Simulink and simulation results on a F-15 aircraft are presented.
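One simple way to obtain such an error bar around a network's output is from the spread of an ensemble of models; this is an illustrative stand-in for the analytical confidence interval the tool computes, not the tool's method:

```python
import math

def confidence_interval(models, x, z=1.96):
    """Error bar around an ensemble of model outputs at input x.

    `models` is any sequence of callables mapping input -> scalar output;
    z = 1.96 gives an approximate 95% interval on the ensemble mean.
    """
    outputs = [m(x) for m in models]
    n = len(outputs)
    mean = sum(outputs) / n
    var = sum((o - mean) ** 2 for o in outputs) / (n - 1)  # sample variance
    half_width = z * math.sqrt(var / n)  # z * standard error of the mean
    return mean - half_width, mean + half_width
```

A monitor would flag inputs where the interval is too wide for the controller to rely on the network's output.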

  16. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to calculate inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.
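The agreement between subjective scores and the objective symmetry index reported above is measured by the Pearson correlation coefficient, which can be computed as follows (a generic sketch, unrelated to the BAT® code base):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equally long samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A value near +1, as in the study's 0.909, means the objective index ranks patients almost exactly as the human observers do.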

  17. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems in the control of nuclear installations demands a high level of operational safety, for both plant operation and environmental protection. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) will have at its disposal tools that permit checks to be made throughout the life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It runs on VAX computers and can easily be ported to other machines [fr

  18. Simulation of spin dynamics: a tool in MRI system development

    International Nuclear Information System (INIS)

    Stoecker, Tony; Vahedipour, Kaveh; Shah, N Jon

    2011-01-01

    Magnetic Resonance Imaging (MRI) is a routine diagnostic tool in the clinics and the method of choice in soft-tissue contrast medical imaging. It is an important tool in neuroscience to investigate structure and function of the living brain on a systemic level. The latter is one of the driving forces to further develop MRI technology, as neuroscience especially demands higher spatiotemporal resolution, which is to be achieved through increasing the static main magnetic field, B0. Although standard MRI is a mature technology, ultra-high-field (UHF) systems, at B0 ≥ 7 T, offer space for new technical inventions as the physical conditions dramatically change. This work shows that the development strongly benefits from computer simulations of the measurement process on the basis of a semi-classical, nuclear spin-1/2 treatment given by the Bloch equations. Possible applications of such simulations are outlined, suggesting new solutions to the UHF-specific inhomogeneity problems of the static main field as well as the high-frequency transmit field.
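For free precession, the Bloch equations mentioned above have an analytic solution that such simulators evaluate between RF events; a minimal sketch (not the simulator's code, and the field and relaxation values below are arbitrary):

```python
import math

def bloch_free_precession(m, t, b0_offset_hz, t1, t2, m0=1.0):
    """Evolve magnetization (mx, my, mz) for time t under free precession.

    Transverse components rotate at the off-resonance frequency and decay
    with T2; the longitudinal component relaxes toward m0 with T1.
    """
    mx, my, mz = m
    phi = 2 * math.pi * b0_offset_hz * t       # accumulated precession angle
    e2, e1 = math.exp(-t / t2), math.exp(-t / t1)
    mx_new = (mx * math.cos(phi) + my * math.sin(phi)) * e2
    my_new = (-mx * math.sin(phi) + my * math.cos(phi)) * e2
    mz_new = m0 + (mz - m0) * e1
    return mx_new, my_new, mz_new
```

A full measurement simulation chains such propagation steps with RF-pulse rotations and gradient-induced phase, which is where UHF inhomogeneity effects enter.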

  19. Plant Lectins as Medical Tools against Digestive System Cancers.

    Science.gov (United States)

    Estrada-Martínez, Laura Elena; Moreno-Celis, Ulisses; Cervantes-Jiménez, Ricardo; Ferriz-Martínez, Roberto Augusto; Blanco-Labra, Alejandro; García-Gasca, Teresa

    2017-07-03

    Digestive system cancers-those of the esophagus, stomach, small intestine, colon-rectum, liver, and pancreas-are highly related to genetics and lifestyle. Most are considered highly mortal due to the frequency of late diagnosis, usually in advanced stages, caused by the absence of symptoms or masked by other pathologies. Different tools are being investigated in the search of a more precise diagnosis and treatment. Plant lectins have been studied because of their ability to recognize and bind to carbohydrates, exerting a variety of biological activities on animal cells, including anticancer activities. The present report integrates existing information on the activity of plant lectins on various types of digestive system cancers, and surveys the current state of research into their properties for diagnosis and selective treatment.

  20. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database... speed doubly-fed induction generator wind turbine concept; 3. Variable speed multi-pole permanent magnet synchronous generator wind turbine concept. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine at different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control...

  1. Phronesis, a diagnosis and recovery tool for system administrators

    International Nuclear Information System (INIS)

    Haen, C; Barra, V; Bonaccorsi, E; Neufeld, N

    2014-01-01

    The LHCb experiment relies on the Online system, which includes a very large and heterogeneous computing cluster. Ensuring the proper behavior of the different tasks running on the more than 2000 servers represents a huge workload for the small operator team and is a 24/7 task. At CHEP 2012, we presented a prototype of a framework that we designed in order to support the experts. The main objective is to provide them with steadily improving diagnosis and recovery solutions in case of misbehavior of a service, without having to modify the original applications. Our framework is based on adapted principles of the Autonomic Computing model, on Reinforcement Learning algorithms, as well as innovative concepts such as Shared Experience. While the submission at CHEP 2012 showed the validity of our prototype on simulations, we here present an implementation with improved algorithms and manipulation tools, and report on the experience gained with running it in the LHCb Online system.

  2. An intelligent condition monitoring system for on-line classification of machine tool wear

    Energy Technology Data Exchange (ETDEWEB)

    Fu Pan; Hope, A.D.; Javed, M. [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1997-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are drawn from time and frequency domain signals and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification results of tool wear states. (orig.) 10 refs.
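A fuzzy pattern-recognition step of the kind described can be sketched as follows; the membership parameters, state names and the feature scaling are hypothetical, not those of the cited system:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_wear(features):
    """Assign the wear state with the highest mean membership across
    normalised sensor features (max-membership defuzzification)."""
    states = {                      # hypothetical membership parameters
        "initial": (0.0, 0.15, 0.4),
        "normal": (0.25, 0.5, 0.75),
        "severe": (0.6, 0.85, 1.0),
    }
    scores = {s: sum(tri(f, *p) for f in features) / len(features)
              for s, p in states.items()}
    return max(scores, key=scores.get)
```

In practice each feature (power, force, acoustic emission, vibration) would first be normalised to [0, 1] against calibration data before being combined this way.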

  3. ISRU System Model Tool: From Excavation to Oxygen Production

    Science.gov (United States)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

    In the late 1980s, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible up-to-date models of the oxygen extraction production process has become even more clear. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates on energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity are achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first-generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.

  4. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report

    Science.gov (United States)

    2013-01-01

    ...a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight to... used to measure the returns of scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME

  5. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The

  6. A rule-generating tool for expert systems

    International Nuclear Information System (INIS)

    Alten, S.; Danofsky, R.A.

    1991-01-01

    A program (IDCLASS) has been developed to generate rules for expert systems used in diagnosis-type problems. IDCLASS combines the well-known ID3 inductive algorithm [1] with the OPS5 expert system language for efficiency and utilizes a strategy to minimize the number of rules. IDCLASS, written in FORTRAN77, performs an inductive learning task to create the knowledge base. Additionally, it generates OPS5 source code for an expert system with runtime learning ability. IDCLASS requires no programming knowledge to use, but the expert system it creates requires an OPS5 compiler. A six-attribute system is considered in an example case study with a set of 13 training examples prepared using the emergency response plan procedures of the Iowa State University UTR-10 reactor. It was found that IDCLASS is a useful tool for classification problems, but it must be modified to include data monitoring, new-rule verification, and advice-giving rules using the flexible features of the OPS5 language
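The ID3 algorithm at the heart of such rule generators picks, at each node of the decision tree, the attribute with the highest information gain; a compact sketch of that criterion (the training examples in the usage note are invented, not the UTR-10 data):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, attr, label="class"):
    """ID3 splitting criterion: entropy reduction from splitting on attr.

    `examples` is a list of dicts mapping attribute names to values."""
    base = entropy([e[label] for e in examples])
    remainder = 0.0
    for value in {e[attr] for e in examples}:
        subset = [e[label] for e in examples if e[attr] == value]
        remainder += len(subset) / len(examples) * entropy(subset)
    return base - remainder
```

An attribute that splits a balanced two-class set into pure subsets has a gain of exactly 1 bit; each generated rule then corresponds to one root-to-leaf path of the induced tree.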

  7. Mathematical modeling of physiological systems: an essential tool for discovery.

    Science.gov (United States)

    Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J

    2014-08-28

    Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g. disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes like enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems with an emphasis on the study of excitable cells. We conclude with a discussion about opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?" Copyright © 2014 Elsevier Inc. All rights reserved.
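As a concrete taste of modeling an excitable cell, here is a forward-Euler integration of the two-variable FitzHugh-Nagumo model, a standard textbook reduction chosen for brevity; the parameter values are the conventional ones, not taken from any study cited above:

```python
def fitzhugh_nagumo(i_ext=0.5, a=0.7, b=0.8, eps=0.08, dt=0.1, steps=2000):
    """Forward-Euler trace of the FitzHugh-Nagumo excitable-cell model.

    dv/dt = v - v^3/3 - w + I_ext   (fast, voltage-like variable)
    dw/dt = eps * (v + a - b * w)   (slow recovery variable)
    """
    v, w = -1.2, -0.6               # start near the resting state
    vs = []
    for _ in range(steps):
        dv = v - v ** 3 / 3 - w + i_ext
        dw = eps * (v + a - b * w)
        v, w = v + dt * dv, w + dt * dw
        vs.append(v)
    return vs
```

For this sustained stimulus the resting state is unstable and the model fires repeatedly, the qualitative behavior such models are used to predict in altered (e.g. disease) states.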

  8. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represent, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful relevant analysis to assist stakeholders in decision making. Therefore a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized allowing for the system to be modified in order to provide analysis for other systems with similar attributes. By utilizing this approach, the framework for simulating many different fuel cycle options is provided. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.

  9. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Information such as an estimate of the processing time or the possibility of a system outage (abnormal behaviour) helps operators monitor system performance and predict its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely and might lead to significant workload and data-handling interruptions. The inability to monitor and predict the behaviour of the analysis process (its duration) and of the system's state itself motivated the design of built-in situational awareness analytic tools.

  10. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, typically presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. Filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application intended to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using their Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  11. Energy Systems Modelling Research and Analysis

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Alberg Østergaard, Poul

    2015-01-01

    This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA), funded by the Danish Innovation Fund. The project, carried out by 11 university and industry partners, has improved the basis for decision-making within energy planning and energy scenario making by providing new and improved tools and methods for energy systems analyses.

  12. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Bekavac, Ivan; Garbin Praničević, Daniela

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytic...

  13. System-level tools and reconfigurable computing for next-generation HWIL systems

    Science.gov (United States)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing the data streams, the tools must complement this so that a systems engineer is able to construct the final system. The paper will present the work on integrating system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 × 1024 resolutions at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of tera (10^12) operations per second.

  14. Frontiers of performance analysis on leadership-class systems

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, R [Renaissance Computing Institute, UNC, Chapel Hill, North Carolina (United States); Adhianto, L; Fagan, M; Krentel, M; Mellor-Crummey, J; Tallent, N [Rice University, Houston, Texas (United States); Supinski, B de; Gamblin, T; Schulz, M, E-mail: rjf@renci.or, E-mail: laksono@cs.rice.ed, E-mail: bronis@llnl.go, E-mail: mfagan@cs.rice.ed, E-mail: tgamblin@llnl.go, E-mail: krentel@cs.rice.ed, E-mail: johnmc@cs.rice.ed, E-mail: schulzm@llnl.go, E-mail: tallent@cs.rice.ed [Lawrence Livermore National Laboratory (United States)

    2009-07-01

    The number of cores in high-end systems for scientific computing is increasing rapidly. As a result, there is a pressing need for tools that can measure, model, and diagnose performance problems in highly parallel runs. We describe two tools that employ complementary approaches for analysis at scale, and we illustrate their use on DOE leadership-class systems.

  15. Frontiers of Performance Analysis on Leadership-Class Systems

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, R J; Adhianto, L; de Supinski, B R; Fagan, M; Gamblin, T; Krentel, M; Mellor-Crummey, J; Schulz, M; Tallent, N

    2009-06-15

    The number of cores in high-end systems for scientific computing is increasing rapidly. As a result, there is a pressing need for tools that can measure, model, and diagnose performance problems in highly parallel runs. We describe two tools that employ complementary approaches for analysis at scale, and we illustrate their use on DOE leadership-class systems.

  16. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
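    The record above concerns robust multivariate outlier detection; as a minimal conceptual sketch (not the R package's API), the classical starting point is flagging observations whose squared Mahalanobis distance exceeds a chi-square quantile. The function name, the planted outlier, and the use of the classical mean/covariance (robust methods such as MCD would replace these) are illustrative assumptions.

    ```python
    import numpy as np

    def mahalanobis_outliers(X, cutoff=7.378):
        # cutoff ~ 97.5% quantile of chi-square with 2 d.f. (suitable for 2-D data).
        # Classical mean/covariance are used here for simplicity; robust estimators
        # (e.g. MCD) would be swapped in for a truly robust procedure.
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        Z = X - mu
        d2 = np.einsum('ij,jk,ik->i', Z, cov_inv, Z)  # squared Mahalanobis distances
        return d2 > cutoff

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    X[0] = [8.0, 8.0]                  # planted outlier
    flags = mahalanobis_outliers(X)
    ```

    With clean Gaussian data, roughly 2.5% of points are flagged by chance at this cutoff, while the planted point is flagged clearly.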

  17. Sensitivity analysis of an information fusion tool: OWA operator

    Science.gov (United States)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

    2007-04-01

    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
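    The OWA weight computation described above can be sketched briefly. A common parametric form of the Fuzzy Linguistic Quantifiers approach (following Yager's formulation) derives the order weights from a quantifier Q(r) = r**alpha, where alpha encodes the decision maker's optimism degree; the specific values below are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def owa_weights(n, alpha=2.0):
        # Order weight w_i = Q(i/n) - Q((i-1)/n) with quantifier Q(r) = r**alpha.
        # The weights sum to 1 by telescoping; alpha = 1 recovers the plain mean.
        i = np.arange(1, n + 1)
        return (i / n) ** alpha - ((i - 1) / n) ** alpha

    def owa(values, weights):
        # Reorder the inputs in descending order, then take the weighted sum.
        return float(np.dot(np.sort(values)[::-1], weights))

    w = owa_weights(4, alpha=2.0)      # [1/16, 3/16, 5/16, 7/16]
    score = owa([0.9, 0.4, 0.7, 0.2], w)   # -> 0.4
    ```

    Larger alpha shifts weight toward the smaller (worst) inputs, giving a more pessimistic aggregation, which is one way the optimism degree enters the sensitivity analysis.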

  18. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to address those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
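    The core estimation problem behind MFA can be illustrated with a toy example (a Python sketch of the general principle, not the PFA Toolbox itself, which is a MATLAB package with its own possibilistic machinery). At steady state the stoichiometric matrix S constrains the flux vector v by S @ v = 0; measured fluxes are fixed and the remaining ones are recovered from the balance equations. The network and the measured values below are hypothetical.

    ```python
    import numpy as np

    # Toy stoichiometric matrix S (metabolites x fluxes); steady state: S @ v = 0.
    # Hypothetical network: v0 -> A; A -> B (v1); A -> C (v2); B -> out (v3); C -> out (v4)
    S = np.array([
        [1, -1, -1,  0,  0],   # metabolite A balance
        [0,  1,  0, -1,  0],   # metabolite B balance
        [0,  0,  1,  0, -1],   # metabolite C balance
    ], dtype=float)

    measured = {0: 10.0, 1: 6.0}   # indices and values of measured fluxes (assumed)
    unknown = [j for j in range(S.shape[1]) if j not in measured]

    # Move the measured columns to the right-hand side: S_u @ v_u = -S_m @ v_m,
    # then solve for the unmeasured fluxes by least squares.
    rhs = -S[:, list(measured)] @ np.array(list(measured.values()))
    v_u, *_ = np.linalg.lstsq(S[:, unknown], rhs, rcond=None)
    ```

    Here the system is exactly determined, so least squares recovers v2 = 4, v3 = 6, v4 = 4; interval and possibilistic MFA generalize this step to imprecise measurements by replacing point values with intervals or possibility distributions.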

  19. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    Science.gov (United States)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.

  20. An analysis of the Grade 3 Department of Basic Education workbooks as curriculum tools

    Directory of Open Access Journals (Sweden)

    Ursula Hoadley

    2016-12-01

    Since 2011, the Department of Basic Education (DBE) has provided all Grade 1 to 6 learners in public schools with literacy/language, numeracy/mathematics and life skills workbooks. This study provides an assessment of the purpose to which the workbooks are best suited by analysing the Grade 3 Mathematics and Home Language English workbooks for 2015 in the light of the DBE’s intentions for the workbooks. The study considers alignment of the workbooks with the official national curriculum, the Curriculum and Assessment Policy Statement (CAPS), as well as ‘conceptual signalling’ and progression in the content of the workbooks. We then evaluate the kind of ‘curriculum tool’ the workbooks in their current format conform to. We explore three possibilities in the light of the DBE’s proposals for workbook use: a practice tool, an assessment tool, and a monitoring tool. We also reflect on the workbooks as a teaching tool. Our analysis suggests, in line with the DBE’s intended purpose, that the workbooks best represent a practice curriculum tool. We highlight the significance of the workbooks' high level of curriculum compliance, and indicate that this would render them an effective monitoring tool for assessing the quantitative coverage of the curriculum at a systemic level.