Sample records for analysis system tool

  1. Two energy system analysis tools

    Lund, Henrik; Andersen, Anders N.; Antonoff, Jayson


    The chapter illustrates some of the different types of problems that must be solved when analysing energy systems.

  2. Integrated tools for control-system analysis

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.


    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control-system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time-response simulations; a time response for a random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
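Two of the evaluations listed above, the Bode frequency response and the closed-loop eigenvalue check, can be sketched outside MATRIXx with standard numerical tools. The plant, the unity feedback gain, and the scipy-based workflow below are illustrative assumptions, not the package's own code.

```python
import numpy as np
from scipy import signal

# Illustrative plant: G(s) = 1 / (s^2 + 0.5 s + 1) in state-space form
A = np.array([[0.0, 1.0], [-1.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

# Bode frequency response of the open-loop plant
w, mag_db, phase_deg = signal.bode(signal.StateSpace(A, B, C, D))

# Closed-loop eigenvalues under unity negative output feedback:
# x' = (A - B K C) x  with K = 1
K = np.array([[1.0]])
eigs = np.linalg.eigvals(A - B @ K @ C)
print(eigs)
```

All eigenvalues having negative real parts confirms closed-loop stability, which is the same verdict the closed-loop-eigenvalue evaluation in the text delivers.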

  3. Ultrasonic vibrating system design and tool analysis

    Kei-Lin KUO


    The applications of ultrasonic vibration to material removal processes lie predominantly in the vertical processing of hard and brittle materials, because the power generated by vertically vibrating oscillators yields the greatest direct penetration, allowing grains to remove material from the workpiece. For milling processes, however, the vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to use ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for the design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to obtain the best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  4. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Morris, Gladys M.; Sheppard, Mark A.


    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  5. System-of-Systems Technology-Portfolio-Analysis Tool

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne


    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  6. The environment power system analysis tool development program

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.


    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  7. Tools and Algorithms for Construction and Analysis of Systems

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient model checking, model-checking tools, symbolic model checking, visual tools, and verification of critical systems.

  8. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    Lorentzen, Torsten; Blanke, Mogens


    This software reference details the functions of SaTool, a tool for structural analysis of technical systems, intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system's properties. Measured and controlled quantities in the system are related to variables through functional relations, which need only be stated as names; their explicit composition need not be described to the tool. The user enters a list of these relations that together describe the entirety of the system. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  9. Development of data analysis tool for combat system integration

    Shin, Seung-Chun; Shin, Jong-Gye; Oh, Dae-Kyun


    System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat-system integration with the sensors and weapons ensures the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated against the requirements of the end user. A data analysis tool is required to conduct such systematic verification and validation. This paper proposes the Data Extraction, Recording and Analysis Tool (DERAT) for analyzing the integrated performance of the combat system, and presents its functional definition, architecture, and effectiveness through test results.

  10. Tools and Algorithms for the Construction and Analysis of Systems

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  11. Shape Analysis for Complex Systems Using Information Geometry Tools.

    Sanctis, Angela De


    In this paper we use Information Geometry tools to model statistically patterns arising in complex systems and describe their evolution in time. In particular, we focus on the analysis of images with medical applications and propose an index that can estimate the level of self-organization and predict future problems that may occur in these systems.

  12. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony


    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
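The n-factor combinatorial idea described above can be illustrated with a small sketch: for a hypothetical three-parameter space, a greedy random search keeps adding full assignments until every 2-factor (pairwise) combination is covered, typically with far fewer cases than the full factorial. The parameter names and values are invented for illustration and are not from the tool.

```python
import itertools
import random

# Hypothetical simulation parameters (names and values are illustrative)
params = {
    "mass_kg":   [400, 500, 600],
    "thrust_N":  [10, 20],
    "ctrl_gain": [0.1, 0.5, 1.0],
}

# Enumerate every 2-factor (pairwise) combination that must be covered
names = list(params)
required = set()
for a, b in itertools.combinations(names, 2):
    for va, vb in itertools.product(params[a], params[b]):
        required.add(((a, va), (b, vb)))

# Greedy selection: add random full assignments until all pairs are covered
random.seed(0)
cases = []
while required:
    cand = {n: random.choice(params[n]) for n in names}
    covered = {p for p in required
               if cand[p[0][0]] == p[0][1] and cand[p[1][0]] == p[1][1]}
    if covered:
        cases.append(cand)
        required -= covered

full = len(list(itertools.product(*params.values())))
print(len(cases), "cases instead of", full, "full-factorial runs")
```

Production covering-array generators are far more sophisticated, but the principle is the same: systematic coverage of parameter interactions without exhaustively sampling the space.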

  13. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.


    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  14. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Truong, Long V.


    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PCs), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a specific large-scale power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle, and it provides a user-friendly interface. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  15. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    Blanke, Mogens; Lorentzen, Torsten


    The paper introduces SaTool, a tool for structural analysis, presents the use of its Matlab(R)-based implementation, and introduces special features motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behaviour of a complex system at a high level of functional abstraction, to analyse single and multiple fault scenarios, and to automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are presented.

  16. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.


    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  17. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Ezio Bartocci


    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  18. An Integrated Tool for System Analysis of Sample Return Vehicles

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.


    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  19. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    McCarthy, Kevin; Hodge, Ernie


    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.
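As a flavor of the quasi-steady-state component models such blocksets contain, the sketch below sizes the heat load of a counterflow heat exchanger with the standard effectiveness-NTU relation. The conductance, capacity rates, and inlet temperatures are made-up numbers, not values from the tool.

```python
import math

def counterflow_effectiveness(ntu: float, cr: float) -> float:
    """Effectiveness of a counterflow heat exchanger (standard e-NTU relation)."""
    if abs(cr - 1.0) < 1e-9:
        return ntu / (1.0 + ntu)
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)

# Illustrative quasi-steady heat-load calculation (all numbers invented)
UA = 500.0                        # W/K, overall conductance
c_hot, c_cold = 1000.0, 1500.0    # W/K, capacity rates m_dot * cp
c_min, c_max = min(c_hot, c_cold), max(c_hot, c_cold)

eff = counterflow_effectiveness(UA / c_min, c_min / c_max)
q = eff * c_min * (360.0 - 300.0)  # W, with hot inlet 360 K, cold inlet 300 K
print(round(q, 1), "W")
```

A transient blockset wraps relations like this in thermal capacitance states; the quasi-steady form shown here is the inner calculation at each solver step.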

  20. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Liu, Nan-Suey; Quealy, Angela


    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  1. Tool for Sizing Analysis of the Advanced Life Support System

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.


    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and to process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available. ALSSAT can perform sizing analysis of ALS subsystems whether they are dynamic or steady in nature. Developed in Microsoft Excel with the Visual Basic programming language, ALSSAT performs multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
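Equivalent System Mass, one of the metrics ALSSAT reports, folds volume, power, cooling, and crew time into a single launch-mass-equivalent figure. The sketch below shows the standard form of that calculation; the equivalence factors and subsystem numbers are illustrative placeholders, not ALSSAT's internal values.

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crewtime_hr_yr, duration_yr,
                           v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=1.0):
    """Equivalent System Mass in kg (illustrative equivalence factors).

    v_eq: kg per m^3 of pressurized volume, p_eq: kg per kW of power,
    c_eq: kg per kW of cooling, ct_eq: kg per crew-hour.
    """
    return (mass_kg
            + volume_m3 * v_eq
            + power_kw * p_eq
            + cooling_kw * c_eq
            + crewtime_hr_yr * duration_yr * ct_eq)

# Hypothetical water-recovery subsystem over a two-year mission
esm = equivalent_system_mass(mass_kg=1200, volume_m3=4.0, power_kw=2.5,
                             cooling_kw=2.5, crewtime_hr_yr=80, duration_yr=2)
print(round(esm, 1), "kg ESM")
```

Because every resource is expressed in kilograms of launch mass, competing subsystem technologies can be ranked on a single axis, which is exactly the trade-study use case the abstract describes.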

  2. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    Bartocci, Ezio; Lió, Pietro


    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification...

  3. Evaluating fluid behavior in advanced reactor systems using coupled computational fluid dynamics and systems analysis tools

    Simulation of some fluid phenomena associated with Generation IV reactors requires the capability of modeling mixing in two- or three-dimensional flow. At the same time, the flow condition of interest is often transient and depends upon boundary conditions dictated by the system behavior as a whole. Computational Fluid Dynamics (CFD) is an ideal tool for simulating mixing and three-dimensional flow in system components, whereas a system analysis tool is ideal for modeling the entire system. This paper presents the reasoning that has led to coupling CFD and systems-analysis codes to analyze the behavior of advanced reactor fluid systems. In addition, the kinds of scenarios where this capability is important are identified. The important role of a coupled CFD/systems-analysis tool in the overall calculation scheme for a Very High Temperature Reactor is described, along with the manner in which coupled systems-analysis and CFD codes will be used to evaluate mixing behavior in a plenum under transient boundary conditions. The calculation methodology forms the basis for future coupled calculations that will examine the behavior of such systems at a spectrum of conditions, including transient accident conditions, that define the operational and accident envelope of the subject system. The methodology and analysis techniques demonstrated herein are a key technology that in part forms the backbone of the advanced techniques employed in the evaluation of advanced designs and their operational characteristics for the Generation IV advanced reactor systems. (authors)

  4. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Pakarinen Jyri


    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
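A minimal version of such a distortion measurement can be sketched in a few lines: drive a memoryless nonlinearity (a stand-in for a distorting device) with a sine tone and read the total harmonic distortion off the FFT. The cubic soft clipper and signal parameters below are illustrative, not taken from the toolkit.

```python
import numpy as np

fs, f0, n = 48_000, 1_000, 48_000      # sample rate, test tone, one second
t = np.arange(n) / fs

# Memoryless cubic "soft clipper" standing in for a distorting device
clean = np.sin(2 * np.pi * f0 * t)
distorted = clean - 0.2 * clean ** 3

# One second at fs = 48 kHz gives exactly 1 Hz bin spacing, so harmonic k
# of a 1 kHz tone lands precisely on bin k * 1000 (no spectral leakage)
spectrum = np.abs(np.fft.rfft(distorted)) / n
fund = spectrum[f0]
harmonics = [spectrum[k * f0] for k in range(2, 6)]
thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fund
print(f"THD = {100 * thd:.2f} %")
```

For the cubic nonlinearity this reduces analytically to a single third harmonic (sin^3 x = (3 sin x - sin 3x)/4), so the FFT measurement can be checked against the closed form.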

  5. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Lee, HauHua; Kolb, Mark; Madelone, Jack


    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during detail design in the product development process.

  6. SYSTID - A flexible tool for the analysis of communication systems.

    Dawson, C. T.; Tranter, W. H.


    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  7. BiologicalNetworks: visualization and analysis tool for systems biology

    Baitaluk, Michael; Sedova, Mayya; Ray, Animesh; Gupta, Amarnath


    Systems level investigation of genomic scale information requires the development of truly integrated databases dealing with heterogeneous data, which can be queried for simple properties of genes or other database objects as well as for complex network level properties, for the analysis and modelling of complex biological processes. Towards that goal, we recently constructed PathSys, a data integration platform for systems biology, which provides dynamic integration over a diverse set of data...

  8. Analysis tool for predicting the transient hydrodynamics resulting from the rapid filling of voided piping systems

    An analysis tool is constructed for the purpose of predicting the transient hydrodynamics resulting from the rapid filling of voided piping systems. The basic requirements of such an analysis tool are established, and documentation is presented for several fluid transient computer codes that were considered for the tool. The code modifications necessary to meet the analysis tool requirements are described. To test the computational capability of the analysis tool, a verification problem is considered and the results are reported. These results serve only to demonstrate the applicability of the analysis tool to this type of problem; the code has not been validated by comparison with experiment. Documentation is provided for a brief sensitivity study involving the sample problem. Additional analysis tool information, as well as detailed sample problem results, are included in the form of appendices.
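Although the report's own models are not reproduced here, the first-order physics such codes refine is the classic Joukowsky water-hammer estimate for the pressure spike when the filling liquid column is suddenly arrested; the numbers below are illustrative, not from the report.

```python
# Joukowsky relation: dp = rho * a * dv, a back-of-envelope bound on the
# surge pressure when a moving liquid column is brought to rest abruptly.
rho = 998.0   # kg/m^3, water density
a = 1200.0    # m/s, pressure-wave speed in the filled pipe
dv = 6.0      # m/s, velocity of the filling column at impact

dp = rho * a * dv   # Pa
print(f"{dp / 1e6:.1f} MPa surge (first-order estimate)")
```

A transient hydrodynamics code resolves what this one-line estimate ignores: wave reflections, trapped gas cushioning, and column separation, which is why such tools are needed at all.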

  9. Design and Analysis Tools for Concurrent Blackboard Systems

    McManus, John W.


    A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them." The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge-source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.
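The opportunistic control loop described above can be condensed into a toy sketch: each knowledge source fires only when its inputs are on the blackboard and its output is not, and the control strategy simply cycles until no source can contribute. The two knowledge sources here are invented for illustration.

```python
# Shared blackboard: a plain dict standing in for the global data structure
blackboard = {"raw": [3, 1, 2]}

def ks_sort(bb):
    """Knowledge source: sorts raw data once it appears on the blackboard."""
    if "raw" in bb and "sorted" not in bb:
        bb["sorted"] = sorted(bb["raw"])
        return True
    return False

def ks_summarize(bb):
    """Knowledge source: summarizes sorted data, unaware of who produced it."""
    if "sorted" in bb and "summary" not in bb:
        bb["summary"] = {"min": bb["sorted"][0], "max": bb["sorted"][-1]}
        return True
    return False

knowledge_sources = [ks_summarize, ks_sort]   # order deliberately "wrong"

# Opportunistic control strategy: cycle until no source can contribute
progress = True
while progress:
    progress = any(ks(blackboard) for ks in knowledge_sources)

print(blackboard["summary"])
```

Note that the sources are listed in the "wrong" order yet the right result emerges: the data on the blackboard, not a fixed schedule, determines which source runs next, which is the opportunistic property the abstract emphasizes.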

  10. Betweenness as a Tool of Vulnerability Analysis of Power System

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar


    Complex network theory finds application in the analysis of power grids, as the two share some common characteristics. Using this theory, critical elements in a power network can be identified. Because the vulnerabilities of the elements of the network decide the vulnerability of the network as a whole, this paper studies the vulnerability of each element using two complex-network measures: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines, and the PTDF matrix are included. The standard IEEE 57-bus system has been studied using the above-mentioned indices, and the resulting conclusions are discussed.
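Plain (topological) betweenness centrality is easy to demonstrate on a toy graph: for every node pair, count the fraction of shortest paths that pass through each intermediate node. The seven-node "grid" below is invented for illustration; a real study would use a library implementation (e.g. Brandes' algorithm) and the actual network.

```python
from collections import deque
from itertools import combinations

# Toy grid: two triangles bridged at 2-3, with a leaf 6 hanging off node 3
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5), (3, 6)]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def shortest_paths(s, t):
    """All shortest paths from s to t by breadth-first path enumeration."""
    paths, best = [], None
    q = deque([[s]])
    while q:
        path = q.popleft()
        if best is not None and len(path) > best:
            break
        node = path[-1]
        if node == t:
            best = len(path)
            paths.append(path)
            continue
        for n in adj[node]:
            if n not in path:
                q.append(path + [n])
    return paths

# Betweenness: credit each interior node with its share of shortest paths
score = {n: 0.0 for n in adj}
for s, t in combinations(adj, 2):
    sps = shortest_paths(s, t)
    for p in sps:
        for inner in p[1:-1]:
            score[inner] += 1 / len(sps)

print(max(score, key=score.get))  # node 3: bridge plus the only route to 6
```

In a vulnerability study, the highest-scoring nodes are the candidate critical elements; extended betweenness replaces the purely topological shortest paths with electrically weighted ones.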

  11. Jitterbug and TrueTime: Analysis Tools for Real-Time Control Systems

    Cervin, Anton; Henriksson, Dan; Lincoln, Bo; Årzén, Karl-Erik


    The paper presents two recently developed, Matlab-based analysis tools for real-time control systems. The first tool, called Jitterbug, is used to compute a performance criterion for a control loop under various timing conditions. The tool makes it easy to quickly judge how sensitive a controller is to implementation effects such as slow sampling, delays, jitter, etc. The second tool, called TrueTime, allows detailed co-simulation of process dynamics, control task execution, and network commun...

  12. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Andruski, Joel; Drennen, Thomas E.


    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for adding carbon capture and sequestration (CCS) technologies. The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
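    The levelized-cost calculation at the core of such a model can be illustrated with a simplified sketch: discounted lifetime costs divided by discounted lifetime generation. The discounting scheme and the plant figures below are assumptions for illustration, not values from L-CAT or the NETL reports:

```python
def levelized_cost(capital, om_per_year, fuel_per_mwh, mwh_per_year,
                   rate, years):
    """Simplified levelized cost of electricity ($/MWh): present value of all
    costs divided by present value of all generation. Capital is treated as an
    overnight cost at year 0; real models phase it over construction years."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capital + sum((om_per_year + fuel_per_mwh * mwh_per_year) * d
                          for d in disc)
    energy = sum(mwh_per_year * d for d in disc)
    return costs / energy

# Hypothetical plant: $1.2B capital, $40M/yr O&M, $20/MWh fuel,
# 7 TWh/yr generation, 7% discount rate, 30-year life.
print(round(levelized_cost(1.2e9, 40e6, 20.0, 7e6, 0.07, 30), 2))
```

Varying one input at a time (e.g. the discount rate or the fuel cost) reproduces the kind of quick sensitivity analysis the abstract describes.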

  13. Benchmarking expert system tools

    Riley, Gary


    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object-oriented, frame-based expert system tool. The benchmarks used for testing are studied.
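    A timing benchmark of the kind described can be sketched as a best-of-N wall-clock measurement; the toy workload below merely stands in for a production-system rule set and is not one of the benchmarks the abstract refers to:

```python
import time

def benchmark(fn, *args, repeats=5):
    """Best-of-N wall-clock timing, as commonly used for tool comparisons.
    Taking the minimum suppresses noise from other processes."""
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - start)
    return min(times)

# A toy "rule firing" workload standing in for a production-system benchmark.
def fire_rules(n):
    facts = set(range(n))
    return sum(1 for f in facts if f % 3 == 0)

elapsed = benchmark(fire_rules, 100_000)
print(f"best of 5 runs: {elapsed:.4f} s")
```

Comparing tools fairly requires running the *same* benchmark program, ported to each tool's language, which is exactly the setup (OPS5, CLIPS, Zetalisp, KEE versions) the abstract describes.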

  14. Delila system tools.

    Schneider, T D; Stormo, G D; Yarus, M A; Gold, L


    We introduce three new computer programs and associated tools of the Delila nucleic-acid sequence analysis system. The first program, Module, allows rapid transportation of new sequence analysis tools between scientists using different computers. The second program, DBpull, allows efficient access to the large nucleic-acid sequence databases being collected in the United States and Europe. The third program, Encode, provides a flexible way to process sequence data for analysis by other programs.

  15. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental United States. This paper outlines some of the features available in this model. (authors)

  16. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model

  17. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    Kinter, James L., III; Doty, Brian E.


    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  18. An ontological knowledge based system for selection of process monitoring and analysis tools

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul


    Efficient process monitoring and analysis tools provide the means for automated supervision and control of manufacturing plants and therefore play an important role in plant safety, process control and assurance of end product quality. The availability of a large number of different process monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools. On the one hand, it facilitates the selection of proper monitoring and analysis tools for a given application or process. On the other hand, it permits the identification of potential applications for a given monitoring technique or tool. An efficient inference system based on forward as well as reverse search...

  19. FADES: A tool for automated fault analysis of complex systems

    FADES is an expert system for performing fault analyses on complex connected systems. Using a graphical editor to draw components and link them together, the FADES system allows the analyst to describe a given system. The knowledge base created is used to qualitatively simulate the system behaviour. By inducing all possible component failures in the system and determining their effects, a set of facts is built up. These facts are then used to create fault trees or FMEA tables. The facts may also be used for explanation purposes and to generate diagnostic rules allowing system instrumentation to be optimised. The prototype system has been built and tested and is presently undergoing testing by users. All comments from these trials will be used to tailor the system to the requirements of the user so that the end product performs the exact task required.

  20. Design and Analysis Tools for Deployable Solar Array Systems Project

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  1. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.


    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  2. Extension of a System Level Tool for Component Level Analysis

    Majumdar, Alok; Schallhorn, Paul


    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
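    The plane Poiseuille benchmark mentioned above has a closed-form solution, which is what makes it useful for verifying a numerical scheme. A minimal check, with illustrative channel parameters (not from the paper):

```python
def poiseuille_velocity(y, h, dpdx, mu):
    """Analytic plane Poiseuille profile between plates at y = 0 and y = h:
    u(y) = -(dp/dx) / (2*mu) * y * (h - y), for a constant pressure gradient
    dp/dx and dynamic viscosity mu."""
    return -dpdx / (2.0 * mu) * y * (h - y)

# Illustrative values: 10 mm gap, dp/dx = -100 Pa/m, mu = 1e-3 Pa*s (water-like).
# The peak velocity occurs at mid-channel: u_max = -(dp/dx) * h**2 / (8 * mu).
h, dpdx, mu = 0.01, -100.0, 1e-3
u_max = poiseuille_velocity(h / 2, h, dpdx, mu)
print(u_max)  # 1.25 m/s
```

A numerical network-flow solution of the same channel can be compared point by point against this profile, with the no-slip condition u(0) = u(h) = 0 as an additional sanity check.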

  3. ProbFAST: Probabilistic Functional Analysis System Tool

    Oliveira Thiago YK


    Background: The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges center on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression initially designed for paired analysis. Results: We propose a web platform named ProbFAST for MDE analysis, which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives better performance when compared to other approaches, and when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions: ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off in relation to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at

  4. A software tool for design of process monitoring and analysis systems

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul


    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  5. Child Language Data Exchange System Tools for Clinical Analysis.

    MacWhinney, Brian; Fromm, Davida


    The Child Language Data Exchange System Project has developed methods for analyzing many aspects of child language development, including grammar, lexicon, discourse, gesture, phonology, and fluency. This article will describe the methods available for each of these six fields, and how they can be used for assessment in the clinical setting. PMID:27111267

  6. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.


    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  7. Transient analysis of power systems solution techniques, tools and applications

    Martinez-Velasco, J


    A comprehensive introduction and up-to-date reference to SiC power semiconductor devices covering topics from material properties to applicationsBased on a number of breakthroughs in SiC material science and fabrication technology in the 1980s and 1990s, the first SiC Schottky barrier diodes (SBDs) were released as commercial products in 2001.  The SiC SBD market has grown significantly since that time, and SBDs are now used in a variety of power systems, particularly switch-mode power supplies and motor controls.  SiC power MOSFETs entered commercial production in 2011, providing rugged, hig

  8. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard;


    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even to systems with cyclic dependencies. Despite the virtues of the current tool, improvements and extensions still remain, which are in focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models, which both depend on the possibility of accounting for propagation delay. Since the current version...
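    The backlog and delay bounds that network calculus provides can be illustrated for the simplest case: a token-bucket arrival curve passing through a rate-latency server. This is the textbook result, not the CyNC algorithm itself, and the traffic figures are made up for illustration:

```python
def delay_bound(sigma, rho, R, T):
    """Worst-case delay for (sigma, rho)-constrained arrivals
    (burst sigma, sustained rate rho) through a rate-latency server
    beta(t) = R * max(t - T, 0). Requires rho <= R for stability."""
    assert rho <= R, "system must be stable (arrival rate <= service rate)"
    return T + sigma / R

def backlog_bound(sigma, rho, R, T):
    """Worst-case backlog under the same assumptions."""
    return sigma + rho * T

# Example: 2000-bit burst, 1 Mbit/s sustained rate, 10 Mbit/s server, 1 ms latency.
print(delay_bound(2000, 1e6, 1e7, 0.001))    # 0.001 + 2000/1e7 = 0.0012 s
print(backlog_bound(2000, 1e6, 1e7, 0.001))  # 2000 + 1e6*0.001 = 3000.0 bits
```

Tools like CyNC generalize this pairwise bound to whole task graphs, including (per the abstract) systems with cyclic dependencies, where the closed forms above no longer apply directly.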

  9. Tools to Support Human Factors and Systems Engineering Interactions During Early Analysis

    Thronesbery, Carroll; Malin, Jane T.; Holden, Kritina; Smith, Danielle Paige


    We describe an approach and existing software tool support for effective interactions between human factors engineers and systems engineers in early analysis activities during system acquisition. We examine the tasks performed during this stage, emphasizing those tasks where system engineers and human engineers interact. The Concept of Operations (ConOps) document is an important product during this phase, and particular attention is paid to its influences on subsequent acquisition activities. Understanding this influence helps ConOps authors describe a complete system concept that guides subsequent acquisition activities. We identify commonly used system engineering and human engineering tools and examine how they can support the specific tasks associated with system definition. We identify possible gaps in the support of these tasks, the largest of which appears to be creating the ConOps document itself. Finally, we outline the goals of our future empirical investigations of tools to support system concept definition.

  10. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"


    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  11. Systems Analysis – a new paradigm and decision support tools for the water framework directive

    Bruen, M.


    In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for...

  12. Systems analysis – a new paradigm and decision support tools for the water framework directive

    Bruen, M.


    In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this ...

  13. Micro electrochemical sensors and PCR systems: cellular and molecular tools for wine yeast analysis

    Ress, Cristina


    Bioanalytical microsystems are currently receiving increasing attention in biology, since they meet the considerable demand for reliable, sensitive and low-cost analysis tools. Small reagent volumes, low power consumption, portability, fast analysis, high throughput and systems integration are the key aspects that make these systems more and more appealing within both the academic and industrial communities. In the last years, many microdevices were developed for...

  14. Using EPSAT to analyze high power systems in the space environment. [Environment Power System Analysis Tool

    Kuharski, Robert A.; Jongeward, Gary A.; Wilcox, Katherine G.; Rankin, Tom R.; Roche, James C.


    The authors review the Environment Power System Analysis Tool (EPSAT) design and demonstrate its capabilities by using it to address some questions that arose in designing the SPEAR III experiment. It is shown that the rocket body cannot be driven to large positive voltages under the constraints of this experiment. Hence, attempts to measure the effects of a highly positive rocket body in the plasma environment should not be made in this experiment. It is determined that a hollow cathode will need to draw only about 50 mA to ground the rocket body. It is shown that a relatively small amount of gas needs to be released to induce a bulk breakdown near the rocket body, and this gas release should not discharge the sphere. Therefore, the experiment provides an excellent opportunity to study the neutralization of a differential charge.

  15. Building energy analysis tool

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars


    A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis, using building spatial data and selected building components stored in the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  16. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan


    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data was from several KSC tests that were already in the public literature realm as well as data from NIST and other highly respectable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.
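    The basic cylinder case handled by such a tool reduces, in the simplest steady-state approximation, to radial conduction through an insulation layer. A minimal sketch of that textbook formula, with illustrative geometry and an assumed effective conductivity (not values from TISTool or the KSC tests):

```python
import math

def radial_heat_leak(k, length, r_inner, r_outer, t_warm, t_cold):
    """Steady conduction through a cylindrical insulation layer:
    Q = 2 * pi * k * L * (T_warm - T_cold) / ln(r_outer / r_inner),
    where k is an effective thermal conductivity for the layer."""
    return (2 * math.pi * k * length * (t_warm - t_cold)
            / math.log(r_outer / r_inner))

# Illustrative case: 1 m of pipe, effective k = 1e-4 W/m-K (MLI-like),
# insulation from 50 mm to 75 mm radius, 300 K ambient to 77 K (LN2).
q = radial_heat_leak(1e-4, 1.0, 0.050, 0.075, 300.0, 77.0)
print(f"heat leak: {q:.4f} W")
```

Real cryogenic insulation analysis is more involved (the effective conductivity itself varies with temperature, pressure, and layer density, which is why test data matters), but this closed form is the starting point for the cylinder shape.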

  17. Analysis of Java Client/Server and Web Programming Tools for Development of Educational Systems.

    Muldner, Tomasz

    This paper provides an analysis of old and new programming tools for development of client/server programs, particularly World Wide Web-based programs. The focus is on development of educational systems that use interactive shared workspaces to provide portable and expandable solutions. The paper begins with a short description of relevant terms.…

  18. LCA of waste management systems: Development of tools for modeling and uncertainty analysis

    Clavreul, Julie

    LCA of waste management requires some processes to be modelled rather than monitored as in classical LCA (e.g. landfilling or the application of processed waste on agricultural land). Therefore LCA-tools are needed which specifically address these issues and enable practitioners to model their systems properly. In this thesis several pieces of work are presented. First, a review was carried out on all LCA studies of waste management systems published before mid-2012. This provided a global overview of the technologies and waste fractions which have attracted focus within LCA, while enabling an analysis of methodological tendencies, the use of tools and databases, and the application of uncertainty analysis methods. The major outcome of this thesis was the development of a new LCA model, called EASETECH, building on the experience with previous LCA-tools, in particular the EASEWASTE model. Before the actual implementation phase, a design phase involved...

  19. On sustainability assessment of technical systems. Experience from systems analysis with the ORWARE and Ecoeffect tools

    Assefa, Getachew [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Chemical Engineering


    Engineering research and development work is undergoing a reorientation from focusing on specific parts of different systems to a broader systems-level perspective, albeit at a slower pace. This reorientation should be further developed and enhanced with the aim of organizing and structuring our technical systems to meet sustainability requirements in the face of global ecological threats that have far-reaching social and economic implications, which can no longer be captured using conventional research approaches. Until a list of universally acceptable, clear, and measurable indicators of sustainable development is developed, the work with sustainability metrics should continue to evolve as a relative measure of ecological, economic, and social performance of human activities in general, and technical systems in particular. This work can be done by comparing the relative performance of alternative technologies providing the same well-defined function or service, or by characterizing technologies that enjoy different levels of societal priority using relevant performance indicators. In both cases, concepts and methods of industrial ecology play a vital role. This thesis is about the development and application of a systematic approach for the assessment of the performance of technical systems from the perspective of systems analysis, sustainability, sustainability assessment, and industrial ecology. The systematic approach developed and characterized in this thesis advocates for a simultaneous assessment of the ecological, economic, and social dimensions of performance of technologies, avoiding sub-optimization and problem shifting between dimensions. It gives a holistic picture by taking a life cycle perspective of all important aspects. The systematic assessment of technical systems provides an even-handed assessment resulting in cumulative knowledge. A modular structure of the approach makes it flexible enough in terms of comparing a number of

  20. User manual of the CATSS system (version 1.0) communication analysis tool for space station

    Tsang, C. S.; Su, Y. T.; Lindsey, W. C.


    The Communication Analysis Tool for the Space Station (CATSS) is a FORTRAN language software package capable of predicting the communications link performance for the Space Station (SS) communication and tracking (C & T) system. The interactive software package was developed to run on DEC/VAX computers. CATSS models and evaluates the various C & T links of the SS, including modulation schemes such as Binary-Phase-Shift-Keying (BPSK), BPSK with Direct Sequence Spread Spectrum (PN/BPSK), and M-ary Frequency-Shift-Keying with Frequency Hopping (FH/MFSK). An optical space communication link is also included. CATSS is a C & T system engineering tool used to predict and analyze system performance for different link environments. Identification of system weaknesses is achieved through evaluation of performance with varying system parameters. System tradeoffs for different values of system parameters are made based on the performance prediction.
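    For the BPSK links such a tool models, the theoretical bit-error rate on an additive white Gaussian noise channel has a standard closed form. The sketch below is that textbook formula, not the CATSS implementation:

```python
import math

def bpsk_ber(ebno_db):
    """Theoretical BPSK bit-error probability on an AWGN channel:
    Pb = Q(sqrt(2 * Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0)),
    with Eb/N0 given in dB."""
    ebno = 10 ** (ebno_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebno))

# Link-budget style sweep: error rate falls steeply with Eb/N0.
for db in (0, 4, 8):
    print(f"Eb/N0 = {db} dB -> Pb = {bpsk_ber(db):.2e}")
```

A link-performance prediction of the CATSS kind chains this receiver formula with gains and losses along the link (transmit power, antenna gains, path loss, noise temperature) to obtain the operating Eb/N0.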

  1. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    Rundell Ann E


    Background: It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The Systems Biology Markup Language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results: This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and weighted averages of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion: SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
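    A local sensitivity analysis of the kind such tools aggregate can be sketched with central finite differences; the toy kinetic model, parameter values, and function names below are illustrative only and not part of SBML-SAT:

```python
def local_sensitivity(model, params, i, rel_step=0.01):
    """Normalized local sensitivity of the model output to parameter i,
    S_i = (p_i / y) * dy/dp_i, estimated by a central finite difference."""
    p_hi, p_lo = list(params), list(params)
    h = params[i] * rel_step
    p_hi[i] += h
    p_lo[i] -= h
    dy = (model(p_hi) - model(p_lo)) / (2 * h)
    return params[i] / model(params) * dy

# Toy Michaelis-Menten-style "pathway" output: y = k1 * s / (k2 + s).
model = lambda p: p[0] * p[2] / (p[1] + p[2])
params = [2.0, 0.5, 1.0]  # k1, k2, substrate s

print(round(local_sensitivity(model, params, 0), 3))  # w.r.t. k1 -> 1.0
print(round(local_sensitivity(model, params, 1), 3))  # w.r.t. k2 -> about -1/3
```

Global methods such as Sobol's method or partial rank correlation coefficients instead vary all parameters together over ranges, which is the capability the abstract highlights beyond this local view.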

  2. Development of reliability analysis tools and database for Integrated Risk Management System (IRMS)

    The Korea Occupational Safety and Health Agency (KOSHA) is developing an Integrated Risk Management System (IRMS) to support the quantification and management of risk from chemical plants. The IRMS system includes the development of the methodology, software tools and database necessary for quantitative risk assessment, which are consequence analysis software, graphical display of results on a geometric map, reliability analysis software, a component reliability database, and equipment and hazardous material information databases. An overview of the IRMS will be presented in another paper, 'GIS-based IRMS.' The quantification of a risk consists of two major parts: one is a deterministic analysis, such as the consequence analysis of an explosion of flammable material, and the other is a probabilistic part, such as the frequency analysis of an explosion or a reliability analysis of the protection system. This paper describes the development work in the probabilistic part of the IRMS. (author)

  3. A Study on Performance Analysis Tools for Applications Running on Large Distributed Systems

    De Sarkar, Ajanta; Mukherjee, Nandini


    The evolution of distributed architectures and programming paradigms for performance-oriented program development challenges the state-of-the-art technology for performance tools. The area of high performance computing is rapidly expanding from single parallel systems to clusters and grids of heterogeneous sequential and parallel systems. Performance analysis and tuning of applications is becoming crucial because it is hardly possible to otherwise achieve the optimum performance of any applicati...

  4. Dynamic Contingency Analysis Tool


    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  5. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    Samareh, Jamshid A.


    Systems analysis of planetary entry (SAPE), descent, and landing (EDL) is inherently a multidisciplinary activity. SAPE improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.

  6. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    Samareh, Jamshid A.


    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language, a platform-independent, open-source language, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.

  7. Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool

    Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark


    A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) was developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to follow guidance commands as closely as possible while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with the required design specifications. "Tower Clearance" and "Load Relief" designs have been achieved for the liftoff and maximum-dynamic-pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in frequency-domain Monte Carlo analysis using ASAT.

  8. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Bardina, Jorge


    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  9. A new tool for the performance analysis of massively parallel computer systems

    Stefanek, Anton; Bradley, Jeremy; 10.4204/EPTCS.28.11


    We present a new tool, GPA, that can generate key performance measures for very large systems. Based on solving systems of ordinary differential equations (ODEs), this method of performance analysis is far more scalable than stochastic simulation. The GPA tool is the first to produce higher moment analysis from differential equation approximation, which is essential, in many cases, to obtain an accurate performance prediction. We identify so-called switch points as the source of error in the ODE approximation. We investigate the switch point behaviour in several large models and observe that as the scale of the model is increased, in general the ODE performance prediction improves in accuracy. In the case of the variance measure, we are able to justify theoretically that in the limit of model scale, the ODE approximation can be expected to tend to the actual variance of the model.
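
    The flavor of this moment-ODE approach can be shown on a toy population model (not GPA's input language): N independent components flip off-to-on at rate a and on-to-off at rate b. Because the rates are linear, the first- and second-moment ODEs close exactly and must converge to the Binomial stationary mean and variance:

```python
# Illustrative ODE-based moment analysis in the spirit of GPA: Euler-
# integrate the closed mean (E) and variance (V) equations for the number
# of "on" components and compare against the exact Binomial limits.

def moment_odes(N, a, b, t_end=50.0, dt=1e-3):
    """Euler-integrate dE/dt = a(N-E) - bE and the matching variance ODE."""
    E, V = 0.0, 0.0  # all components start off, deterministically
    for _ in range(int(t_end / dt)):
        dE = a * (N - E) - b * E
        dV = -2.0 * (a + b) * V + a * (N - E) + b * E
        E += dE * dt
        V += dV * dt
    return E, V

N, a, b = 1000, 0.3, 0.7
E, V = moment_odes(N, a, b)
p = a / (a + b)  # stationary on-probability of a single component
# E tends to the Binomial mean N*p = 300 and V to N*p*(1-p) = 210.
```

    For nonlinear rates the moment equations no longer close, which is where approximations, and the switch-point error discussed in the abstract, enter.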

  10. Thermo-energetic Analysis of the Fluid Systems in Cutting Machine Tools

    Weber, Juliane; Lohse, Harald; Weber, Jürgen


    Controlling the thermo-elastic behavior of machine tools can only be achieved by systematic analysis, characterization and design of their fluidic systems. In the first stage of this project, fundamental work was done to develop simulation methods for calculating the thermodynamic behavior of a representative milling machine and each of its components. With experimental and numerical data it was proven that significant improvement can be achieved by a proper design of h...

  11. Failure Environment Analysis Tool (FEAT)

    Lawler, D. G.


    Information is given in viewgraph form on the Failure Environment Analysis Tool (FEAT), a tool designed to demonstrate advanced modeling and analysis techniques to better understand and capture the flow of failures within and between elements of the Space Station Freedom (SSF) and other large complex systems. Topics covered include objectives, development background, the technical approach, SSF baseline integration, and FEAT growth and evolution.

  12. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.


    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  13. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    Stecki, J. S.; Conrad, Finn; Oh, B.


    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure modes and effects analysis (FMEA) of hydraulic systems. The paper explains the underlying...

  14. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.


    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how best to store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
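
    One hedged sketch of the kind of footprint indexing being compared: bucket each swath's bounding box into coarse lat/lon grid cells so a point query inspects one bucket instead of scanning every footprint (the schema, 10-degree cell size, and file names are illustrative, not the TCIS implementation):

```python
# Toy grid index for swath bounding boxes: insert each footprint into
# every 10-degree cell its box overlaps, then answer point queries by
# filtering only the candidates in the query point's cell.
from collections import defaultdict

CELL = 10.0  # grid cell size in degrees (an illustrative choice)

def cells_for_bbox(lon_min, lat_min, lon_max, lat_max):
    c0, c1 = int(lon_min // CELL), int(lon_max // CELL)
    r0, r1 = int(lat_min // CELL), int(lat_max // CELL)
    return [(c, r) for c in range(c0, c1 + 1) for r in range(r0, r1 + 1)]

class FootprintIndex:
    def __init__(self):
        self.buckets = defaultdict(list)

    def insert(self, file_id, bbox):
        for cell in cells_for_bbox(*bbox):
            self.buckets[cell].append((file_id, bbox))

    def query_point(self, lon, lat):
        cell = (int(lon // CELL), int(lat // CELL))
        return sorted({fid for fid, (x0, y0, x1, y1) in self.buckets[cell]
                       if x0 <= lon <= x1 and y0 <= lat <= y1})

idx = FootprintIndex()
idx.insert("trmm_orbit_1", (-80.0, 10.0, -60.0, 25.0))    # Caribbean swath
idx.insert("quikscat_rev_7", (-75.0, 15.0, -40.0, 30.0))  # Atlantic swath
hits = idx.query_point(-70.0, 20.0)  # a point inside both boxes
```

    Real swath footprints are curved polygons, so a production system refines the bounding-box candidates with an exact polygon test, which is the step the PostGIS-style geospatial extensions accelerate.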

  15. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.


    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  16. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    Csank, Jeffrey T.; Zinnecker, Alicia M.


    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
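
    A minimal closed-loop sketch of the idea, assuming a first-order engine model, a PI thrust controller, and a hard command limiter standing in for surge protection (illustrative dynamics and gains, not the NASA tool):

```python
# Closed-loop transient sketch in the spirit of TTECTrA: a PI controller
# regulates thrust of a first-order "engine", and a fuel-flow limiter
# (wf_max) emulates a protection limit. All dynamics are hypothetical.

def simulate(thrust_setpoint, wf_max, t_end=10.0, dt=0.01,
             kp=0.8, ki=2.0, tau=0.8, gain=5.0):
    """Euler-simulate PI thrust control with a hard fuel-flow limiter."""
    thrust, integ = 0.0, 0.0
    history = []
    for _ in range(int(t_end / dt)):
        err = thrust_setpoint - thrust
        integ += err * dt
        wf = kp * err + ki * integ               # PI fuel-flow command
        wf_limited = min(max(wf, 0.0), wf_max)   # transient (protection) limiter
        if wf != wf_limited:
            integ -= err * dt                    # simple anti-windup
        thrust += dt * (gain * wf_limited - thrust) / tau
        history.append(thrust)
    return history

unconstrained = simulate(thrust_setpoint=100.0, wf_max=1e9)
limited = simulate(thrust_setpoint=100.0, wf_max=15.0)
```

    Comparing the two runs exposes exactly the performance-versus-operability trade-off described above: the limiter protects the engine but caps achievable steady-state thrust at gain * wf_max.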

  17. Pickering tool management system

    Tools were being deployed in the station with no process in effect to ensure that they were maintained in good repair so as to effectively support the performance of maintenance activities. Today's legal requirements specify that all employers must have a process in place to ensure that tools are maintained in a safe condition; this is specified in the Ontario Health and Safety Act. The Pickering Tool Management System has been chosen as the process at Pickering N.D. to manage tools. Tools are identified by number etching and bar codes. The system is a Windows application installed on several file servers.

  18. Performance Analysis using CPN Tools

    Wells, Lisa Marie


    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting...

  19. FURAX: assistance tools for the qualitative and quantitative analysis of systems reliability

    FURAX is a set of tools for the qualitative and quantitative safety analysis of system operation. It is particularly well suited to the study of networks (fluid, electrical...), i.e. systems in which functional importance attaches to a flow. The analysis is based on modelling that privileges these flows (a skeleton representation of the system for a network, a functional diagram for a non-single-flux system) and on the representation of component support systems. The qualitative analyses are based on the search for possible flow paths and on knowledge of the technical domain. The results obtained correspond to a simplified failure-mode analysis, to fault trees relative to the events specified by the user, and to minimal cut sets. The calculations possible on these models are: fault-tree calculations, Markov-diagram calculations of system reliability, and probabilistic calculation of a cut set viewed as a tree, as an ordered sequence of failures, or as the absorbing state of a Markov diagram. (J.S.). 6 refs
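
    The quantitative step on minimal cut sets can be illustrated in a few lines, assuming independent basic events (a generic fault-tree calculation, not FURAX itself; the event names and probabilities are hypothetical):

```python
# Top-event probability from minimal cut sets: exact inclusion-exclusion
# over the union of the cut sets, plus the common rare-event upper bound.
from itertools import combinations

def cutset_probability(events, p):
    """Probability that every basic event in the set occurs (independence)."""
    prob = 1.0
    for event in events:
        prob *= p[event]
    return prob

def top_event_probability(cutsets, p):
    """Exact probability of the union of the minimal cut sets."""
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, k):
            union = set().union(*combo)
            total += (-1) ** (k + 1) * cutset_probability(union, p)
    return total

p = {"pump_A": 0.01, "pump_B": 0.02, "valve": 0.005}
cutsets = [{"pump_A", "pump_B"}, {"valve"}]   # hypothetical minimal cut sets
exact = top_event_probability(cutsets, p)                      # 0.005199
rare_event = sum(cutset_probability(c, p) for c in cutsets)    # bound 0.0052
```

    The rare-event sum is the cheap upper bound typically quoted; the inclusion-exclusion value is exact but grows combinatorially with the number of cut sets.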

  20. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]


    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access to, and analysis of, simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  1. Tav4SB: integrating tools for analysis of kinetic models of biological systems

    Rybiński Mikołaj


    Background: Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in biology. Results: The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project, we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page. Conclusions: The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.

  2. Physics analysis tools

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages, which are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool are analyzed.

  3. Systems Analysis – a new paradigm and decision support tools for the water framework directive

    M. Bruen


    In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness.
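
    As a concrete illustration of the AHP step mentioned above, criterion weights can be derived from a pairwise-comparison matrix with the geometric-mean approximation to the principal eigenvector (the three criteria and the judgments below are hypothetical):

```python
# Geometric-mean AHP weights: for each row of the pairwise-comparison
# matrix take the geometric mean, then normalize to sum to one.

def ahp_weights(matrix):
    n = len(matrix)
    geo = []
    for row in matrix:
        prod = 1.0
        for v in row:
            prod *= v
        geo.append(prod ** (1.0 / n))
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical comparison of three water-management criteria on Saaty's
# 1-9 scale: cost vs ecology vs public acceptance.
A = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
w = ahp_weights(A)  # cost dominates, public acceptance weighs least
```

    A full AHP application would also check the consistency ratio of the matrix before trusting the weights.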

  5. SICOMAT : a system for SImulation and COntrol analysis of MAchine Tools

    Gautier, Maxime; Pham, Minh Tu; Khalil, Wisama; Lemoine, Philippe; Poignet, Philippe


    This paper presents a software package for the simulation and control analysis of machine tool axes. The package, called SICOMAT (SImulation and COntrol analysis of MAchine Tools), provides a large variety of toolboxes to analyze the behavior and the control of the machine. The software takes into account several elements, such as the flexibility of bodies, the interaction between several axes, and the effect of numerical control, and provides the ability to reduce models.

  6. Hurricane Data Analysis Tool

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory


    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL:), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data, beginning in February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  7. Design and analysis of a reconfigurable discrete pin tooling system for molding of three-dimensional free-form objects

    Koç, Bahattin; Thangaswamy, Sridhar


    This paper presents the design and analysis of a new reconfigurable tooling system for the fabrication of three-dimensional (3D) free-form objects. The proposed reconfigurable tooling system comprises a set of matrices of closely stacked discrete elements (i.e., pins) arranged to form a cavity in which a free-form object can be molded. By reconfiguring the pins, a single tool can be used in place of multiple tools to produce different parts with much less time and cost. Th...

  8. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei


    Since 1994 the Data Centre of the Spanish Oceanographic Institute has developed systems for archiving and quality control of oceanographic data. The work started in the frame of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean Data Centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now includes not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. New powerful routines for analysis and for graphic visualization were added. Data presented originally in ASCII format were recently organized in an open-source MySQL database. Nowadays the IEO, as part of the SeaDataNet Infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS Format and Quality Control - QCDAMAR: Quality Control of Marine Data, the main set of tools for working with data presented as text files. It includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values, ...) and input/output filters. - QCMareas: a set of procedures for the quality control of tide gauge data according to the standard international Sea Level Observing System. These procedures include checking for unexpected anomalies in the time series, interpolation, filtering, computation of basic statistics and residuals. 2. DAMAR: A relational database (MySQL) designed to

  9. Analysis tools for simulation of hybrid systems; Herramientas de analisis para simulacion de sistemas hibridos

    Guillen S, Omar; Mejia N, Fortino [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)


    In order to facilitate and simplify the development and analysis of a hybrid system with regard to its design, construction, operation and maintenance, it is best to carry out the simulation of the system by means of software, with which a significant reduction in investment costs is obtained. Given the mix of electrical generation technologies involved in a hybrid system, it is very important to have a tool integrated with specialized calculation packages (software) that allows carrying out the simulation tasks of the operational functioning of these systems. In addition, one must not fail to consider the operating characteristics, the facilities offered to the user, the clarity of the obtained results and the possibility of their validation against prototypes deployed in the field. Equally, it is necessary to consider the identification of the tasks involved in relation to the place of installation of this electrification technology. At the moment, hybrid systems technology is still in a development stage at the international level, and important limitations exist as far as the availability of methodology and engineering tools for the optimum design of these systems. This paper is intended to contribute to the advance of the technology and to provide tools to solve the described series of problems. This article describes the activities that have the most impact on the design and development of hybrid systems, as well as the identification of variables, basic characteristics and the form of validation of tools in the integration of a methodology for the simulation of these systems, facilitating their design and development.

  10. Experimental Analysis of Browser based Novel Anti-Phishing System Tool at Educational Level

    Rajendra Gupta


    In a phishing attack, the user submits confidential information to a mimic website and suffers financial harm, so the user should be informed immediately about the type of website being visited. According to the Third Quarter Phishing Activity Trends Report, 55,282 new phishing websites were detected in the month of July 2014. To address the phishing problem, a browser-based add-on system may be one of the best solutions for making the user aware of the website type. In this paper, a novel browser-based add-on system is proposed and its performance is compared with existing anti-phishing tools. The proposed anti-phishing tool, 'ePhish', is compared with existing browser-based anti-phishing toolbars. All the anti-phishing tools were installed on computer systems at an autonomous college to check their performance. The obtained results show that if the task is divided among a group of systems, better results can be achieved. Across different phishing features, the add-on system tool shows around 97 percent successful results under different case conditions. The current study would be very helpful in countering phishing attacks, and the proposed system is able to protect the user from them. Since the system tool is capable of handling and managing phishing website details, it is also helpful for identifying the category of a website.
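
    The paper does not publish ePhish's internals, but toolbar-style URL heuristics of the kind it is compared against can be sketched as a simple lexical feature score (the feature set and thresholds below are illustrative assumptions):

```python
# Hypothetical anti-phishing URL scorer: count how many common lexical
# red flags a URL triggers and return the fraction as a 0..1 risk score.
import re
from urllib.parse import urlparse

def phishing_score(url):
    """Return a 0..1 heuristic risk score; features are illustrative."""
    host = urlparse(url).hostname or ""
    checks = [
        re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host) is not None,  # raw IP host
        host.count(".") >= 4,                 # many subdomain levels
        "@" in url,                           # userinfo obfuscation trick
        "-" in host,                          # hyphenated look-alike domain
        len(url) > 75,                        # unusually long URL
        not url.lower().startswith("https"),  # no TLS
    ]
    return sum(checks) / len(checks)

safe = phishing_score("https://www.example.com/login")
risky = phishing_score("http://192.168.10.5/secure-bank/login?acct=@verify")
```

    Real toolbars combine such lexical features with blacklists and page-content analysis, which is what the paper's comparative evaluation measures.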

  11. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix


    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
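
    The core idea, propagating component tolerances through a circuit model by Monte Carlo to estimate design margin, can be sketched on a toy voltage divider (illustrative circuit, spec limits, and uniform-tolerance assumption; not Sandia's toolset):

```python
# Monte Carlo variability analysis of a resistive divider: sample 5 %
# component tolerances, propagate through the circuit model, and report
# the fraction of units falling outside an (illustrative) spec window.
import random

def divider_output(vin, r1, r2):
    """Ideal resistive divider output voltage."""
    return vin * r2 / (r1 + r2)

def monte_carlo_margin(n=20000, seed=42, tol=0.05, spec_low=2.45, spec_high=2.55):
    """Fraction of sampled units whose output falls outside the spec window."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        # 5 % component tolerance, modeled as uniform (an assumption)
        r1 = 1000.0 * (1.0 + rng.uniform(-tol, tol))
        r2 = 1000.0 * (1.0 + rng.uniform(-tol, tol))
        if not (spec_low <= divider_output(5.0, r1, r2) <= spec_high):
            failures += 1
    return failures / n

yield_loss = monte_carlo_margin()  # roughly a third of units fail this tight spec
```

    Replacing the closed-form divider with calls into a circuit simulator gives exactly the simulation-based variability analysis the abstract describes.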

  12. Plato (power load analysis tool) - a module of west wall monitoring system

    The mandate of the WEST (W Environment for Steady-state Tokamak) project is to upgrade the medium-sized superconducting tokamak Tore Supra on a major scale. One of its objectives is also to act as a test-bed for ITER divertor components, to be procured and used in ITER. WEST would be installing actively cooled tungsten divertor elements, like the ones to be used in ITER. These components would be tested under two experimental scenarios: high power (Ip = 0.8 MA, lasting 30s with 15 MW injected power) and high fluence (Ip = 0.6 MA, lasting 1000s with 12 MW injected power). Heat load on the divertor target will range from a few MW/m2 up to 20 MW/m2 depending on the X point location and the heat flux decay length. The tungsten Plasma Facing Components (PFCs) are less tolerant to overheating than their carbon counterparts, and prevention of their burnout is a major concern. It is in this context that the Wall Monitoring System (WMS) - a software framework aimed at monitoring the health of the wall components - was conceived. WMS has been divided into three parts: a) a pre-discharge power load analysis tool to check compatibility between plasma scenario and the PFCs' operational limits in terms of heat flux b) a real-time system during discharge, to take into account all necessary measurements involved in the PFCs' protection c) a set of post-discharge analysis tools that would access the WEST database and compare predicted and experimental results. This paper presents an overview of PLATo - the pre-pulse module of WMS that has been recently developed under IPR-IRFM research collaboration. PLATo has two major components - one that produces heat flux information for the PFCs and the other that produces energy graphs depending on a shot profile defined by time-varying magnetic equilibrium and injected power profiles. Preliminary results will be presented based on foreseen WEST plasma reference scenarios. (author)
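The quoted heat-load range depends strongly on the heat-flux decay length mentioned above. A common minimal model is an exponential fall-off of the target heat flux away from the strike point; the peak value and decay length below are illustrative, not WEST design values.

```python
# Exponential target heat-flux profile: q(s) = q0 * exp(-s / lambda_q),
# with distance s from the strike point and decay length lambda_q.
# q0 = 20 MW/m2 and lambda_q = 5 mm are assumed numbers for the sketch.
import math

def target_heat_flux(s_mm, q0_mw_m2, lambda_q_mm):
    """Heat flux on the target at distance s from the strike point."""
    return q0_mw_m2 * math.exp(-s_mm / lambda_q_mm)

q0, lam = 20.0, 5.0
for s in (0.0, 5.0, 15.0):
    print(f"s = {s:4.1f} mm -> q = {target_heat_flux(s, q0, lam):5.2f} MW/m2")
```

A pre-pulse checker like PLATo can compare such a predicted profile against each component's operational heat-flux limit before the shot is allowed.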

  13. A Conceptual Wing Flutter Analysis Tool for Systems Analysis and Parametric Design Study

    Mukhopadhyay, Vivek


    An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate flutter instability boundaries of a typical wing, when detailed structural and aerodynamic data are not available. Effects of change in key flutter parameters can also be estimated in order to guide the conceptual design. This user-friendly software was developed using MathCad and Matlab codes. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on wing torsion stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch-inertia radius of gyration. These parametric plots were compiled in a Chance-Vought Corporation report from a database of past experiments and wind tunnel test results. An example was presented for conceptual flutter analysis of the outer wing of a Blended-Wing-Body aircraft.
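The parametric plots described above work in non-dimensional flutter parameters. One widely used grouping of this kind is the flutter speed index, V* = V / (b · ω_α · √μ), with semichord b, torsional frequency ω_α and mass ratio μ; note this is a related standard quantity, not necessarily the exact Regier or Flutter number definition used in the Chance-Vought report, and the numbers below are invented.

```python
# Flutter speed index: a standard non-dimensional flutter parameter.
# Input values are illustrative, not from any wind-tunnel database.
import math

def flutter_speed_index(v, semichord, omega_alpha, mass_ratio):
    """V* = V / (b * omega_alpha * sqrt(mu))."""
    return v / (semichord * omega_alpha * math.sqrt(mass_ratio))

v_star = flutter_speed_index(v=250.0,          # airspeed, m/s (assumed)
                             semichord=1.2,    # b, m (assumed)
                             omega_alpha=40.0, # torsion frequency, rad/s (assumed)
                             mass_ratio=30.0)  # mu (assumed)
print(f"flutter speed index = {v_star:.2f}")
```

Plotting such an index against, e.g., mass ratio for a family of wings is what allows a conceptual-design tool to read off approximate instability boundaries without detailed models.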

  14. Exploration tools in formal concept analysis

    Stumme, Gerd


    The development of conceptual knowledge systems specifically requires knowledge acquisition tools within the framework of formal concept analysis. In this paper, the existing tools are presented, and further developments are discussed.

  15. Avionics System Architecture Tool

    Chau, Savio; Hall, Ronald; Traylor, Marcus; Whitfield, Adrian


    Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, providing a complement of dynamic system modeling tools, including a graphical user interface (GUI), model-checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.

  16. Frequency Response Analysis Tool

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.


    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
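The frequency response measure the tool computes for each event can be sketched as the MW response normalized per 0.1 Hz of frequency deviation between a pre-event point (A) and a settled post-event point (B), in the spirit of BAL-003-1. The simple two-point form, the sign convention (positive means the area supported frequency), and the event values below are illustrative; selecting A/B points from real PMU or SCADA traces is more involved.

```python
# Two-point frequency response measure, in MW per 0.1 Hz.
# Point selection and sign conventions are simplified assumptions.
def frequency_response_mw_per_0p1hz(p_a_mw, p_b_mw, f_a_hz, f_b_hz):
    delta_p = p_b_mw - p_a_mw           # change in net interchange/output, MW
    delta_f = f_b_hz - f_a_hz           # frequency change, Hz (negative for a loss)
    return -delta_p / (delta_f * 10.0)  # MW per 0.1 Hz

# Generation-loss event: frequency sags 60.00 -> 59.95 Hz while the
# balancing area picks up 80 MW of response.
print(frequency_response_mw_per_0p1hz(0.0, 80.0, 60.00, 59.95))
```

Aggregating such per-event values over the database of under-frequency events is what yields the frequency response baseline mentioned above.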

  17. Neutron multiplicity analysis tool

    Stewart, Scott L [Los Alamos National Laboratory]


    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This

  18. Social Data Analysis Tool

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel;


    As governments, citizens and organizations have moved online there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks...

  19. Pattern entropy - a tool for nonlinear dynamics analysis of a biological nonstationary system: the human heart

    Tools for a nonlinear analysis of the dynamics of the rhythm of the human heart are discussed. Three-dimensional images in phase space are formed by means of the Takens trajectory reconstruction method from 24-h sequences of time intervals between heart beats (RR intervals). Best projections of these images are sought, and a surprisingly high symmetry is found for some types of pathology. The effect of filtering of arrhythmias on the symmetry is demonstrated. Images of RR intervals are also made in time windows of 100-400 beats, and examples of such images preceding cardiac death are given. A new quantitative tool for the analysis of the local-in-time degree of ordering of RR sequences - pattern entropy - is briefly discussed. (author)
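The Takens reconstruction step described above maps a scalar RR-interval sequence into points in a higher-dimensional phase space using delayed copies of the signal. A minimal three-dimensional version, with an assumed delay of one beat and made-up interval values:

```python
# Takens delay embedding of an RR-interval sequence into 3-D points
# (x_i, x_{i+d}, x_{i+2d}). The delay d = 1 beat is an assumption.
def takens_embed_3d(rr_intervals, delay=1):
    n = len(rr_intervals) - 2 * delay
    return [(rr_intervals[i], rr_intervals[i + delay], rr_intervals[i + 2 * delay])
            for i in range(n)]

rr = [0.80, 0.82, 0.79, 0.81, 0.83, 0.80]   # seconds between beats (invented)
points = takens_embed_3d(rr)
print(points[0])   # first reconstructed phase-space point
```

The paper's 24-h recordings yield tens of thousands of such points, whose projections are then inspected for symmetry.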

  20. Pattern entropy - a tool for nonlinear dynamics analysis of a biological nonstationary system: the human heart

    Zebrowski, J.J. [Warsaw Univ. of Technology, Inst. of Physics, Warsaw (Poland); Poplawska, W.; Baranowski, R. [National Inst. of Cardiology, Warsaw (Poland)


    Tools for a nonlinear analysis of the dynamics of the rhythm of the human heart are discussed. Three-dimensional images in phase space are formed by means of the Takens trajectory reconstruction method from 24-h sequences of time intervals between heart beats (RR intervals). Best projections of these images are sought, and a surprisingly high symmetry is found for some types of pathology. The effect of filtering of arrhythmias on the symmetry is demonstrated. Images of RR intervals are also made in time windows of 100-400 beats, and examples of such images preceding cardiac death are given. A new quantitative tool for the analysis of the local-in-time degree of ordering of RR sequences - pattern entropy - is briefly discussed. (author). 7 refs, 8 figs, 1 tab.

  1. A Collaborative Analysis Tool for Thermal Protection Systems for Single Stage to Orbit Launch Vehicles

    Alexander, Reginald; Stanley, Thomas Troy


    Presented is a design tool and process that connects several disciplines which are needed in the complex and integrated design of high-performance reusable single stage to orbit (SSTO) vehicles. Every system is linked to all other systems, as is the case with SSTO vehicles with air-breathing propulsion, which is currently being studied by the National Aeronautics and Space Administration (NASA). In particular, the thermal protection system (TPS) is linked directly to almost every major system. The propulsion system pushes the vehicle to velocities on the order of 15 times the speed of sound in the atmosphere before pulling up to go to orbit, which results in high temperatures on the external surfaces of the vehicle. Thermal protection systems must be able to mitigate the heat transfer to the structure, to maintain the structural integrity of the vehicle, and be lightweight. Herein lies the interdependency: as the vehicle's speed increases, the TPS requirements are increased, and as TPS masses increase, the effect on the propulsion system and all other systems is compounded. To adequately calculate the TPS mass of this type of vehicle, several engineering disciplines and analytical tools must be used, preferably in an environment in which data are easily transferred and multiple iterations are easily facilitated.

  2. Addressing the alarm analysis barrier - a tool for improving alarm systems

    This paper describes a software application tool for the initial specification and maintenance of the thousands of alarms in nuclear and other process control plants. The software program is used by system designers and maintainers to analyze, characterize, record and maintain the alarm information and configuration decisions for an alarm system. The tool provides a comprehensive design and information handling environment for: the existing alarm functions in current CANDU plants; the new alarm processing and presentation concepts developed under CANDU Owners Group (COG) sponsorship that are available to be applied to existing CANDU plants on a retrofit basis; and, the alarm functions to be implemented in new CANDU plants. (author). 3 refs., 1 fig

  3. Environmental control and life support system analysis tools for the Space Station era

    Blakely, R. L.; Rowell, L. F.


    This paper describes the concept of a developing emulation, simulation, sizing, and technology assessment program (ESSTAP) which can be used effectively for the various functional disciplines (structures, power, ECLSS, etc.) beginning with the initial system selection and conceptual design processes and continuing on through the mission operation and growth phases of the Space Station for the purpose of minimizing overall program costs. It will discuss the basic requirements for these tools, as currently envisioned for the Environmental Control and Life Support System (ECLSS), identifying their intended and potential uses and applications, and present examples and status of several representative tools. The development and application of a Space Station Atmospheric Revitalization Subsystem (ARS) demonstration model to be used for concept verification will also be discussed.

  4. Motion analysis systems as optimization training tools in combat sports and martial arts

    Ewa Polak


    Full Text Available Introduction: Over the past years, a few review papers about possibilities of using motion analysis systems in sport were published, but there are no articles that discuss this problem in the field of combat sports and martial arts. Aim: This study presents the diversity of contemporary motion analysis systems both, those that are used in scientific research, as well as those that can be applied in daily work of coaches and athletes in combat sports and martial arts. An additional aim is the indication of example applications in scientific research and range of applications in optimizing the training process. It presents a brief description of each type of systems that are currently used in sport, specific examples of systems and the main advantages and disadvantages of using them. The presentation and discussion takes place in the following sections: motion analysis utility for combat sports and martial arts, systems using digital video and systems using markers, sensors or transmitters. Conclusions: Not all types of motion analysis systems used in sport are suitable for combat sports and martial arts. Scientific studies conducted so far showed the usefulness of video-based, optical and electromechanical systems. The use of research results made with complex motion analysis systems, or made with simple systems, local application and immediate visualization is important for the preparation of training and its optimization. It may lead to technical and tactical improvement in athletes as well as the prevention of injuries in combat sports and martial arts.

  5. Energy-Sustainable Framework and Performance Analysis of Power Scheme for Operating Systems: A Tool

    G. Singh


    Full Text Available Recently, Information and Communications Technology (ICT) devices have become more user-friendly, which has raised the problem of power dissipation across the globe, and computer systems are among them. This emerging issue of power dissipation has imposed a very significant constraint on system and software design. The concept of ‘green computing’ is gaining popularity and is being considered one of the most promising technologies by designers in the Information Technology (IT) industry, as it demonstrates an environmentally responsible way to reduce power consumption and maximize energy efficiency. In this paper, we propose an energy-sustainable framework of power schemes for operating systems to reduce the power consumed by computer systems and present a Green Power tool (GP tool). This tool is designed using JAVA technology and requires minimal configuration to make decisions about reducing power consumption. The proposed Swift mode algorithm allows users to input a working time of their choice; after that time elapses, the algorithm starts detecting human activity on the computer system. We also compared the Swift mode algorithm with the existing power schemes in the operating system, showing up to 66% power saving. Finally, we profiled the proposed framework to analyze memory and Central Processing Unit (CPU) performance, which demonstrated that there is no memory leakage or CPU degradation problem and that the framework’s behavior remains constant under various memory and CPU overhead scenarios. The proposed framework requires 3–7 MB of memory space during its execution.
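The "work for a user-set time, then watch for inactivity" scheme described above can be sketched as a simple state chooser. The GP tool itself is written in Java; this Python sketch and its thresholds are illustrative only, not the Swift mode algorithm's actual logic.

```python
# Toy power-state policy: escalate from active display to display-off to
# sleep as the time since the last user input grows. Thresholds are assumed.
def choose_power_state(last_input_s, idle_sleep_after_s=300):
    """Return a power state given seconds since the last user input."""
    if last_input_s < 60:
        return "active"
    if last_input_s < idle_sleep_after_s:
        return "display-off"
    return "sleep"

for idle in (5, 120, 600):
    print(idle, "->", choose_power_state(idle))
```

A real implementation would hook OS input events rather than poll, which is where most of the claimed power saving comes from.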

  6. Tools for Embedded Computing Systems Software


    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  7. Atlas Distributed Analysis Tools

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich


    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale Data production using Grid flavors in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  8. ATLAS Distributed Analysis Tools

    Gonzalez de la Hoz, Santiago; Liko, Dietrich


    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale Data production using Grid flavors in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  9. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)


    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
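The simulation loop described above, executing time-ordered events until the queue empties, is the core of any discrete event simulator. A minimal sketch with an invented two-event "valve" model (not the patented tool's actual schema):

```python
# Core discrete-event loop: pop events in time order until the queue is
# empty; a handler may schedule further events (here, a mode transition
# with a 2-second delay). The model itself is an invented example.
import heapq

def simulate(initial_events):
    queue = list(initial_events)          # (time, event-name) pairs
    heapq.heapify(queue)
    log = []
    while queue:                          # run until the event queue empties
        t, name = heapq.heappop(queue)
        log.append((t, name))
        if name == "valve-open":          # transition schedules its delayed effect
            heapq.heappush(queue, (t + 2.0, "flow-started"))
    return log

print(simulate([(0.0, "valve-open"), (1.0, "sensor-read")]))
```

Continuous behavior is captured discretely exactly as the abstract says: invocation, effect, and a time delay between them.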

  10. Modeling and control system design and analysis tools for flexible structures

    Anissipour, Amir A.; Benson, Russell A.; Coleman, Edward E.


    Described here are Boeing software tools used for the development of control laws of flexible structures. The Boeing Company has developed a software tool called Modern Control Software Package (MPAC). MPAC provides the environment necessary for linear model development, analysis, and controller design for large models of flexible structures. There are two features of MPAC which are particularly appropriate for use with large models: (1) numerical accuracy and (2) label-driven nature. With the first feature, MPAC uses double precision arithmetic for all numerical operations and relies on EISPAC and LINPACK for the numerical foundation. With the second feature, all MPAC model inputs, outputs, and states are referenced by user-defined labels. This feature allows model modification while maintaining the same state, input, and output names. In addition, there is no need for the user to keep track of a model variable's matrix row and column locations. There is a wide range of model manipulation, analysis, and design features within the numerically robust and flexible environment provided by MPAC. Models can be built or modified using either state space or transfer function representations. Existing models can be combined via parallel, series, and feedback connections; and loops of a closed-loop model may be broken for analysis.
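The label-driven idea can be illustrated with a toy sketch (not MPAC code): signals are referenced by name, so two state-space blocks are cascaded by matching an output label to an input label instead of tracking matrix row and column positions. Scalar single-input, single-output blocks and a forward-Euler update keep the example short; MPAC itself handles full matrices with proper numerics.

```python
# Label-driven series connection of two scalar state-space blocks:
#   xdot = a*x + b*u,  y = c*x + d*u
# Blocks read and write a shared dict keyed by signal labels.
class Block:
    def __init__(self, a, b, c, d, in_label, out_label):
        self.a, self.b, self.c, self.d = a, b, c, d
        self.in_label, self.out_label = in_label, out_label
        self.x = 0.0

    def step(self, signals, dt=0.01):
        u = signals[self.in_label]
        self.x += dt * (self.a * self.x + self.b * u)   # forward-Euler update
        signals[self.out_label] = self.c * self.x + self.d * u

# Invented example: an actuator driving a structural mode, connected purely
# by the shared "deflection" label.
actuator = Block(-5.0, 5.0, 1.0, 0.0, "cmd", "deflection")
structure = Block(-1.0, 1.0, 1.0, 0.0, "deflection", "bending")
signals = {"cmd": 1.0, "deflection": 0.0, "bending": 0.0}
for _ in range(100):                      # simulate 1 s of the cascade
    actuator.step(signals)
    structure.step(signals)
print(round(signals["deflection"], 3), round(signals["bending"], 3))
```

Renaming a signal or swapping in a different actuator model touches only labels, never matrix indices, which is the maintainability benefit the abstract describes.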

  11. Bifurcation Tools for Flight Dynamics Analysis and Control System Design Project

    National Aeronautics and Space Administration — Modern bifurcation analysis methods have been proposed for investigating flight dynamics and control system design in highly nonlinear regimes and also for the...

  12. Integration of Systems Network (SysNet) tools for regional land use scenario analysis in Asia

    Roetter, R.P.; Hoanh, C.T.; Laborte, A.G.; Keulen, van H.; Ittersum, van M.K.; Dreiser, C.; Diepen, van C.A.; Ridder, de N.; Laar, van H.H.


    This paper introduces the approach of the Systems research Network (SysNet) for land use planning in tropical Asia with a focus on its main scientific-technical output: the development of the land use planning and analysis system (LUPAS) and its component models. These include crop simulation models

  13. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    Kinter, James L., III


    We propose to develop and enhance a workstation based grid analysis and display software system for Earth science dataset browsing, sampling and manipulation. The system will be coupled to a supercomputer in a distributed computing environment for near real-time interaction between scientists and computational results.

  14. The NEPLAN software package - a universal tool for electric power systems analysis

    Kahle, K


    The NEPLAN software package has been used by CERN's Electric Power Systems Group since 1997. The software is designed for the calculation of short-circuit currents, load flow, motor start, dynamic stability, harmonic analysis and harmonic filter design. This paper describes the main features of the software package and their application to CERN's electric power systems. The implemented models of CERN's power systems are described in detail. Particular focus is given to fault calculations, harmonic analysis and filter design. Based on this software package and the CERN power network model, several recommendations are given.
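One of the calculations listed, the short-circuit current, can be sketched back-of-the-envelope style via the IEC 60909 initial symmetrical three-phase short-circuit current, Ik'' = c·Un / (√3·|Zk|), with voltage factor c, nominal voltage Un and short-circuit impedance Zk. The network values below are invented, not CERN's.

```python
# IEC 60909-style initial symmetrical three-phase short-circuit current.
# c = voltage factor, un_kv = nominal line voltage in kV, zk_ohm = complex
# short-circuit impedance in ohms; result in kA. Values are illustrative.
import math

def short_circuit_current_ka(c, un_kv, zk_ohm):
    return c * un_kv / (math.sqrt(3) * abs(zk_ohm))

zk = complex(0.5, 4.0)   # assumed R + jX of the feeding network, ohms
print(f"Ik'' = {short_circuit_current_ka(1.1, 66.0, zk):.1f} kA")
```

A package like NEPLAN builds |Zk| at each fault location from the full network model, then layers on motor contributions, decay factors and asymmetry on top of this basic relation.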

  15. Java Radar Analysis Tool

    Zaczek, Mariusz P.


    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  16. Dynamic Hurricane Data Analysis Tool

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.


    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets, both observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom-in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.
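The per-parameter statistics listed above (mean, standard deviation, median, minimum, maximum) can be sketched directly with the standard library; the sample values stand in for a user-selected storm data subset.

```python
# Summary statistics for a user-selected parameter; the values are an
# invented stand-in for, e.g., a sea-surface temperature subset in deg C.
import statistics

values = [22.1, 24.3, 23.7, 25.0, 21.8]
summary = {
    "mean": statistics.mean(values),
    "stdev": statistics.stdev(values),   # sample standard deviation
    "median": statistics.median(values),
    "min": min(values),
    "max": max(values),
}
print(summary)
```

In the web tool these numbers are computed server-side over the spatial and date range the user selected, alongside the histogram and map products.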

  17. Collaborative Analysis Tool for Thermal Protection Systems for Single Stage to Orbit Launch Vehicles

    Alexander, Reginald Andrew; Stanley, Thomas Troy


    Presented is a design tool and process that connects several disciplines which are needed in the complex and integrated design of high-performance reusable single stage to orbit (SSTO) vehicles. Every system is linked to every other system, and in the case of SSTO vehicles with air-breathing propulsion, which is currently being studied by the National Aeronautics and Space Administration (NASA), the thermal protection system (TPS) is linked directly to almost every major system. The propulsion system pushes the vehicle to velocities on the order of 15 times the speed of sound in the atmosphere before pulling up to go to orbit, which results in high temperatures on the external surfaces of the vehicle. Thermal protection systems must be able to mitigate the heat transfer to the structure, to maintain the structural integrity of the vehicle, and be lightweight. Herein lies the interdependency: as the vehicle's speed increases, the TPS requirements are increased, and as TPS masses increase the effect on the propulsion system and all other systems is compounded. To adequately determine insulation masses for a vehicle such as the one described above, the aeroheating loads must be calculated and the TPS thicknesses must be calculated for the entire vehicle. To accomplish this, an ascent or reentry trajectory is obtained using the computer code Program to Optimize Simulated Trajectories (POST). The trajectory is then used to calculate the convective heat rates at several locations on the vehicle using the Miniature Version of the JA70 Aerodynamic Heating Computer Program (MINIVER). Once the heat rates are defined for each body point on the vehicle, the insulation thicknesses that are required to maintain the vehicle within structural limits are calculated using Systems Improved Numerical Differencing Analyzer (SINDA) models. 
If the TPS masses are too heavy for the performance of the vehicle the process may be repeated altering the trajectory or some other input to reduce
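A trajectory-coupled aeroheating estimate of the kind produced at each body point can be sketched with the Sutton-Graves stagnation-point correlation, q = k·√(ρ/rn)·V³. This is not MINIVER's actual method; the constant k (a commonly quoted Earth-entry value), its units, and the flight condition below are assumptions to verify against a reference.

```python
# Sutton-Graves-style stagnation-point convective heating estimate.
# rho in kg/m^3, nose radius rn in m, velocity in m/s; k and its units are
# assumed (commonly quoted Earth-air value), result taken as W/m^2.
import math

def stagnation_heat_rate_w_m2(rho, rn, v):
    k_earth = 1.7415e-4   # assumed Sutton-Graves constant for Earth air
    return k_earth * math.sqrt(rho / rn) * v ** 3

# Roughly Mach 15 at high altitude: thin air, 0.5 m nose radius (assumed).
q = stagnation_heat_rate_w_m2(1e-4, 0.5, 4500.0)
print(f"q = {q / 1e4:.1f} W/cm^2")
```

Evaluating such a correlation along the POST trajectory gives the heat-rate history that the SINDA models then turn into required insulation thicknesses.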

  18. Application of Diagnostic Analysis Tools to the Ares I Thrust Vector Control System

    Maul, William A.; Melcher, Kevin J.; Chicatelli, Amy K.; Johnson, Stephen B.


    The NASA Ares I Crew Launch Vehicle is being designed to support missions to the International Space Station (ISS), to the Moon, and beyond. The Ares I is undergoing design and development utilizing commercial-off-the-shelf tools and hardware when applicable, along with cutting-edge launch technologies and state-of-the-art design and development. In support of the vehicle's design and development, the Ares Functional Fault Analysis group was tasked to develop an Ares Vehicle Diagnostic Model (AVDM) and to demonstrate the capability of that model to support failure-related analyses and design integration. One important component of the AVDM is the Upper Stage (US) Thrust Vector Control (TVC) diagnostic model - a representation of the failure space of the US TVC subsystem. This paper first presents an overview of the AVDM, its development approach, and the software used to implement the model and conduct diagnostic analysis. It then uses the US TVC diagnostic model to illustrate details of the development, implementation, analysis, and verification processes. Finally, the paper describes how the AVDM model can impact both design and ground operations, and how some of these impacts are being realized during discussions of US TVC diagnostic analyses with US TVC designers.

  19. MOSES - A modelling tool for the analysis of scenarios of the European electricity supply system

    Weitemeyer, S.; Feck, T.; Agert, C.


    Recent studies have shown that a transition of the current power supply system in Europe to a system almost entirely based on fluctuating Renewable Energy Sources (RES) by mid-century is possible. However, most of these scenarios require a significant amount of back-up power capacities to ensure the security of electricity supply. This would imply high additional investments and operating costs. Hence, alternative options should be investigated first. Here we present a first outlook of our simulation model MOSES which will be able to analyse different target states of the European electricity system in 2050. In this model long-term meteorological data series are used to optimise the capacity mix of RES in Europe. One of the main elements of our tool is a simplified electricity network. In addition, alternative options for reduction of additional back-up power like the expansion of the transmission grid, the use of demand-side management and/or the installation of over-capacities will be implemented. The results will be used to provide scientifically proven recommendations to policy makers for a reliable energy supply system in Europe based on Renewable Energy Sources.

  20. Extending a teleradiology system by tools for 3D-visualization and volumetric analysis through a plug-in mechanism.

    Evers, H; Mayer, A; Engelmann, U; Schröter, A; Baur, U; Wolsiffer, K; Meinzer, H P


This paper describes ongoing research concerning interactive volume visualization coupled with tools for volumetric analysis. To establish an easy-to-use application, the 3D-visualization has been embedded in a state-of-the-art teleradiology system, where additional functionality beyond basic image transfer and management is often desired. The tools cover the major clinical requirements for deriving spatial measures, in order to realize extended diagnosis support and therapy planning. By introducing a general plug-in mechanism, this work describes, by way of example, the extension of a proven application. Interactive visualization was achieved by a hybrid approach taking advantage of both the precise volume visualization based on the Heidelberg Raytracing Model and the graphics acceleration of modern workstations. Several tools for volumetric analysis extend the 3D-viewing. They offer 3D-pointing devices to select locations in the data volume, measure anatomical structures or control segmentation processes. A haptic interface provides a realistic perception while navigating within the 3D-reconstruction. The work is closely related to research in the field of heart, liver and head surgery. In cooperation with our medical partners, the development of tools such as these advances the integration of image analysis into clinical routine. PMID:10384617

  1. Safety analysis and review system: a Department of Energy safety assurance tool

The concept of the Safety Analysis and Review System is not new. It has been used within the Department and its predecessor agencies, the Atomic Energy Commission (AEC) and the Energy Research and Development Administration (ERDA), for over 20 years. To minimize the risks from nuclear reactors and power plants, the AEC developed a process to support management authorization of each operation through identification and analysis of potential hazards and the measures taken to control them. As the agency evolved from AEC through ERDA to the Department of Energy, its responsibilities were broadened to cover a diversity of technologies, including those associated with the development of fossil, solar, and geothermal energy. Because the safety analysis process had proved effective in a technology of high potential hazard, the Department investigated the applicability of the process to the other technologies. This paper describes the system and discusses how it is implemented within the Department.

  2. Tool calibration system for micromachining system

    Miller, Donald M.


A tool calibration system including a tool calibration fixture and a tool height and offset calibration insert for calibrating the position of a tool bit in a micromachining tool system. The tool calibration fixture comprises a yoke-like structure having a triangular head, a cavity in the triangular head, and a port which communicates a side of the triangular head with the cavity. Yoke arms integral with the triangular head extend along each side of a tool bar and a tool head of the micromachining tool system. The yoke arms are secured to the tool bar to place the cavity around a tool bit which may be mounted to the end of the tool head. Three linear variable differential transformers (LVDTs) are adjustably mounted in the triangular head along an X axis, a Y axis, and a Z axis. The calibration insert comprises a main base which can be mounted in the tool head of the micromachining tool system in place of a tool holder, and a reference projection extending from a front surface of the main base. Reference surfaces of the calibration insert and a reference surface on a tool bar of standard length are used to set the three LVDTs of the calibration fixture to the tool reference position. These positions are transferred permanently to a mastering station. The tool calibration fixture is then used to transfer the tool reference position of the mastering station to the tool bit.

  3. Trigonometric regressive spectral analysis: an innovative tool for evaluating the autonomic nervous system.

    Ziemssen, Tjalf; Reimann, Manja; Gasch, Julia; Rüdiger, Heinz


Biological rhythms, describing the temporal variation of biological processes, are a characteristic feature of complex systems. The analysis of biological rhythms can provide important insights into the pathophysiology of different diseases, especially in cardiovascular medicine. In the field of the autonomic nervous system, heart rate variability (HRV) and baroreflex sensitivity (BRS) describe important fluctuations of blood pressure and heart rate which are often analyzed by Fourier transformation. However, these parameters are stochastic with overlying rhythmical structures, and R-R intervals, as independent variables of time, are not equidistant. That is why the trigonometric regressive spectral (TRS) analysis, reviewed in this paper, was introduced: it considers both the statistical and rhythmical features of such time series. The data segments required for TRS analysis can be as short as 20 s, allowing for dynamic evaluation of heart rate and blood pressure interaction over longer periods. Beyond HRV, TRS also estimates BRS based on linear regression analyses of coherent heart rate and blood pressure oscillations. An additional advantage is that all oscillations are analyzed by the same (maximal) number of R-R intervals, thereby providing a high number of individual BRS values. This ensures a high confidence level of BRS determination which, along with short recording periods, may be of profound clinical relevance. The dynamic assessment of heart rate and blood pressure spectra by TRS allows a more precise evaluation of cardiovascular modulation under different settings, as has already been demonstrated in different clinical studies. PMID:23812502
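The central trick of TRS, fitting a trigonometric regression directly to non-equidistantly sampled R-R data by least squares, can be sketched as follows. This is a minimal illustration on synthetic data; the test signal, trial frequency, and function names are hypothetical, not the authors' implementation:

```python
import numpy as np

def trig_regression(t, x, freq):
    """Least-squares fit of x(t) ~ a*cos(wt) + b*sin(wt) + c on
    non-equidistant sample times t -- the key idea behind TRS, which
    avoids the equidistant-sampling requirement of the FFT."""
    w = 2 * np.pi * freq
    A = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    a, b, c = coef
    amplitude = np.hypot(a, b)   # oscillation amplitude at this frequency
    return amplitude, coef

# Synthetic beat times (non-equidistant) carrying a 0.1 Hz LF oscillation
rng = np.random.default_rng(0)
t = np.cumsum(rng.uniform(0.7, 1.1, 200))      # beat times, seconds
x = 0.05 * np.sin(2 * np.pi * 0.1 * t) + 0.9   # R-R series, seconds
amp, _ = trig_regression(t, x, 0.1)            # recovers amplitude 0.05
```

Scanning `trig_regression` over a grid of trial frequencies and keeping the significant oscillations would mimic the spectral-estimation step; the published method adds regression statistics and BRS estimation on top.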

  4. The revised NEUROGES-ELAN system: An objective and reliable interdisciplinary analysis tool for nonverbal behavior and gesture.

    Lausberg, Hedda; Sloetjes, Han


    As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods. PMID:26428913

  5. State-of-the-art Tools and Techniques for Quantitative Modeling and Analysis of Embedded Systems

    Bozga, Marius; David, Alexandre; Hartmanns, Arnd;


This paper surveys well-established and recent tools and techniques developed for the design of rigorous embedded systems. We first survey UPPAAL and MODEST, two tools capable of dealing with both timed and stochastic aspects. Then, we overview the BIP framework for modular design...

  6. On the use of financial analysis tools for the study of Dst time series in the frame of complex systems

    Potirakis, Stelios M; Balasis, Georgios; Eftaxias, Konstantinos


Technical analysis is considered the oldest, and currently omnipresent, method for financial market analysis, which uses past prices with the aim of short-term forecasting of future prices. In the frame of complex systems, methods used to quantitatively analyze specific dynamic phenomena are often used to analyze phenomena from other disciplines on the grounds that they are governed by similar dynamics. An interesting task is the forecast of a magnetic storm. The hourly Dst is used as a global index for the monitoring of Earth's magnetosphere, which could be either in a quiet (normal) or in a magnetic storm (pathological) state. This work is the first attempt to apply technical analysis tools to Dst time series, aiming at the identification of indications which could be used for the study of the temporal evolution of Earth's magnetosphere state. We focus on the analysis of Dst time series around the occurrence of magnetic storms, discussing the possible use of the resulting information in the frame of multidisciplina...
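To make the borrowed-from-finance idea concrete, here is a generic moving-average crossover, one of the oldest technical-analysis indicators, applied to a synthetic Dst-like series. The window lengths and storm profile are illustrative assumptions, not the specific indicators evaluated in the paper:

```python
import numpy as np

def moving_average(x, window):
    """Trailing simple moving average, a basic technical-analysis indicator."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

# Synthetic hourly Dst-like series (nT): quiet, storm main phase, recovery
dst = np.concatenate([np.zeros(48),
                      np.linspace(0.0, -150.0, 24),
                      np.linspace(-150.0, -30.0, 72)])
fast = moving_average(dst, 6)    # reacts quickly to the main-phase drop
slow = moving_average(dst, 24)   # tracks the slowly varying background
# Align both averages on their common end hours, then flag the first hour
# at which the fast average crosses below the slow one (storm-onset proxy)
fast_aligned = fast[len(fast) - len(slow):]
onset_hour = int(np.argmax(fast_aligned < slow)) + 23
```

On this synthetic series the crossover fires within a couple of hours of the main-phase onset at hour 48, illustrating why trend-following indicators are plausible candidates for monitoring the magnetosphere's state.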

  7. Spektr: A computational tool for x-ray spectral analysis and imaging system optimization

A set of computational tools is presented that allows convenient calculation of x-ray spectra, selection of elemental and compound filters, and calculation of beam quality characteristics, such as half-value layer, mR/mAs, and fluence per unit exposure. The TASMIP model of Boone and Seibert is adapted to a library of high-level language (Matlab) functions and shown to agree with experimental measurements across a wide range of kVp and beam filtration. Modeling of beam filtration is facilitated by a convenient, extensible database of mass and mass-energy attenuation coefficients compiled from the National Institute of Standards and Technology. The functions and database were integrated in a graphical user interface and made available online. The functionality of the toolset, and its potential for investigation of imaging system optimization, was illustrated in theoretical calculations of imaging performance across a broad range of kVp, filter material type, and filter thickness for direct and indirect-detection flat-panel imagers. The calculations reveal a number of nontrivial effects in the energy response of such detectors that may not have been guessed from simple K-edge filter techniques, and point to a variety of compelling hypotheses regarding choice of beam filtration that warrant future investigation
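As a flavor of the beam-quality calculations involved: for a monoenergetic beam the half-value layer (HVL) is simply ln(2)/μ, while for a polyenergetic spectrum it must be found numerically as the absorber thickness that halves the transmitted fluence. The sketch below uses hypothetical spectrum data and function names; it is not Spektr's actual Matlab API:

```python
import numpy as np

def hvl(energies_keV, fluence, mu_per_cm):
    """Half-value layer of a (possibly polyenergetic) beam: the absorber
    thickness that halves total transmitted fluence. For a monoenergetic
    beam this reduces to ln(2)/mu. Found here by bisection, since
    transmission decreases monotonically with thickness."""
    def transmitted(t_cm):
        return np.sum(fluence * np.exp(-mu_per_cm * t_cm))
    target = 0.5 * transmitted(0.0)
    lo, hi = 0.0, 100.0            # bracket in cm
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if transmitted(mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Monoenergetic sanity check: result should equal ln(2)/mu = ln(2)/0.2 cm
hvl_cm = hvl(np.array([60.0]), np.array([1.0]), np.array([0.2]))
```

With a realistic kVp spectrum and NIST attenuation data in place of the one-bin arrays, the same bisection yields the polyenergetic HVL the abstract refers to.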

  8. ADAPT-A Drainage Analysis Planning Tool

    Boelee, Leonore; Kellagher, Richard


HR Wallingford is a partner in the EU-funded TRUST project, involved in Work Package 4.3, Wastewater and stormwater systems, which is to produce a model and report on system sustainability analysis and potential for improvements for stormwater systems as Deliverable 4.3.2. This report is that deliverable. It details the development of the tool ADAPT (A Drainage Analysis and Planning Tool). The objective of the tool is to evaluate the improvement requirements to a stormwat...

  9. Performance assessment of the Tactical Network Analysis and Planning System Plus (TNAPS+) automated planning tool for C4I systems

    Ziegenfuss, Paul C.


The Joint Staff established the Tactical Network Analysis and Planning System Plus (TNAPS+) as the interim joint communications planning and management system. The Marines' Command and Control Systems Course and the Army's Joint Task Force System Planning Course both utilize TNAPS+ to conduct tactical C4I network planning in their course requirements. This thesis is a Naval Postgraduate School C4I curriculum practical application of TNAPS+ in an expeditionary Joint Task Force environment, focu...

  10. NOAA's Inundation Analysis Tool

National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  11. Ground Data System Analysis Tools to Track Flight System State Parameters for the Mars Science Laboratory (MSL) and Beyond

    Allard, Dan; Deforrest, Lloyd


Flight software parameters give space mission operators fine-tuned control over flight system configurations, enabling rapid and dynamic changes to ongoing science activities in a much more flexible manner than can be accomplished with (otherwise broadly used) configuration-file-based approaches. The Mars Science Laboratory (MSL), Curiosity, makes extensive use of parameters to support complex, daily activities via commanded changes to said parameters in memory. However, as the loss of Mars Global Surveyor (MGS) in 2006 demonstrated, flight system management by parameters brings with it risks, including the possibility of losing track of the flight system configuration and the threat of invalid command executions. To mitigate this risk, a growing number of missions have funded efforts to implement parameter state tracking software tools and services, including MSL and the Soil Moisture Active Passive (SMAP) mission. This paper will discuss the engineering challenges and resulting software architecture of MSL's onboard parameter state tracking software and discuss the road forward to make parameter management tools suitable for use on multiple missions.
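The core of such a ground tool is simple bookkeeping: maintain a predicted parameter state from the command history, then diff it against telemetry to catch configurations that silently diverged. A minimal sketch; the class, method, and parameter names are hypothetical, not MSL flight or ground software:

```python
class ParameterTracker:
    """Minimal sketch of ground-side parameter state tracking: apply each
    commanded change to a predicted state, then compare that prediction
    against a telemetry snapshot to flag discrepancies."""

    def __init__(self, initial):
        self.predicted = dict(initial)

    def apply_command(self, name, value):
        """Record a commanded parameter change in the predicted state."""
        self.predicted[name] = value

    def discrepancies(self, telemetry):
        """Return {name: (predicted, observed)} for mismatched parameters."""
        return {k: (v, telemetry.get(k))
                for k, v in self.predicted.items()
                if telemetry.get(k) != v}

tracker = ParameterTracker({"heater_setpoint": 20, "comm_window_s": 600})
tracker.apply_command("heater_setpoint", 25)
# Telemetry shows the heater command never took effect on board
mismatch = tracker.discrepancies({"heater_setpoint": 20, "comm_window_s": 600})
```

A mismatch like the one above is exactly the "losing track of the flight system configuration" risk the abstract cites; a real tool adds command timestamps, onboard memory maps, and alarm routing.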

  12. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    Liu, Nan-Suey; Quealy, Angela


The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules. They are: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progress variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the 38th AIAA Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the 38th AIAA Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time of simulating the (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor will also be reported.

  13. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    Chatterjee, Sharmista


Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
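The second-law bookkeeping described here can be illustrated with the Gouy-Stodola relation, lost work I = T0 * S_gen, applied to steady heat transfer across a component. The numbers and function names are illustrative, not taken from the thesis:

```python
def entropy_generation(q_watts, t_hot_K, t_cold_K):
    """Entropy generation rate (W/K) for steady heat transfer Q from a hot
    reservoir at T_hot to a cold one at T_cold: S_gen = Q*(1/Tc - 1/Th).
    This is the per-component second-law bookkeeping the thesis applies."""
    return q_watts * (1.0 / t_cold_K - 1.0 / t_hot_K)

def irreversibility(q_watts, t_hot_K, t_cold_K, t0_K=298.15):
    """Lost work by the Gouy-Stodola theorem: I = T0 * S_gen."""
    return t0_K * entropy_generation(q_watts, t_hot_K, t_cold_K)

# Example: 500 W crossing a heat exchanger from 400 K to 300 K
i_loss = irreversibility(500.0, 400.0, 300.0)   # watts of lost work
```

Ranking components by such irreversibility figures is what lets a designer pick the configuration with the least total entropy generation.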

  14. The System Cost Model: A tool for life cycle cost and risk analysis

In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors began development of the System Cost Model (SCM) application. The SCM estimates life cycle costs of the entire US Department of Energy (DOE) complex for designing, constructing, operating, and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, and transuranic waste. The SCM uses parametric cost functions to estimate life cycle costs for various treatment, storage, and disposal modules which reflect planned and existing waste management facilities at DOE installations. In addition, SCM can model new TSD facilities based on capacity needs over the program life cycle. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction, operations and maintenance, and decommissioning these waste management facilities. The SCM also provides transportation costs for DOE wastes. Transportation costs are provided for truck and rail and include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. A complement to the SCM is the System Cost Model-Risk (SCM-R) model, which provides relative Environmental, Safety, and Health (ES and H) risk information. A relative ES and H risk basis has been developed and applied by LITCO at the INEL. The risk basis is now being automated in the SCM-R to facilitate rapid risk analysis of system alternatives. The added risk functionality will allow combined cost and risk evaluation of EM alternatives.
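The parametric cost functions at the heart of the SCM can be illustrated with a simple power-law form, cost = a * V^b, summed over life cycle phases. The coefficients and the functional form are hypothetical stand-ins, not the SCM's actual equations:

```python
def parametric_cost(volume_m3, a, b):
    """Hypothetical power-law parametric cost function, cost = a * V**b,
    of the general kind used in life cycle cost models like the SCM."""
    return a * volume_m3 ** b

# Hypothetical (a, b) coefficients per life cycle phase for one TSD module
phases = {
    "pre-operations":  (2.0e5, 0.3),
    "construction":    (1.5e6, 0.6),
    "operations":      (4.0e5, 0.8),
    "decommissioning": (3.0e5, 0.5),
}

def life_cycle_cost(volume_m3):
    """Total life cycle cost for one module handling volume_m3 of waste."""
    return sum(parametric_cost(volume_m3, a, b) for a, b in phases.values())

total = life_cycle_cost(1000.0)   # dollars, for 1000 m^3 of waste
```

Exponents below 1 encode economies of scale: doubling the waste volume less than doubles each phase's cost, which is why the model can compare consolidated against distributed TSD configurations.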

  15. TANGO control system management tool

TANGO is an object oriented control system tool kit based on CORBA, initially developed at ESRF. It is now also developed and used by other synchrotron radiation sources. The TANGO concept is a fully distributed object oriented control system. That means that several processes (called servers) are running on many different hosts. Each server manages one or several TANGO classes. Each class can have one or several instances (called devices). On each host to be controlled, a device server (called Starter) takes care of all device servers running (or supposed to be running) on this machine. The controlled server list is read from the TANGO database. A graphical client (called Astor) is connected to all Starter servers and is able to: display the control system status and component status using coloured icons; execute actions on components (start, stop, test, configure, display information, ...); execute diagnostics on components; and execute global analyses on a large number of crates or databases. The Starter/Astor pair and related tools are very useful for managing a large number of servers running on several hosts distributed around an accelerator

  16. Understanding Earthquake Fault Systems Using QuakeSim Analysis and Data Assimilation Tools

    Donnellan, Andrea; Parker, Jay; Glasscoe, Margaret; Granat, Robert; Rundle, John; McLeod, Dennis; Al-Ghanmi, Rami; Grant, Lisa


We are using the QuakeSim environment to model interacting fault systems. One goal of QuakeSim is to prepare for the large volumes of data that spaceborne missions such as DESDynI will produce. QuakeSim has the ability to ingest distributed heterogeneous data in the form of InSAR, GPS, seismicity, and fault data into various earthquake modeling applications, automating the analysis when possible. Virtual California simulates interacting faults in California. We can compare output from long time history Virtual California runs with the current state of strain and the strain history in California. In addition to spaceborne data we will begin assimilating data from UAVSAR airborne flights over the San Francisco Bay Area, the Transverse Ranges, and the Salton Trough. Results of the models are important for understanding future earthquake risk and for providing decision support following earthquakes. Improved models require this sensor web of different data sources, and a modeling environment for understanding the combined data.

  17. OCAM - A CELSS modeling tool: Description and results. [Object-oriented Controlled Ecological Life Support System Analysis and Modeling

    Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray


    Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
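The container/converter/gate decomposition described above can be sketched in a few lines. This is a toy single-substance example; the class names, substances, and rates are illustrative, not OCAM code:

```python
class Container:
    """Stores a mass inventory (kg) of one substance."""
    def __init__(self, name, mass=0.0):
        self.name, self.mass = name, mass

class Converter:
    """Moves mass between containers at a fixed daily rate, mimicking the
    container/converter/gate scheme OCAM uses for daily C/H/O exchange."""
    def __init__(self, src, dst, kg_per_day):
        self.src, self.dst, self.rate = src, dst, kg_per_day

    def step(self):
        """Advance one day: transfer up to `rate` kg, limited by supply."""
        moved = min(self.rate, self.src.mass)
        self.src.mass -= moved
        self.dst.mass += moved

co2 = Container("CO2", mass=10.0)
biomass = Container("biomass")
photosynthesis = Converter(co2, biomass, kg_per_day=1.5)
for _ in range(4):            # simulate four days of crop growth
    photosynthesis.step()
```

A full model wires many such converters (crops, combustion, digestion, fish growth) between shared containers and tallies mass, energy, and manpower each simulated day.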

  18. VCAT: Visual Crosswalk Analysis Tool

    Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory


    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  19. Security EvaBio: An Analysis Tool for the Security Evaluation of Biometric Authentication Systems

    El-Abed, Mohamad; Lacharme, Patrick; Rosenberger, Christophe


Biometric systems present several drawbacks that may significantly decrease their utility. Nowadays, several platforms (such as the FVC-onGoing) exist to assess the performance of such systems. Despite this, no platform exists for the security evaluation of biometric systems. Hence, the aim of this paper is to present an on-line platform for the security evaluation of biometric systems. The key benefits of the presented platform are twofold. First, it provides the biometrics community an evalua...

  20. Analysis of simulation tools for the study of advanced marine power systems

    Brochard, Paul Eugene


    The United States Navy is at a crossroads in the design of ship's engineering plants. Advances in solid-state power electronics combined with a shift to gas turbine powered propulsion and electric plants has placed renewed emphasis on developing advanced power systems. These advanced power systems may combine the prime movers associated with propulsion and electric power generation into an integrated system. The development of advanced electric distribution systems and propulsion derived ship...

  1. Bifurcation Tools for Flight Dynamics Analysis and Control System Design Project

    National Aeronautics and Space Administration — The purpose of the project is the development of a computational package for bifurcation analysis and advanced flight control of aircraft. The development of...

  2. IQARIS : a tool for the intelligent querying, analysis, and retrieval from information systems

    Information glut is one of the primary characteristics of the electronic age. Managing such large volumes of information (e.g., keeping track of the types, where they are, their relationships, who controls them, etc.) can be done efficiently with an intelligent, user-oriented information management system. The purpose of this paper is to describe a concept for managing information resources based on an intelligent information technology system developed by the Argonne National Laboratory for managing digital libraries. The Argonne system, Intelligent Query (IQ), enables users to query digital libraries and view the holdings that match the query from different perspectives

  3. Advanced Numerical Tools for Design and Analysis of In-Space, Valve and Feed Systems Project

    National Aeronautics and Space Administration — In-space valves for the main fuel and oxidizer feed systems are required to provide precise control, wide throttling range and handle rapid on-off control. These...

  4. Characterization of components of water supply systems from GPR images and tools of intelligent data analysis.

    Ayala Cabrera, David


    [EN] Over time, due to multiple operational and maintenance activities, the networks of water supply systems (WSSs) undergo interventions, modifications or even are closed. In many cases, these activities are not properly registered. Knowledge of the paths and characteristics (status and age, etc.) of the WSS pipes is obviously necessary for efficient and dynamic management of such systems. This problem is greatly augmented by considering the detection and control of leaks. Access to reliable...

  5. Analysis techniques for multivariate root loci. [a tool in linear control systems

    Thompson, P. M.; Stein, G.; Laub, A. J.


Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
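A basic numerical version of a root locus simply tracks the closed-loop eigenvalues as a scalar gain is swept. The matrices below are an illustrative double-integrator example, not taken from the paper, whose method additionally uses the generalized eigenvalue problem to obtain angles and sensitivities:

```python
import numpy as np

def root_locus_points(A, B, K, gains):
    """Closed-loop eigenvalues of (A - g*B*K) over a range of scalar gains
    g -- a minimal numerical sketch of a multivariable root locus."""
    return [np.linalg.eigvals(A - g * B @ K) for g in gains]

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # double integrator (open loop)
B = np.array([[0.0], [1.0]])
K = np.array([[1.0, 1.0]])               # fixed state-feedback direction
locus = root_locus_points(A, B, K, gains=np.linspace(0.0, 10.0, 5))
# At g = 0 the locus starts at the open-loop poles (both at the origin);
# for g > 0 the closed-loop poles of s^2 + g*s + g move into the left half-plane.
```

Plotting the real and imaginary parts of each eigenvalue against the gain reproduces the familiar locus branches; the asymptotic behavior as g grows is exactly what the paper's algorithm characterizes.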

  6. Physics Analysis Tools Workshop 2007

Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  7. An interactive visualization tool for the analysis of multi-objective embedded systems design space exploration

    Taghavi, T.; Pimentel, A.D.


    The design of today’s embedded systems involves a complex Design Space Exploration (DSE) process. Typically, multiple and conflicting criteria (objectives) should be optimized simultaneously such as performance, power, cost, etc. Usually, Multi-Objective Evolutionary Algorithms (MOEAs) are used to explore a large design space with a finite number of design point evaluations, providing the designer a set of tradable solutions with respect to the design criteria. Analyzing how such evolutionary...

  8. Tools & Strategies for Social Data Analysis

    Willett, Wesley Jay


Data analysis is often a complex, iterative process that involves a variety of stakeholders and requires a range of technical and professional competencies. However, in practice, tools for visualizing, analyzing, and communicating insights from data have primarily been designed to support individual users. In the past decade a handful of research systems like Many Eyes have begun to explore how web-based visualization tools can allow larger groups of users to participate in analyse...

  9. Physics Analysis Tools Workshop Report

    Assamagan, K A

A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia (5), Canada (1), China (6), CERN (4), Europe (7), Japan (32), Taiwan (3), and the USA (11). The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; and Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  10. Common Analysis Tool Being Developed for Aeropropulsion: The National Cycle Program Within the Numerical Propulsion System Simulation Environment

    Follen, Gregory J.; Naiman, Cynthia G.


    The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines-the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.


    ÖZYURT, Hacer; ÖZYURT, Özcan


    Natural Language Processing (NLP) is one of the most important research and application fields in artificial intelligence. Since language is the most important tool of human-computer interaction, in the near future language will be a key medium for interacting with computers. In this sense, the rules of a language must be known and must be analyzable by computer. In this study, a Turkish text-based dialogue system has been developed to enable human-computer interaction. Th...

  12. Tools for income mobility analysis

    Philippe Kerm


    A set of Stata routines to help analysis of `income mobility' are presented and illustrated. Income mobility is taken here as the pattern of income change from one time period to another within an income distribution. Multiple approaches have been advocated to assess the magnitude of income mobility. The macros presented provide tools for estimating several measures of income mobility, e.g. the Shorrocks (JET 1978) or King (Econometrica 1983) indices or summary statistics for transition matri...

  13. Performance analysis of GYRO: a tool evaluation

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high-performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  14. The CANDU alarm analysis tool (CAAT)

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules, and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  15. User's Guide and Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information

    ABSTRACT: User's Guide & Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information (EPA/601/B-15/001, 2015, 123 pages). Henry Lee II, U.S. EPA, Western Ecology Division; Katharine Marko, U.S. EPA,...

  16. The methods and tools for system analysis of surface heat exchangers of steam-gas turbine and oil-electrical energy installations

    ГАНЖА, А. Н.; Марченко, Н. А.


    The methods and tools for system analysis of the heat-exchange equipment of steam-gas turbine and oil-electrical energy installations are developed. The methods and dependences can be used for the solution of optimization tasks. The effectiveness of the apparatus is investigated as a function of surface composition and of generalized parameters reflecting heat-transfer rates, the ratio of heat-carrier flow rates, and operational and technological factors.

  17. Studying international fuel cycle robustness with the GENIUSv2 discrete facilities/materials fuel cycle systems analysis tool

    GENIUSv2 (Global Evaluation of Nuclear Infrastructure Utilization Scenarios, hereafter 'GENIUS') is a discrete-facilities/materials nuclear fuel cycle systems analysis tool currently under development at the University of Wisconsin-Madison. For a given scenario, it models nuclear fuel cycle facilities (reactors, fuel fabrication, enrichment, etc.), the institutions that own them (utilities and governments), and the regions in which those institutions operate (sub-national, national, and super-national entities). Facilities work together to provide each other with the materials they need. The results of each simulation include the electricity production in each region as well as operational histories of each facility and isotopic and facility histories of each material object. GENIUS users specify an initial condition and a facility deployment plan. The former describes each region and institution in the scenario as well as facilities that exist at the start. The latter specifies all the facilities that will be built over the course of the simulation (and by which institutions). Each region, institution, and facility can be assigned financial parameters such as tax and interest rates, and facilities also get assigned technical information about how they actually operate. Much of the power of the data model comes from the flexibility to model individual entities to a fine level of detail or to allow them to inherit region-, institution-, or facility-type-specific default parameters. Most importantly to the evaluation of regional, national, and international policies, users can also specify rules that define the affinity (or lack thereof) for trade of particular commodities between particular entities. For instance, these rules could dictate that a particular region or institution always buy a certain commodity (ore, enriched UF6, fabricated fuel, etc.) from a particular region or institution, never buy from that region, or merely have a certain likelihood to do so
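    The region/institution/facility hierarchy and the trade-affinity rules described above can be sketched in a few lines of code. This is a hypothetical illustration of the data model, not the GENIUS implementation or API; all class and function names are invented for the example.

    ```python
    # Hypothetical sketch of a GENIUS-style scenario data model: regions own
    # institutions, institutions own facilities, and per-commodity affinity
    # rules bias which supplier region a buyer trades with.
    from dataclasses import dataclass, field

    @dataclass
    class Facility:
        name: str
        commodity_produced: str = ""

    @dataclass
    class Institution:
        name: str
        facilities: list = field(default_factory=list)

    @dataclass
    class Region:
        name: str
        institutions: list = field(default_factory=list)
        # affinity[commodity][supplier_region_name] -> trade likelihood in [0, 1]
        affinity: dict = field(default_factory=dict)

    def preferred_supplier(region, commodity, candidates):
        """Pick the candidate supplier region with the highest trade affinity;
        unlisted regions default to a neutral 0.5."""
        rules = region.affinity.get(commodity, {})
        return max(candidates, key=lambda r: rules.get(r.name, 0.5))
    ```

    A rule such as "region A always prefers enriched UF6 from region B" then reduces to assigning B a high affinity value in A's rule table.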

  18. Development Methodology of a Cyber Security Risk Analysis and Assessment Tool for Digital I and C Systems in Nuclear Power Plant

    With the use of digital computers and communication networks the hot issues on cyber security were raised about 10 years ago. The scope of cyber security application has now been extended from the safety Instrumentation and Control (I and C) system to safety important systems, plant security system, and emergency preparedness system. Therefore, cyber security should be assessed and managed systematically throughout the development life cycle of I and C systems in order for their digital assets to be protected from cyber attacks. Fig. 1 shows the concept of a cyber security risk management of digital I and C systems in nuclear power plants (NPPs). A lot of cyber security risk assessment methods, techniques, and supported tools have been developed for Information Technology (IT) systems, but they have not been utilized widely for cyber security risk assessments of the digital I and C systems in NPPs. The main reason is a difference in goals between IT systems and nuclear I and C systems. Confidentiality is important in IT systems, but availability and integrity are important in nuclear I and C systems. Last year, it was started to develop a software tool to be specialized for the development process of nuclear I and C systems. This paper presents a development methodology of the Cyber Security Risk analysis and Assessment Tool (CSRAT) for the digital I and C systems in NPP

  19. A Simulation of Energy Storage System for Improving the Power System Stability with Grid-Connected PV using MCA Analysis and LabVIEW Tool

    Jindrich Stuchly


    The large-scale penetration of distributed, renewable power plants requires transfers of large amounts of energy. This, in turn, puts a high strain on the energy delivery infrastructure. In particular, photovoltaic power plants supply energy with high intermittency, possibly affecting the stability of the grid by changing the voltage at the plant connection point. In this contribution, we summarize the main negative effects of a selected, real-world grid-connected photovoltaic plant. Thereafter, a review of suitable energy storage systems to mitigate the negative effects has been carried out, compared and evaluated using multi-criteria analysis. Based on this analysis, data collected at the plant and the grid are used to design the energy storage systems to support connection of the plant to the grid. The cooperation of these systems is then analysed and evaluated using simulation tools created in LabVIEW for this purpose. The simulation results demonstrate the capability of energy storage system solutions to significantly reduce the negative feedback effects of the photovoltaic power plant on the low-voltage grid.

  20. From sensor networks to connected analysis tools

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.


    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high-density measurements, integrating them with lower-density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application-specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data or large data samples. As well as developing domain-specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  1. Analysis of Effect on Clipping Mechanism and Structure of Tool System for High-speed Machine Tool%高速机床工具系统结构及其夹紧特性的研究

    张志梅; 安虎平; 王锐锋; 芮志元


    Aiming at the requirements that high-speed machine tools place on the tool system, and at existing problems, this paper discusses the key techniques for selecting the cross-sectional shape and structure of the tool holder in a high-speed tool system. A reasonable locating scheme is determined: simultaneous location on the end face and the taper, with a tool holder structured as a short hollow cone with a small taper. The results show that the HSK tool holder provides accurate location, reliable connection, high system rigidity, good vibration resistance, and good structural manufacturability. Through analysis of the force characteristics of the tool clamping mechanism and calculation of the amplification coefficient, the factors affecting the clamping force and clamping effect, and their laws of variation, are presented, which can serve as a reference for the design of tool systems.

  2. Reload safety analysis automation tools

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  3. General Mission Analysis Tool (GMAT)

    Hughes, Steven P. (Compiler)


    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.

  4. SBAT. A stochastic BPMN analysis tool

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter


    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe… a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest… are specified using the temporal logic Probabilistic Computation Tree Logic (PCTL) and we employ stochastic model checking, by means of the model checker PRISM, to compute their exact values. We present a simplified example of a distributed stochastic system where we determine a reachability property…
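    The core of a PCTL reachability query such as the one SBAT hands to PRISM (e.g. "what is the probability of eventually reaching a goal state?") is a fixed-point computation over a Markov chain. The following is a minimal sketch of that computation on an illustrative three-state chain; it is not SBAT or PRISM code, and real model checkers solve this exactly via linear equation systems rather than plain iteration.

    ```python
    # Minimal sketch: probability of eventually reaching a set of target states
    # in a discrete-time Markov chain, the computation behind a PCTL query
    # like P=? [ F "done" ]. The transition matrix is illustrative.
    def reach_probability(P, targets, iters=1000):
        """Iterate x_s = 1 for target states, else sum_t P[s][t] * x_t,
        converging to the least fixed point (reachability probabilities)."""
        n = len(P)
        x = [1.0 if s in targets else 0.0 for s in range(n)]
        for _ in range(iters):
            x = [1.0 if s in targets else sum(P[s][t] * x[t] for t in range(n))
                 for s in range(n)]
        return x

    # 3-state chain: from state 0, loop w.p. 0.2, reach "done" (state 1)
    # w.p. 0.4, or reach the absorbing "fail" state (2) w.p. 0.4.
    P = [[0.2, 0.4, 0.4],
         [0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0]]
    ```

    For this chain the fixed point satisfies x0 = 0.2·x0 + 0.4, giving a reachability probability of 0.5 from state 0.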

  5. Photogrammetry Tool for Forensic Analysis

    Lane, John


    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
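    The frame-merging idea in the steps above amounts to composing rigid transforms: once cube 2's pose is known in cube 1's coordinate system, any point measured relative to cube 2 can be re-expressed in cube 1's system. The sketch below shows this in 2D for brevity (real photogrammetry uses full 3D rotations); the function name and pose values are illustrative, not part of the described system.

    ```python
    # Sketch of merging coordinate systems: a point known in cube 2's frame
    # is mapped into cube 1's frame using cube 2's pose (orientation theta,
    # origin translation) expressed in frame 1. 2D rigid transform for brevity.
    import math

    def to_frame1(point_in_2, theta, translation):
        """Rotate a cube-2 point by cube 2's orientation theta (radians),
        then offset by cube 2's origin expressed in cube 1's frame."""
        x, y = point_in_2
        c, s = math.cos(theta), math.sin(theta)
        tx, ty = translation
        return (c * x - s * y + tx, s * x + c * y + ty)
    ```

    Chaining such transforms across all cube pairs (step 3) and fixing one cube as the reference yields the single global coordinate system of step 4.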

  6. Flow Injection/Sequential Injection Analysis Systems: Potential Use as Tools for Rapid Liver Diseases Biomarker Study

    Supaporn Kradtap Hartwell


    Flow injection/sequential injection analysis (FIA/SIA) systems are suitable for carrying out automatic wet chemical/biochemical reactions with reduced volume and time consumption. Various parts of the system, such as the pump, valve, and reactor, may be built or adapted from available materials. The systems can therefore be lower in cost than other instrumentation-based analysis systems. Their applications for the determination of biomarkers for liver diseases have been demonstrated in various formats of operation, but only a few and limited types of biomarkers have been used as model analytes. This paper summarizes these applications for different types of reactions as a guide for using flow-based systems in more biomarker and/or multibiomarker studies.

  7. Space Debris Reentry Analysis Methods and Tools

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe


    The reentry of uncontrolled spacecraft may break up into many pieces of debris at an altitude in the range of 75-85 km. The surviving fragments could pose great hazard and risk to the ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on the topic of debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, here we present an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types, along with 51 predefined motions and relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
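    The Monte Carlo uncertainty analysis mentioned above has a simple skeleton: sample the uncertain fragment parameters from assumed distributions and tally the fraction of fragments that survive. The sketch below is purely illustrative; the parameter ranges and the threshold "demise criterion" are toy stand-ins for the aerothermal ablation models a tool like DRAPS actually evaluates.

    ```python
    # Illustrative Monte Carlo loop for debris survivability: sample uncertain
    # fragment properties and estimate the surviving fraction. The demise
    # criterion is a toy threshold, not a real aerothermal model.
    import random

    def survival_fraction(n_samples=10000, seed=42):
        rng = random.Random(seed)  # fixed seed for reproducibility
        survived = 0
        for _ in range(n_samples):
            # uncertain inputs: fragment mass (kg), area-to-mass ratio (m^2/kg)
            mass = rng.uniform(0.5, 50.0)
            a_over_m = rng.uniform(0.001, 0.05)
            # toy criterion: heavy, low-drag fragments reach the ground
            if mass > 5.0 and a_over_m < 0.02:
                survived += 1
        return survived / n_samples
    ```

    In a real tool, each sample would drive a full trajectory and ablation simulation, and the surviving fragments would feed a ground-casualty risk estimate.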

  8. Functional analysis, a resilience improvement tool applied to a waste management system - application to the "household waste management chain"

    Beraud, H.; Barroca, B.; Hubert, G.


    A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving its resilience, i.e. enabling it to maintain or recover acceptable operating levels after flooding, is essential. To achieve this, we must understand how the system works in order to bring any potential dysfunctions to light and take preventive measures. Functional analysis has been used for understanding the complexity of this type of system. The purpose of this article is to show the interest of this type of method, and the limits of its use, for improving the resilience of waste management systems as well as other urban technical systems1, by means of theoretical modelling and its application on a study site. 1In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  9. Control system for borehole tools

    Bordon, E.E.


    A control assembly is described for use with a tool including one or more subassemblies adapted for controlling and/or monitoring various events within a borehole and actuating instrumentation positioned on the earth's surface for actuating the tool. The assembly comprises: control means connected to the tool for selectively actuating one or more of the subassemblies within the tool, the control means being adapted for operation within the borehole, power supply means connected to the tool for supplying electrical power to the control means for operation thereof independent of the surface actuating instrumentation, communication means connected to the surface actuating instrumentation for communicating therewith, and connection means for selectively connecting the communication means to the control means while the tool and the control means connected thereto are within the borehole to establish communication between the control means and the surface actuating instrumentation. The connection means is adapted for operation within the borehole.

  10. Simulating the Farm Production System Using the MONARC Simulation Tool

    Y. Wu; I. C. Legrand; et al.


    The simulation program developed by the "Models of Networked Analysis at Regional Centers" (MONARC) project is a powerful and flexible tool for simulating the behavior of large-scale distributed computing systems. In this study, we further validate this simulation tool in a large-scale distributed farm computing system. We also report the usage of this simulation tool to identify the bottlenecks and limitations of our farm system.

  11. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne


    …highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future… patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices of…

  12. Enhancement of Local Climate Analysis Tool

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.


    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
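    Among the analyses LCAT offers, trend analysis of a station's climate series is the most basic. The sketch below shows the ordinary least-squares slope computation such a tool performs on an annual-mean series; the function name and data are illustrative, not LCAT's implementation.

    ```python
    # A minimal sketch of station trend analysis: ordinary least-squares
    # slope of an annual-mean climate series, in value units per year.
    def linear_trend(years, values):
        """Return the least-squares slope of values regressed on years."""
        n = len(years)
        mx = sum(years) / n
        my = sum(values) / n
        num = sum((x - mx) * (y - my) for x, y in zip(years, values))
        den = sum((x - mx) ** 2 for x in years)
        return num / den
    ```

    A tool like LCAT wraps this kind of computation with data access, significance testing, and plotting; the slope itself is the number a "trend analysis" reports.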

  13. An enhanced MMW and SMMW/THz imaging system performance prediction and analysis tool for concealed weapon detection and pilotage obstacle avoidance

    Murrill, Steven R.; Jacobs, Eddie L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.


    The U.S. Army Research Laboratory (ARL) has continued to develop and enhance a millimeter-wave (MMW) and submillimeter-wave (SMMW)/terahertz (THz)-band imaging system performance prediction and analysis tool for both the detection and identification of concealed weaponry, and for pilotage obstacle avoidance. The details of the MATLAB-based model, which accounts for the effects of all critical sensor and display components, for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). Further development of this tool, including a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures, was reported on at the 2011 SPIE Europe Security and Defence Symposium (Prague). This paper provides a comprehensive review of a newly enhanced MMW and SMMW/THz imaging system analysis and design tool that now includes an improved noise sub-model for more accurate and reliable performance predictions, the capability to account for post-capture image contrast enhancement, and the capability to account for concealment material backscatter with active-illumination-based systems. Present plans for additional expansion of the model's predictive capabilities are also outlined.

  14. STARS software tool for analysis of reliability and safety

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.
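    The "logic model construction and evaluation" task above centers on fault trees: nested AND/OR gates over basic events. The sketch below evaluates such a tree for the top-event probability under the usual independence assumption; the tree encoding is invented for illustration and is not the STARS representation.

    ```python
    # Sketch of fault tree evaluation: a tree of AND/OR gates over basic
    # events, evaluated recursively for the top-event probability,
    # assuming independent basic events.
    def evaluate(node, probs):
        """node: ('event', name) | ('and', [children]) | ('or', [children])."""
        kind = node[0]
        if kind == "event":
            return probs[node[1]]
        child_ps = [evaluate(c, probs) for c in node[1]]
        if kind == "and":
            p = 1.0
            for cp in child_ps:
                p *= cp            # all children must fail
            return p
        # OR of independent events: 1 - product of (1 - p_i)
        q = 1.0
        for cp in child_ps:
            q *= (1.0 - cp)
        return 1.0 - q
    ```

    For example, a top event "(A AND B) OR C" with P(A)=0.1, P(B)=0.2, P(C)=0.05 evaluates to 0.069.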

  15. Multi-mission telecom analysis tool

    Hanks, D.; Kordon, M.; Baker, J.


    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  16. A Comparative Analysis of Life-Cycle Assessment Tools for End-of-Life Materials Management Systems

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the Waste Reduction Model (WARM), municipal s...

  17. Implementation of cutting tool management system

    G. Svinjarević


    Purpose: of this paper is to show the benefits of implementing cutting tool management in a company specializing in metal cutting, after which the production conditions allow new possibilities for improvement of tool management. Design/methodology/approach: applied in this paper was identification of the current state and exploitation conditions of cutting tools on lathes and milling machines, and of the organization of the departments and other services directly involved in the cutting tool management system. Findings: of the controlled tests and analyses in every phase of tool management, in the departments and other services directly involved in the tool management system, will help to reduce stock and costs. It is possible to identify which operator makes errors and is responsible for inappropriate use of a cutting tool. Some disadvantages have been identified and a few suggestions for improvement of the tool management system have been given. The results of the research are easy to apply in a company with a developed IT infrastructure and are mostly of interest for CNC workshops. Small companies and specialized low-volume production operations have to make an additional effort to integrate into clusters. Practical implications: are a reduction of cutting tools in stock, a reduction of employees, quick access to the necessary cutting tools and data, and simplicity in tool ordering and supply. Most important is the possibility to monitor and identify which cutting tools and employees are the weakest parts of the chain in the tool management system.
Management activity should be foreseeable in all its segments, which includes both the appropriate choice and use of cutting tools, and monitoring of unwanted phenomena during the cutting process and usage of these data for further purchase of tools. Originality/value: in the paper the turnover methodology is applied for determination of management efficacy and formation of employees from different departments in

  18. Modelling of safety fieldbus system via SW tool SHARPE

    Maria Franekova; Jan Rofar


    The paper deals with modelling of a safety-related Fieldbus communication system that has to guarantee a Safety Integrity Level (SIL) according to standard IEC 61508. Methods of safety analysis for a closed safety Fieldbus transmission system are summarized, and the modelling capabilities of the SW tool SHARPE are described in detail. The realized models are based on Fault Tree Analysis (FTA) and Markov analysis.
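
    As a hedged illustration of the Markov side of such an analysis, the sketch below computes the steady-state availability of a minimal two-state (Up/Failed) model, the simplest kind of model a tool like SHARPE evaluates. The failure and repair rates are hypothetical, not taken from the paper.

```python
# Minimal sketch (rates are invented): steady-state availability of a
# two-state Markov model (Up <-> Failed with repair).

def steady_state(Q):
    """Solve pi @ Q = 0 with sum(pi) = 1 by Gaussian elimination."""
    n = len(Q)
    # Transpose the balance equations; replace the last one with normalization.
    A = [[Q[j][i] for j in range(n)] for i in range(n)]
    A[-1] = [1.0] * n
    b = [0.0] * (n - 1) + [1.0]
    for col in range(n):                      # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    pi = [0.0] * n
    for r in range(n - 1, -1, -1):            # back substitution
        s = sum(A[r][c] * pi[c] for c in range(r + 1, n))
        pi[r] = (b[r] - s) / A[r][r]
    return pi

lam, mu = 1e-4, 1e-1                          # hypothetical failure/repair rates
Q = [[-lam, lam],                             # generator matrix of the chain
     [mu, -mu]]
pi = steady_state(Q)
print(f"availability = {pi[0]:.6f}")          # closed form: mu / (lam + mu)
```

    The same solver extends to larger chains (e.g. degraded states), which is where a dedicated tool becomes worthwhile.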

  19. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    Chin, Jeffrey C.; Csank, Jeffrey T.


    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on transient engine performance earlier in the design cycle.

  20. ISHM Decision Analysis Tool: Operations Concept


    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system; both of these aspects are described.

  1. General Mission Analysis Tool Project

    National Aeronautics and Space Administration — Overview: GMAT is a feature-rich system containing high-fidelity space system models, optimization and targeting, built-in scripting and programming infrastructure,...

  2. Design of Fault Analysis and Diagnosis System in NC Machine Tool



    Typical faults of CNC machine tools are carefully analyzed and studied to find the relationships between faults and certain signal characteristics. On this basis a fault diagnosis system is designed, and the software and hardware design of the diagnostic system is presented, allowing faults to be effectively predicted and diagnosed. In practice the system supports effective maintenance of NC machine tools and ensures their normal operation.

  3. Systems Prototyping with Fourth Generation Tools.

    Sholtys, Phyllis


    The development of information systems using an engineering approach that combines traditional programming techniques with fourth-generation software tools is described. Fourth-generation application tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)

  4. Quick Spacecraft Thermal Analysis Tool Project

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  5. An Automatic Hierarchical Delay Analysis Tool

    Farid Mheir-El-Saadi; Bozena Kaminska


    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.
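
    To make the delay computation concrete, here is a minimal sketch (not PREDICT itself) of timing analysis as a longest-path search over a gate-level DAG, the basic calculation a hierarchical analyzer performs at each abstraction level. The netlist and gate delays below are invented.

```python
# Critical-path delay of a combinational circuit modeled as a DAG.
# Gate delays (ns) and wiring are hypothetical.

from collections import defaultdict, deque

def critical_path_delay(edges, delay):
    """edges: (src, dst) wires; delay: per-gate propagation delay in ns."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    arrival = {n: d for n, d in delay.items()}   # arrival time at gate output
    queue = deque(n for n in delay if indeg[n] == 0)
    while queue:                                 # Kahn topological order
        u = queue.popleft()
        for v in succ[u]:
            arrival[v] = max(arrival[v], arrival[u] + delay[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return max(arrival.values())

gates = {"in": 0.0, "nand1": 0.3, "nand2": 0.3, "xor1": 0.5, "out": 0.1}
wires = [("in", "nand1"), ("in", "nand2"), ("nand1", "xor1"),
         ("nand2", "xor1"), ("xor1", "out")]
print(critical_path_delay(wires, gates))  # → 0.9
```

    A hierarchical tool would run this per block, then replace each block by a single abstracted delay at the next level up.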

  6. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Tilton Susan C


    Full Text Available Abstract Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein-coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for the analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis-generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single

  7. Surface analysis of stone and bone tools

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.


    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. 
The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  8. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J


    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented, simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline the model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers the non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats the entire domain can be covered by complementary standards and subsequently the corresponding tools. PMID:26677194

  9. Expert systems as decision tools

    The feasibility of using expert systems as an aid in regulatory compliance functions has been investigated. A literature review was carried out to identify applications of expert systems to regulatory affairs. A bibliography of the small literature on such applications was prepared. A prototype system, ARIES, was developed to demonstrate the use of an expert system as an aid to a Project Officer in assuring compliance with licence requirements. The system runs on a personal computer with a graphical interface. Extensive use is made of hypertext to link interrelated rules and requirements as well as to provide an explanation facility. Based on the performance of ARIES the development of a field version is recommended

  10. An Integrated Traverse Planner and Analysis Tool for Planetary Exploration

    Johnson, Aaron William; Hoffman, Jeffrey A.; Newman, Dava; Mazarico, Erwan Matias; Zuber, Maria


    Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. The Massachusetts Institute of Technology is currently developing such a system, called the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT). The goal of this system is twofold: to allow for realistic simulations of traverses in order to assist with har...

  11. Tools for voltage stability analysis, including a probabilistic approach

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)


    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions, for a given system. (author) 43 refs., 8 figs., 8 tabs.
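
    A hedged sketch of the probabilistic-framework idea: wrap a deterministic stability-margin calculation in a Monte Carlo loop over uncertain load levels. The margin function and load statistics below are invented for illustration, not taken from the paper.

```python
# Probability of voltage instability from a deterministic margin tool
# sampled over random load scenarios (all numbers hypothetical).

import random

def stability_margin(load_mw):
    """Stand-in for a deterministic tool: margin to voltage collapse, in MW."""
    collapse_load = 1200.0                    # hypothetical collapse point
    return collapse_load - load_mw

def risk_of_instability(mean_load, sd_load, n=100_000, seed=42):
    """Estimated probability that the margin goes negative."""
    rng = random.Random(seed)
    unstable = sum(
        1 for _ in range(n)
        if stability_margin(rng.gauss(mean_load, sd_load)) < 0.0
    )
    return unstable / n

print(risk_of_instability(mean_load=1100.0, sd_load=60.0))
```

    The same wrapper works around any deterministic tool that returns a scalar margin, which is what lets existing planning tools be "linked together" probabilistically.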

  12. Built Environment Energy Analysis Tool Overview (Presentation)

    Porter, C.


    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  13. The second iteration of the Systems Prioritization Method: A systems prioritization and decision-aiding tool for the Waste Isolation Pilot Plant: Volume 3, Analysis for final programmatic recommendations

    Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories for the US DOE Carlsbad Area Office (DOE/CAO). This tool provides an analytical basis for programmatic decision making for the Waste Isolation Pilot Plant (WIPP). SPM integrates decision-analysis techniques, performance- and risk-assessment tools, and advanced information technology. Potential outcomes of proposed activities and combinations of activities are used to calculate a probability of demonstrating compliance (PDC) with selected regulations. The results are presented in a decision matrix showing cost, duration, and maximum PDC for all activities in a given cost and duration category. This is the third and final volume in the series, which presents the analysis for final programmatic recommendations.

  14. The second iteration of the Systems Prioritization Method: A systems prioritization and decision-aiding tool for the Waste Isolation Pilot Plant: Volume 3, Analysis for final programmatic recommendations

    Prindle, N.H.; Boak, D.M.; Weiner, R.F. [and others


    Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories for the US DOE Carlsbad Area Office (DOE/CAO). This tool provides an analytical basis for programmatic decision making for the Waste Isolation Pilot Plant (WIPP). SPM integrates decision-analysis techniques, performance- and risk-assessment tools, and advanced information technology. Potential outcomes of proposed activities and combinations of activities are used to calculate a probability of demonstrating compliance (PDC) with selected regulations. The results are presented in a decision matrix showing cost, duration, and maximum PDC for all activities in a given cost and duration category. This is the third and final volume in the series, which presents the analysis for final programmatic recommendations.
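
    The decision-matrix idea can be sketched in a few lines. Everything below (the activities, their costs, durations, PDC contributions, and the additive scoring) is hypothetical, not WIPP data or SPM's actual model; it only shows the shape of "maximum PDC within a cost/duration category."

```python
# Toy SPM-style decision matrix: pick the activity set with the highest
# probability of demonstrating compliance (PDC) under budget/time limits.

from itertools import combinations

activities = {                      # name: (cost M$, duration yr, PDC gain)
    "A": (2.0, 1.0, 0.15),
    "B": (3.5, 2.0, 0.25),
    "C": (1.0, 0.5, 0.05),
}
BASE_PDC = 0.50                     # hypothetical baseline compliance probability

def decision_matrix(budget, horizon):
    best = (BASE_PDC, ())
    for r in range(1, len(activities) + 1):
        for combo in combinations(activities, r):
            cost = sum(activities[a][0] for a in combo)
            dur = max(activities[a][1] for a in combo)   # activities in parallel
            pdc = min(1.0, BASE_PDC + sum(activities[a][2] for a in combo))
            if cost <= budget and dur <= horizon and pdc > best[0]:
                best = (pdc, combo)
    return best

pdc, chosen = decision_matrix(budget=5.0, horizon=2.0)
print(chosen, pdc)
```

    With these made-up numbers the search prefers {B, C}: activity A alone fits the budget, but the B+C combination yields a higher PDC at the same limits.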

  15. Marine Machinery Systems - Tools and Architecture

    Sandbakken, Egil Christoffer


    The thesis presents tools and architecture for the design of marine machinery systems (MSs) in OSVs. It highlights important aspects of the design based on a research study, and proposes a design methodology consisting of tools and architecture. From the research studies in chapter 2 it becomes clear that the most common propulsion system today for platform supply vessels (PSVs) is the diesel-electric (DEL) propulsion system. Other concepts, such as dual-fuel engines, Voith Schneider Propellers (VSP), hyb...

  16. PFN tool test and calibration system

    A system has been developed for the functional testing and neutron output calibration of the PFN (Prompt Fission Neutron) Uranium Logging Tool. The system was designed primarily for field work and consists of a special vehicle as well as test apparatus. Only the pertinent instrumentation is described. This document will serve as an Instruction and Test Equipment service manual for those involved with calibration of the neutron output of the PFN tool

  17. Tools for analysis of Dirac structures on Banach spaces

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran


    Power-conserving and Dirac structures are known as an approach to the mathematical modeling of physical engineering systems. In this paper, connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework.

  18. Interval analysis on non-linear monotonic systems as an efficient tool to optimise fresh food packaging

    Destercke, Sebastien; Guillard, Valérie


    International audience When few data or information are available, the validity of studies performing uncertainty analysis or robust design optimisation (i.e., parameter optimisation under uncertainty) with a probabilistic approach is questionable. This is particularly true in some agronomical fields, where parameter and variable uncertainties are often quantified by a handful of measurements or by expert opinions. In this paper, we propose a simple alternative approach based on interval a...
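
    A minimal sketch of the interval idea in the abstract (the packaging model and parameter ranges below are invented): for a function that is monotonic in each input, the exact output interval is obtained by evaluating just two "extreme" corners, so no probabilistic sampling is required.

```python
# Interval propagation through a monotonic toy model of oxygen ingress
# into a food package (all formulas and numbers are hypothetical).

def o2_ingress(permeability, area, thickness, days):
    """Increasing in permeability, area, and time; decreasing in thickness."""
    return permeability * area * days / thickness

def interval_o2(perm, area, thick, days):
    """Each argument is a (lo, hi) interval; exploit monotonicity."""
    lo = o2_ingress(perm[0], area[0], thick[1], days[0])  # thickness reversed
    hi = o2_ingress(perm[1], area[1], thick[0], days[1])
    return lo, hi

bounds = interval_o2(perm=(1.0, 2.0), area=(0.03, 0.05),
                     thick=(40e-6, 60e-6), days=(5, 10))
print(bounds)
```

    The interval on the output is guaranteed to contain every value reachable from the input intervals, which is exactly the robustness statement a handful of measurements or expert bounds can support.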

  19. Applied regression analysis a research tool

    Pantula, Sastry; Dickey, David


    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
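
    As a hedged illustration of the book's core tool (the data points are made up), here is ordinary least squares for a single predictor, y = b0 + b1·x, solved in closed form from the normal equations.

```python
# Simple linear regression via the closed-form least-squares solution.

def least_squares(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b1 = sxy / sxx                      # slope
    b0 = my - b1 * mx                   # intercept
    return b0, b1

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]        # roughly y = 2x plus noise
b0, b1 = least_squares(xs, ys)
print(f"y = {b0:.3f} + {b1:.3f} x")
```

    The same normal-equations logic generalizes to multiple predictors as a matrix solve, which is where the book's deeper treatment of the regression concepts pays off.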

  20. Accelerator physics analysis with interactive tools

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  1. Harnessing VLSI System Design with EDA Tools

    Kamat, Rajanish K; Gaikwad, Pawan K; Guhilot, Hansraj


    This book explores various dimensions of EDA technologies for achieving different goals in VLSI system design. Although the scope of EDA is very broad and comprises diversified hardware and software tools to accomplish different phases of VLSI system design, such as design, layout, simulation, testability, prototyping and implementation, this book focuses only on demystifying the code, a.k.a. firmware development and its implementation with FPGAs. Since there are a variety of languages for system design, this book covers various issues related to VHDL, Verilog and System C synergized with EDA tools, using a variety of case studies such as testability, verification and power consumption. * Covers aspects of VHDL, Verilog and Handel C in one text; * Enables designers to judge the appropriateness of each EDA tool for relevant applications; * Omits discussion of design platforms and focuses on design case studies; * Uses design case studies from diversified application domains such as network on chip, hospital on...

  2. Statistical Tools for Forensic Analysis of Toolmarks

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser


    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  3. Two energy system analysis - cases

    Lund, Henrik; Antonoff, Jayson; Andersen, Anders N.


    The chapter presents two cases of energy system analysis, illustrating the types of tools and methodologies presently being used for these studies in Denmark and elsewhere.......The chapter presents two cases of energy system analysis, illustrating the types of tools and methodologies presently being used for these studies in Denmark and elsewhere....

  4. Systems engineering and analysis

    Blanchard, Benjamin S


    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  5. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Mihaela-Laura IVAN; Mircea Raducu TRIFU; Manole VELICANU; Cristian CIUREA


    The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population growth and many other aspects. Business Intelligence solutions applied for the current analysis demonstrate the benefits brought by the new tools, such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. In deta...

  6. A computer aided engineering tool for ECLS systems

    Bangham, Michal E.; Reuter, James L.


    The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analysis interface used is graphics-based, and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.

  7. Ethics Auditing and Conflict Analysis as Management Tools

    Anu Virovere; Merle Rihma


    This paper deals with management tools like conflict analysis and ethics auditing. Ethics auditing is understood as an opportunity and agreement to devise a system to inform on ethical corporate behaviour. This system essentially aims to increase the transparency and credibility of a company's commitment to ethics. At the same time, the process of elaborating this system allows us to introduce the moral dimension into the company's actions and decisions, thereby completing a key dimension of ...

  8. A tool for subjective analysis of TTOs

    Resende, David Nunes; Gibson, David V.; Jarrett, James


    The objective of this article is to present a proposal (working paper) for a quantitative analysis tool to help technology transfer offices (TTOs) improve their structures, processes and procedures. Our research started from the study of internal practices and structures that facilitate the interaction between R&D institutions, their TTOs and regional surroundings. We wanted to identify “bottlenecks” in those processes, procedures, and structures. We mapped the bottlenecks in a set of “...

  9. Risk Analysis Based on Performance Criteria: A Food Safety Control System and Decision-making Tool to Control Salmonella from Whole Broilers

    Alshuniaber, Mohammad A. F.


    Risk analysis is a powerful science-based tool that can be used to control and mitigate microbial food safety hazards. Codex recommends conducting preliminary risk management activities (PRMAs) to initiate risk analysis and to plan the risk assessment process. The information learned from these PRMAs should be utilized to construct a quantitative microbial risk assessment (QMRA) model. Then, risk management activities can utilize the QMRA model to identify and select microbial risk management...

  10. Design tools for complex dynamic security systems.

    Byrne, Raymond Harry; Rigdon, James Brian; Rohrer, Brandon Robinson; Laguna, Glenn A.; Robinett, Rush D., III; Groom, Kenneth Neal; Wilson, David Gerald; Bickerstaff, Robert J.; Harrington, John J.


    The development of tools for complex dynamic security systems is not a straightforward engineering task but, rather, a scientific task where discovery of new scientific principles and math is necessary. For years, scientists have observed complex behavior but have had difficulty understanding it. Prominent examples include: insect colony organization, the stock market, molecular interactions, fractals, and emergent behavior. Engineering such systems will be an even greater challenge. This report explores four tools for engineered complex dynamic security systems: Partially Observable Markov Decision Processes, Percolation Theory, Graph Theory, and Exergy/Entropy Theory. Additionally, enabling hardware technologies for next-generation security systems are described: a 100-node wireless sensor network, an unmanned ground vehicle, and an unmanned aerial vehicle.
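
    As a toy illustration of one of the four tools named, percolation theory (the grid size and node-failure probability are invented, not from the report): estimate the chance that a grid of sensor nodes still connects top to bottom when individual nodes fail at random.

```python
# Monte Carlo site-percolation estimate for a square sensor grid.

import random

def percolates(grid):
    """True if live cells (True) connect the top row to the bottom row."""
    n = len(grid)
    stack = [(0, c) for c in range(n) if grid[0][c]]
    seen = set(stack)
    while stack:                                  # depth-first flood fill
        r, c = stack.pop()
        if r == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

def connectivity_estimate(n, p_alive, trials=2000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p_alive for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

print(connectivity_estimate(n=10, p_alive=0.7))
```

    Because 0.7 is above the square-lattice site-percolation threshold (about 0.593), most random grids stay connected; dropping p_alive below the threshold collapses the estimate toward zero.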

  11. A Collaborative Analysis Tool for Integrated Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Stanley, Thomas Troy; Alexander, Reginald; Landrum, Brian


    the process may be repeated altering the trajectory or some other input to reduce the TPS mass. E-PSURBCC is an "engine performance" model and requires the specification of inlet air static temperature and pressure as well as Mach number (which it pulls from the HYFIM and POST trajectory files), and calculates the corresponding stagnation properties. The engine air flow path geometry includes inlet, a constant area section where the rocket is positioned, a subsonic diffuser, a constant area afterburner, and either a converging nozzle or a converging-diverging nozzle. The current capabilities of E-PSURBCC ejector and ramjet mode treatment indicated that various complex flow phenomena including multiple choking and internal shocks can occur for combinations of geometry/flow conditions. For a given input deck defining geometry/flow conditions, the program first goes through a series of checks to establish whether the input parameters are sound in terms of a solution path. If the vehicle/engine performance fails mission goals, the engineer is able to collaboratively alter the vehicle moldline to change aerodynamics, or trajectory, or some other input to achieve orbit. The problem described is an example of the need for collaborative design and analysis. RECIPE is a cross-platform application capable of hosting a number of engineers and designers across the Internet for distributed and collaborative engineering environments. Such integrated system design environments allow for collaborative team design analysis for performing individual or reduced team studies. To facilitate the larger number of potential runs that may need to be made, RECIPE connects the computer codes that calculate the trajectory data, aerodynamic data based on vehicle geometry, heat rate data, TPS masses, and vehicle and engine performance, so that the output from each tool is easily transferred to the model input files that need it.

  12. Database tools for enhanced analysis of TMX-U data

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  14. JAVA based LCD Reconstruction and Analysis Tools

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  16. Waste flow analysis and life cycle assessment of integrated waste management systems as planning tools: Application to optimise the system of the City of Bologna.

    Tunesi, Simonetta; Baroni, Sergio; Boarini, Sandro


    The results of this case study are used to argue that waste management planning should follow a detailed process, adequately confronting the complexity of the waste management problems and the specificity of each urban area and of regional/national situations. To support the development or completion of integrated waste management systems, this article proposes a planning method based on: (1) the detailed analysis of waste flows and (2) the application of a life cycle assessment to compare alternative scenarios and optimise solutions. The evolution of the City of Bologna waste management system is used to show how this approach can be applied to assess which elements improve environmental performance. The assessment of the contribution of each waste management phase in the Bologna integrated waste management system has proven that the changes applied from 2013 to 2017 result in a significant improvement of the environmental performance, mainly as a consequence of the optimised integration between materials and energy recovery: Global Warming Potential at 100 years (GWP100) diminishes from 21,949 to -11,169 t CO2-eq y(-1) and abiotic resources depletion from -403 to -520 t antimony-eq. y(-1). This study analyses the collection phase in great detail. Outcomes provide specific operational recommendations to policy makers, showing: (a) the relevance of the choice of the materials forming the bags for 'door-to-door' collection (for non-recycled low-density polyethylene bags, 22 kg CO2-eq (tonne of waste)(-1)); (b) the relatively low environmental impacts associated with underground tanks (3.9 kg CO2-eq (tonne of waste)(-1)); (c) the relatively low impact of big street containers with respect to plastic bags (2.6 kg CO2-eq (tonne of waste)(-1)). PMID:27170193
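
    The size of the reported GWP100 improvement can be checked with simple arithmetic, assuming (as the sign convention in the abstract suggests) that a negative total represents net avoided emissions credited to materials and energy recovery:

```python
# Back-of-envelope check of the reported Bologna figures (assumed
# interpretation: negative totals are net avoided emissions).
gwp_2013 = 21_949    # t CO2-eq per year
gwp_2017 = -11_169   # t CO2-eq per year (net credit)
improvement = gwp_2013 - gwp_2017  # total avoided, t CO2-eq per year
```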

  17. DFTCalc: a tool for efficient fault tree analysis (extended version)

    Arnold, Florian; Belinfante, Axel; Berg, de, MT Mark; Guck, Dennis; Stoelinga, Mariëlle


    Effective risk management is key to ensuring that our nuclear power plants, medical equipment, and power grids are dependable, and is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ana...
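
    The core reliability computation of static fault tree analysis can be sketched in a few lines. This is a minimal illustration only: the tree and probabilities below are hypothetical, basic events are assumed independent, and DFTCalc itself handles far richer models (dynamic gates, repairs, compact state-space representations).

```python
# Minimal static fault tree sketch: top-event probability from
# independent basic events combined through AND/OR gates.
from math import prod

def p_and(children):   # gate fails only if all children fail
    return prod(children)

def p_or(children):    # gate fails if any child fails
    return 1.0 - prod(1.0 - p for p in children)

# Hypothetical system: fails if the pump fails OR both redundant valves fail.
p_pump = 0.01
p_valve = 0.05
p_top = p_or([p_pump, p_and([p_valve, p_valve])])
```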

  18. DFTCalc: a tool for efficient fault tree analysis

    Arnold F.; Belinfante A.; Van Der Berg F.; Guck D.; Stoelinga M.


    Effective risk management is key to ensuring that our nuclear power plants, medical equipment, and power grids are dependable, and it is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ...


  19. Failure Environment Analysis Tool (FEAT)

    Pack, G.


    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
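
    The failure propagation FEAT performs over a digraph can be sketched as reachability in a directed graph. This toy model uses hypothetical node names and a single-cause edge semantics ("failure of u causes failure of v"); FEAT's digraphs can also express redundancy (AND conditions), which this sketch omits.

```python
# Sketch of digraph failure propagation: breadth-first reachability
# from an initial set of failed nodes.
from collections import deque

def propagate(edges, failed):
    """Return every node reachable from the initially failed set."""
    reached = set(failed)
    queue = deque(failed)
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, ()):
            if nxt not in reached:
                reached.add(nxt)
                queue.append(nxt)
    return reached

# Hypothetical model: each power bus drives one pump; both pumps feed a loop.
edges = {"bus_A": ["pump_1"], "bus_B": ["pump_2"],
         "pump_1": ["coolant_loop"], "pump_2": ["coolant_loop"]}
affected = propagate(edges, {"bus_A"})
```

    Running the same query in reverse over transposed edges would give the possible-causes analysis the abstract describes.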

  20. LINAC quality assurance tool for EPID systems

    Purpose: There are dual purposes for this work: (1) to establish the clinical feasibility of using a commercial LINAC and scanning-liquid ion chamber electronic portal imaging device (SLIC-EPID) for performing quality control checks; and (2) to demonstrate a convenient software tool for performing the tasks. Specifically, our objectives pertain to beam flatness and symmetry, output constancy, and enhanced dynamic wedge field verification measurements. Materials and Methods: A Varian C-series accelerator with an SAD of 100 cm was used along with PortalVision(TM) Dosimetry Research Mode experimental software. The EPID was calibrated according to the equipment manufacturer's specifications. For the output constancy and flatness/symmetry checks, a portal image was taken at dmax using a 20 x 20 cm2 field for both 6 MV and 18 MV photons. Dose measurements were made at five a priori locations in the image relative to the central axis (one at the central axis, two at either end of the in-plane axis, and two at either end of the cross-plane axis). For both checks, the portal imaging system was operated in normal clinical mode and in special dosimetry mode. The central axis value was taken as an average of an array of pixels. Output constancy was established by comparing central axis values over five measurements. Verification of the enhanced dynamic wedge fields was performed with the portal imaging system operating in special dosimetry research mode. Line images were taken at dmax using 15, 30, 45 and 60 degree wedge settings with a field size of 20 x 20 cm2 for both energies. An original delivered wedge profile was established as a reference image, and five other images were taken and compared with least-squares analysis to quantify the differences. In order to perform routine LINAC quality checks using a portal imaging system, it must be quick and convenient. However, commercial implementations currently lack
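
    The flatness, symmetry, and least-squares wedge comparisons described above can be sketched as follows. The formulas and profile values are illustrative only (made-up relative-dose numbers, not the authors' software or a specific protocol's definitions):

```python
# Illustrative beam-profile QA checks on hypothetical relative-dose values.
def flatness(profile):
    """Percent variation over the profile: (max - min) / (max + min) * 100."""
    hi, lo = max(profile), min(profile)
    return (hi - lo) / (hi + lo) * 100.0

def symmetry(left, right):
    """Max percent difference between mirrored off-axis point pairs."""
    return max(abs(l - r) / ((l + r) / 2) * 100.0 for l, r in zip(left, right))

def sum_sq_diff(profile, reference):
    """Least-squares figure of merit against a reference profile."""
    return sum((p - r) ** 2 for p, r in zip(profile, reference))

inplane = [98.0, 100.0, 99.0]        # left edge, central axis, right edge
flat = flatness(inplane)
sym = symmetry([98.0], [99.0])       # mirrored off-axis points
wedge_diff = sum_sq_diff([60.0, 80.0, 100.0], [61.0, 79.0, 100.0])
```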

  1. 3rd Annual Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Face-to-Face Meeting Report December 2013

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access and analysis to observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.

  2. Microfracturing and new tools improve formation analysis

    McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))


    This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wireline logs, and temperature logging in air-filled holes, which are new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are usually created at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open hole parameters for designing the main fracture treatment.


    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U.S. Department of Energy (DOE) regulations and directives related to safeguards and security (S and S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire S and S directives set will be conducted using software tools to analyze the present directives with a view toward (1) identifying areas of positive synergism among topical areas, (2) identifying areas of unnecessary duplication within and among topical areas, and (3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

  4. Sociology and Systems Analysis

    Becker, H.A.


    The Management and Technology (MMT) Area of IIASA organizes, from time to time, seminars on topics that are of interest in connection with the work at the Institute. Since MMT sees the importance of investigating the broader management aspects when using systems analytical tools, it was of great interest to have Professor Henk Becker from the University of Utrecht give a seminar on "Sociology of Systems Analysis". As his presentation at this seminar should be of interest to a wider audie...

  5. Tool, weapon, or white elephant? A realist analysis of the five phases of a twenty-year programme of occupational health information system implementation in the health sector

    Spiegel Jerry M


    Full Text Available Abstract Background Although information systems (IS have been extensively applied in the health sector worldwide, few initiatives have addressed the health and safety of health workers, a group acknowledged to be at high risk of injury and illness, as well as in great shortage globally, particularly in low and middle-income countries. Methods Adapting a context-mechanism-outcome case study design, we analyze our team’s own experience over two decades to address this gap: in two different Canadian provinces; and two distinct South African settings. Applying a realist analysis within an adapted structuration theory framing sensitive to power relations, we explore contextual (socio-political and technological characteristics and mechanisms affecting outcomes at micro, meso and macro levels. Results Technological limitations hindered IS usefulness in the initial Canadian locale, while staffing inadequacies amid pronounced power imbalances affecting governance restricted IS usefulness in the subsequent Canadian application. Implementation in South Africa highlighted the special care needed to address power dynamics regarding both worker-employer relations (relevant to all occupational health settings and North–south imbalances (common to all international interactions. Researchers, managers and front-line workers all view IS implementation differently; relationships amongst the workplace parties and between community and academic partners have been pivotal in determining outcome in all circumstances. Capacity building and applying creative commons and open source solutions are showing promise, as is international collaboration. Conclusions There is worldwide consensus on the need for IS use to protect the health workforce. However, IS implementation is a resource-intensive undertaking; regardless of how carefully designed the software, contextual factors and the mechanisms adopted to address these are critical to mitigate threats and achieve

  6. Applying AI tools to operational space environmental analysis

    Krajnak, Mike; Jesse, Lisa; Mucks, John


    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  7. Comparative guide to emerging diagnostic tools for large commercial HVAC systems

    Friedman, Hannah; Piette, Mary Ann


    This guide compares emerging diagnostic software tools that aid detection and diagnosis of operational problems for large HVAC systems. We have evaluated six tools for use with energy management control system (EMCS) or other monitoring data. The diagnostic tools summarize relevant performance metrics, display plots for manual analysis, and perform automated diagnostic procedures. Our comparative analysis presents nine summary tables with supporting explanatory text and includes sample diagnostic screens for each tool.

  8. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    SHI Peizhi; LI Sanli


    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high-performance computing. A performance tool, which typically collects and analyzes performance data, is an effective way of improving performance. This paper explores both the collection and the analysis of performance data, and proposes two ideas: both types of runtime performance data, concerning system load and application behavior, should be collected simultaneously, which requires multiple instrumentation flows at low probing cost; and the performance analysis should be Web-oriented, to exploit the portability and usability brought by the Internet. The paper presents a Web-oriented HPC (high-performance computing) performance tool that collects information about both resource utilization, including CPU and memory utilization ratios, and program behavior at runtime, including statuses such as sending and computing, and visualizes this information in the user's browser window with Java applets offering multiple filters and multiple views. Furthermore, the tool exposes the data dependencies between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program progress and system load, locate the performance bottleneck, and ultimately improve the performance of the application.

  9. Application Analysis of Live Line Tool Automatic Management System

    曹国文; 蒋标


    In view of the complicated procedures for receiving and returning tools, the long time consumed, and the low work efficiency in the live line tool storehouse of the Bayannur Electric Power Bureau, a live line tool automatic management system based on RFID was adopted. The RFID system simplifies the procedures for checking tools out of and into the storehouse. In operation, staff need only carry tools fitted with radio-frequency tags through doors equipped with radio-frequency readers; the system records the staff member, the time, the names of the tools taken out (or returned), and their quantities, and automatically uploads and stores this information on the live working tool warehouse computer. Staff can query check-out, return, and stock information on that computer, or from office computers through the private network, with no manual registration. This not only saves working time and improves work efficiency, but also ensures the safe use of the tools.
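
    The transaction logic described can be sketched as a simple event log keyed on tool and direction. The schema, names, and times below are hypothetical, not the deployed system's database:

```python
# Toy sketch of an RFID gate transaction log: each read records who
# moved which tool, when, and in which direction; current stock state
# is derived by replaying the log.
from datetime import datetime

log = []

def gate_read(worker, tool, direction, when):
    """Record one gate event; direction is 'OUT' or 'IN'."""
    log.append({"worker": worker, "tool": tool,
                "direction": direction, "time": when})

def checked_out():
    """Tools currently out: more OUT events than IN events."""
    balance = {}
    for rec in log:
        delta = 1 if rec["direction"] == "OUT" else -1
        balance[rec["tool"]] = balance.get(rec["tool"], 0) + delta
    return {tool for tool, n in balance.items() if n > 0}

gate_read("Zhang", "insulated ladder", "OUT", datetime(2024, 5, 1, 8, 30))
gate_read("Zhang", "hot stick", "OUT", datetime(2024, 5, 1, 8, 31))
gate_read("Zhang", "hot stick", "IN", datetime(2024, 5, 1, 17, 2))
out_now = checked_out()
```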

  10. Setup Analysis: Combining SMED with Other Tools

    Stadnicka Dorota


    Full Text Available The purpose of this paper is to propose a methodology for setup analysis which can be implemented mainly in small and medium enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research that identified the problem: companies still struggle with long setup times, yet many do nothing to decrease them, because a long setup alone is not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The methodology proposed can genuinely encourage the management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; there, the goal is to convince the management to begin actions concerning setup improvement. The last three steps relate to a particular setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis, FMEA and others were used.
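
    The Pareto step of such a setup analysis can be sketched as ranking setup activities by duration and extracting the "vital few" that account for most of the total time. The activities and timings below are hypothetical:

```python
# Pareto analysis sketch: find the smallest set of setup activities
# covering a given share (default 80%) of total setup time.
def pareto_cut(durations, threshold=0.8):
    total = sum(durations.values())
    picked, running = [], 0.0
    for name, t in sorted(durations.items(), key=lambda kv: kv[1], reverse=True):
        picked.append(name)
        running += t
        if running / total >= threshold:
            break
    return picked

# Hypothetical changeover timings in minutes.
setup = {"find tooling": 30, "adjust fixture": 25, "trial runs": 20,
         "clean machine": 10, "paperwork": 5}
vital_few = pareto_cut(setup)
```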

  11. Medical decision making tools: Bayesian analysis and ROC analysis

    During the diagnostic process of the various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions?' To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concept of the ORAD program.
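
    The Bayesian step referred to above can be made concrete: sensitivity, specificity, and the pre-test probability (prevalence) combine via Bayes' theorem into a post-test probability of disease. The numbers below are illustrative only:

```python
# Bayes' theorem for diagnostic tests: post-test probability of disease
# given a positive or negative result.
def post_test_prob(prevalence, sensitivity, specificity, positive=True):
    if positive:
        tp = prevalence * sensitivity            # true positives
        fp = (1 - prevalence) * (1 - specificity)  # false positives
        return tp / (tp + fp)                    # P(disease | positive)
    fn = prevalence * (1 - sensitivity)          # false negatives
    tn = (1 - prevalence) * specificity          # true negatives
    return fn / (fn + tn)                        # P(disease | negative)

# Illustrative case: 10% prevalence, 90% sensitive, 80% specific test.
p = post_test_prob(prevalence=0.10, sensitivity=0.90, specificity=0.80)
```

    Even with a fairly accurate test, a positive result here raises the probability of disease only to about one in three, which is exactly the kind of insight such decision tools are meant to convey.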

  12. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Kurovskaya Yulia G.


    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology, which is concerned with students' acquisition of culture encoded in symbols and with the way students' sign consciousness, formed in the context of learning, affects their cognition of the world and their interpersonal communication. The article introduces a set of tools that enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy, and with the formation of the younger generation's language picture of the world.

  13. Software tools for microprocessor based systems

    After a short review of the hardware and/or software tools for the development of single-chip, fixed-instruction-set microprocessor-based systems, we focus on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer and architecture levels. We review available meta-microassemblers, giving their most important features, advantages and disadvantages. We also consider extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities we first discuss the simulation objectives and the criteria for choosing the right simulation language. We concentrate on simulation facilities already used in bit-slice projects and discuss the experience gained. We conclude by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the design of a fast microprogrammed machine, called MICE, made out of ECL bit-slices. (orig.)

  14. Cryogenic Propellant Feed System Analytical Tool Development

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.


    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access the real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end user interface was implemented to provide convenient portability of PFSAT among a wide variety of potential users, taking advantage of a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
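
    As a rough illustration of the kind of heat-leak estimate such a tool automates, the textbook formula for steady radial conduction through a cylindrical insulation layer can be applied to one line segment. The dimensions and effective conductivity below are assumed for illustration; they are not PFSAT's models, which also account for supports, penetrations, and instrumentation:

```python
# Steady radial conduction through cylindrical insulation:
#   Q = 2*pi*k*L*(T_warm - T_cold) / ln(r_outer / r_inner)
import math

def heat_leak_w(k, length, r_inner, r_outer, t_warm, t_cold):
    """Heat leak in watts through one insulated cylindrical segment."""
    return 2 * math.pi * k * length * (t_warm - t_cold) / math.log(r_outer / r_inner)

# Assumed: 2 m segment, 25 mm OD pipe with 25 mm of insulation,
# ~20 K cryogen inside, 300 K ambient, effective k of 0.0002 W/(m*K).
q = heat_leak_w(k=0.0002, length=2.0, r_inner=0.0125, r_outer=0.0375,
                t_warm=300.0, t_cold=20.0)
```

    The result is a fraction of a watt per segment, which shows why summing many such contributions (plus supports and penetrations) in a parametric tool is useful for sizing a thermodynamic vent system.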

  15. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  16. Standardised risk analysis as a communication tool

    Full text of publication follows: Several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. The Holland-Italy natural gas transmission pipeline is a recent example where this method was successfully employed: although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  17. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)


    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  18. Automated Steel Cleanliness Analysis Tool (ASCAT)

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. 
Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet
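    As a rough illustration of the clustering step described above, a minimal sketch might sort individual inclusion analyses into coarse types by dominant chemistry. The thresholds and type names here are invented for illustration and are not ASCAT's actual classification rules:

```python
# Hypothetical sketch: sorting inclusion analyses into type clusters by
# dominant composition, in the spirit of ASCAT's automatic classification.
# Thresholds and class names are illustrative placeholders.

def classify_inclusion(composition):
    """Assign an inclusion to a coarse type from its wt% composition."""
    al = composition.get("Al", 0.0)
    ca = composition.get("Ca", 0.0)
    s = composition.get("S", 0.0)
    if s > 10.0:
        return "sulfide"
    if ca > 10.0 and al > 10.0:
        return "calcium aluminate"
    if al > 20.0:
        return "alumina"
    return "other"

def summarize_heat(inclusions):
    """Count inclusions per type to characterize a particular heat."""
    counts = {}
    for comp in inclusions:
        t = classify_inclusion(comp)
        counts[t] = counts.get(t, 0) + 1
    return counts

heat = [
    {"Al": 35.0, "Ca": 2.0, "S": 1.0},
    {"Al": 15.0, "Ca": 18.0, "S": 0.5},
    {"Al": 3.0, "Ca": 1.0, "S": 22.0},
]
print(summarize_heat(heat))  # → {'alumina': 1, 'calcium aluminate': 1, 'sulfide': 1}
```

    The per-type counts for a heat are what make the data statistically meaningful: a heat dominated by one cluster points at a specific process condition.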

  19. Software Tools to Support the Assessment of System Health

    Melcher, Kevin J.


    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of
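    The kind of constrained sensor-suite search S4 performs could be sketched as an exhaustive evaluation of sensor combinations. The performance metric, sensor names, costs, and budget constraint below are illustrative placeholders, not S4's actual interfaces:

```python
# Hypothetical sketch of a sensor-suite search: evaluate combinations against
# a user-supplied diagnostic-performance metric subject to a cost constraint.
from itertools import combinations

def best_sensor_suites(sensors, performance, cost, budget, suite_size):
    """Return the suites of a given size that maximize performance within budget."""
    best, best_score = [], float("-inf")
    for suite in combinations(sensors, suite_size):
        if sum(cost[s] for s in suite) > budget:
            continue  # violates the user-defined cost constraint
        score = performance(suite)
        if score > best_score:
            best, best_score = [suite], score
        elif score == best_score:
            best.append(suite)  # report all equally good suites
    return best, best_score

# Toy example: performance is the sum of per-sensor information values.
value = {"T1": 3.0, "T2": 2.0, "P1": 2.5, "N1": 1.0}
cost = {"T1": 5, "T2": 2, "P1": 3, "N1": 1}
suites, score = best_sensor_suites(
    ["T1", "T2", "P1", "N1"],
    performance=lambda suite: sum(value[s] for s in suite),
    cost=cost, budget=6, suite_size=2)
print(suites, score)  # → [('T2', 'P1')] 4.5
```

    A real diagnostic-performance metric would come from the user-defined diagnostic system rather than a per-sensor lookup table, and smarter search techniques can replace the exhaustive loop for large sensor sets.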

  20. Analysis and processing tools for nuclear trade related data

    This paper describes the development of a system used by the Nuclear Trade Analysis Unit of the Department of Safeguards for handling, processing, analyzing, reporting and storing nuclear trade related data. The data handling and analysis part of the system is already functional, but several additional features are being added to optimize its use. The aim is to develop the system in a manner that actively contributes to the management of the Department's overall knowledge and supports the departmental State evaluation process. Much of the data originates from primary sources and comes in many different formats and languages. It also comes with diverse security needs. The design of the system has to meet the special challenges set by the large volume and different types of data that need to be handled in a secure and reliable environment. Data is stored in a form appropriate for access and analysis in both structured and unstructured formats. The structured data is entered into a database (knowledge base) called the Procurement Tracking System (PTS). PTS allows effective linking, visualization and analysis of new data with that already included in the system. The unstructured data is stored in text-searchable folders (information base) equipped with indexing and search capabilities. Several other tools are linked to the system, including a visual analysis tool for structured information and a system for visualizing unstructured data, all of which are designed to help the analyst locate the specific information required amongst a myriad of unrelated information. This paper describes the system's concept, design and evolution, highlighting its special features and capabilities, including the need to standardize the data collection, entry and analysis processes. All this enables the analyst to approach tasks consistently and in a manner that both enhances teamwork and leads to the development of an institutional memory related to covert trade activities that can be

  1. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high-value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of thermal errors. In particular, there is the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
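    The temperature-to-error link sought above can be illustrated with a simple least-squares fit, so that the hard-to-measure error is estimated from an easy temperature reading during production. The calibration data below is invented for illustration and is not from the paper:

```python
# Illustrative sketch: fit a linear model linking an easily measured spindle
# temperature to the thermal displacement error, so the error can later be
# estimated from temperature alone. Data values are hypothetical.

def fit_linear(temps, errors):
    """Ordinary least-squares fit: error ~ a + b * temperature."""
    n = len(temps)
    mean_t = sum(temps) / n
    mean_e = sum(errors) / n
    b = sum((t - mean_t) * (e - mean_e) for t, e in zip(temps, errors)) \
        / sum((t - mean_t) ** 2 for t in temps)
    a = mean_e - b * mean_t
    return a, b

# Toy calibration data: spindle temperature (deg C) vs. measured error (um).
temps = [20.0, 25.0, 30.0, 35.0, 40.0]
errors = [0.0, 4.8, 10.1, 15.2, 19.9]
a, b = fit_linear(temps, errors)
print(f"error = {a:.2f} + {b:.2f} * T")
estimate = a + b * 32.0  # predict the error at 32 deg C without measuring it
```

    In practice a single sensor rarely suffices; condition-monitoring systems typically regress against several temperature points on the structure, but the principle is the same.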

  2. Discovery and New Frontiers Project Budget Analysis Tool

    Newhouse, Marilyn E.


    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent of total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real-year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect on the overall program budget of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently-budgeted lines) could easily be adapted to other applications.
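    The inflation-and-roll-up logic described above can be sketched as follows. The mission profiles and the flat 3% inflation rate are illustrative assumptions, not the tool's actual parameters:

```python
# Hypothetical sketch: spread each mission's fixed-year cost across years
# according to its spend profile, inflate to real-year dollars, and roll up
# the totals across all missions per year.

def real_year_costs(missions, base_year, inflation=0.03):
    """Return {year: total real-year cost} summed across all missions."""
    totals = {}
    for m in missions:
        for offset, fraction in enumerate(m["profile"]):
            year = m["start_year"] + offset
            fixed = m["total_cost"] * fraction          # fixed-year dollars
            real = fixed * (1 + inflation) ** (year - base_year)
            totals[year] = totals.get(year, 0.0) + real
    return totals

missions = [
    # profile: fraction of total mission cost spent in each successive year
    {"total_cost": 450.0, "start_year": 2025, "profile": [0.2, 0.5, 0.3]},
    {"total_cost": 300.0, "start_year": 2026, "profile": [0.4, 0.6]},
]
totals = real_year_costs(missions, base_year=2025)
for year in sorted(totals):
    print(year, round(totals[year], 2))
```

    Comparing each year's rolled-up total against the available budget line is then a simple per-year subtraction, which is what makes rapid "what-if" scenario comparison possible.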

  3. Analysis of machining and machine tools

    Liang, Steven Y


    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics such as single- and multiple-point cutting processes, grinding, component accuracy and metrology, shear stress in cutting, cutting temperature and analysis, and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  4. Cost analysis and estimating tools and techniques

    Nussbaum, Daniel


    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  5. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Mihaela-Laura IVAN


    Full Text Available The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population growth and many other aspects. The Business Intelligence solutions applied in the current analysis demonstrate the benefits brought by the new tools, such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. In detail, the birth rate is analyzed with the contribution of different factors worldwide.


    Chun-Hsiao Wu


    Full Text Available In this research, we propose a new concept for social media analysis called Social Sensor, which is an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media, with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and appropriate social sensors for analysis. By adopting parameter templates, one can quickly apply the experience of other experts at the beginning of a new case or even create one’s own templates. We have also modularized the analysis tools into two social sensors: Language Sensor and Text Sensor. A user evaluation was conducted and the results showed that the usefulness, modularity, reusability, and manageability of the system were all rated very positively. The results also show that this tool can greatly reduce the time needed to perform data analysis, solve the problems encountered in the traditional analysis process, and obtain useful results. The experimental results reveal that the concept of the social sensor and the proposed system design are useful for big data analysis of social media.

  7. Keel A Data Mining Tool: Analysis With Genetic

    Ms. Pooja Mittal


    Full Text Available This work is related to the KEEL (Knowledge Extraction based on Evolutionary Learning) tool, an open source software that supports data management and provides a platform for the analysis of evolutionary learning for Data Mining problems of different kinds, including regression, classification, and unsupervised learning. It includes a big collection of evolutionary learning algorithms based on different approaches: Pittsburgh, Michigan. It empowers the user to perform a complete analysis of any genetic fuzzy system in comparison to existing ones, with a statistical test module for comparison.

  8. Emulation tool of dynamic systems via internet

    Daniel Ruiz Olaya


    Full Text Available The experimentation laboratories for control system courses can be expensive, whether in acquisition, operation or maintenance. An alternative resource has been remote laboratories. However, it is not always possible to obtain complex systems. A solution to this problem is remote emulation laboratories. This paper describes the development of a Web application for the emulation of dynamic systems using a freely distributed rapid-control-prototyping software tool based on Linux/RTAI. The application is aimed especially at experimentation with dynamic systems that are not easily available in a laboratory, where the model has been configured by the user. The design of the front-end and the back-end is presented. The latency times of the real-time operating system and the ability of the system to reproduce signals similar to those of a real system from an emulated model were verified. As an example, the model of an evaporator was used to test the functionality of the application. One of the advantages of the application is its work methodology, which is based on the development of blocks in Scicos. This allows the user to reuse the parameters and code implemented to build a block in the Scicos toolbox with the Linux/RTAI/ScicosLab environment. Moreover, only a web browser and the Java Virtual Machine are required.

  9. Built Environment Analysis Tool: April 2013

    Porter, C.


    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  10. Multi-Spacecraft Analysis with Generic Visualization Tools

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.


    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.
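    The data-caching idea the abstract mentions (trading cheap memory for expensive bandwidth) can be sketched in Python. The function and dataset names here are hypothetical and are not SDDAS code:

```python
# Illustrative sketch: cache fetched data products in memory so repeated
# visualizations reuse local copies instead of re-downloading them.
from functools import lru_cache

CALLS = {"count": 0}  # tracks actual (non-cached) fetches

@lru_cache(maxsize=128)
def fetch_dataset(mission, instrument, day):
    """Stand-in for a remote retrieval; real code would query a data store."""
    CALLS["count"] += 1  # runs only on a cache miss
    return f"{mission}/{instrument}/{day}"  # placeholder payload

fetch_dataset("Cluster", "FGM", "2004-01-01")  # fetched over the network
fetch_dataset("Cluster", "FGM", "2004-01-01")  # served from the local cache
print(CALLS["count"])  # → 1
```

    Combined with worker threads for retrieval, this lets repeated slicing and re-plotting of the same data stay interactive without repeated network round trips.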