WorldWideScience

Sample records for analysis system tool

  1. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, as well as the heterogeneous and homogeneous interactions of the organically associated elements, must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to the existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations of mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
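
    The Newton-Raphson balance calculations mentioned above are easy to illustrate. The following minimal Python sketch is not EERC code; the heat-capacity coefficients and heat-release value are invented for illustration. It solves a one-unknown energy balance for an adiabatic flame temperature:

```python
# Minimal Newton-Raphson sketch for a one-unknown energy balance,
# in the spirit of the spreadsheet calculations described above.
# The polynomial heat-capacity coefficients are illustrative, not EERC data.

def energy_residual(T, heat_release=2.0e6, a=1000.0, b=0.15):
    """Residual of the energy balance: sensible enthalpy of products
    (heat capacity approximated as a + b*T in J/(kg*K)) minus heat released."""
    # Integral of cp = a + b*T from 298 K to T
    sensible = a * (T - 298.0) + 0.5 * b * (T**2 - 298.0**2)
    return sensible - heat_release

def d_residual_dT(T, a=1000.0, b=0.15):
    """Analytic derivative of the residual with respect to T."""
    return a + b * T

def newton_raphson(f, df, x0, tol=1e-6, max_iter=50):
    """Generic Newton-Raphson root finder."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

if __name__ == "__main__":
    T_ad = newton_raphson(energy_residual, d_residual_dT, x0=1500.0)
    print(f"Adiabatic flame temperature estimate: {T_ad:.1f} K")
```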

  2. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at analyzing the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  3. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    The applications of ultrasonic vibrations for material removal processes exist predominantly in the area of vertical processing of hard and brittle materials, because vertical vibrating oscillators provide the greatest direct penetration for grains to remove material from workpieces. For milling processes, however, vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to make use of ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to generate the best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  4. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  5. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...... of the graph. SaTool analyzes the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur....

  6. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  7. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were...... carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  8. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...

  9. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architectures and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems and with the analysis of results is assumed on the part of the intended user group. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  10. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  11. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  12. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  13. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  14. Cellular barcoding tool for clonal analysis in the hematopoietic system.

    Science.gov (United States)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J C; de Haan, Gerald; Bystrykh, Leonid V

    2010-04-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting or polymerase chain reaction-based methods. Although these methods are useful in principle, they generally provide a low-resolution, biased, and incomplete assessment of clonality. To overcome those limitations, we labeled retroviral vectors with random sequence tags or "barcodes." On integration, each vector introduces a unique, identifiable, and heritable mark into the host cell genome, allowing the clonal progeny of each cell to be tracked over time. By coupling the barcoding method to a sequencing-based detection system, we could identify major and minor clones in 2 distinct cell culture systems in vitro and in a long-term transplantation setting. In addition, we demonstrate how clonal analysis can be complemented with transgene expression and integration site analysis. This cellular barcoding tool permits a simple, sensitive assessment of clonality and holds great promise for future gene therapy protocols in humans, and any other applications when clonal tracking is important.
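
    As a rough illustration of the sequencing-based barcode detection described above, clonal composition can be estimated by tallying barcode frequencies. This is a hypothetical sketch, not the published pipeline; the barcode position, length, and example reads are invented:

```python
# Sketch of clonal-composition analysis from barcoded sequencing reads.
# Assumes each read carries a fixed-position random barcode; the positions and
# example reads below are hypothetical, not the published protocol.
from collections import Counter

def clone_frequencies(reads, barcode_start=10, barcode_len=8, min_reads=2):
    """Tally barcode occurrences and return each clone's fractional size."""
    counts = Counter(r[barcode_start:barcode_start + barcode_len] for r in reads)
    # Drop very rare barcodes, which are often sequencing errors.
    counts = {bc: n for bc, n in counts.items() if n >= min_reads}
    total = sum(counts.values())
    return {bc: n / total for bc, n in sorted(counts.items(), key=lambda kv: -kv[1])}

reads = ["ACGTACGTAC" + bc + "TTTT" for bc in
         ["AACCGGTT"] * 6 + ["GGTTAACC"] * 3 + ["TTGGCCAA"] * 1]
print(clone_frequencies(reads))  # major and minor clones by read fraction
```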

  15. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  16. Tool for Sizing Analysis of the Advanced Life Support System

    Science.gov (United States)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. ALSSAT can perform sizing analysis of ALS subsystems that operate dynamically as well as those that are steady-state in nature. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
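
    Equivalent System Mass, one of the metrics ALSSAT reports, folds non-mass resource demands into a single mass-equivalent figure. A minimal sketch follows, assuming the standard ALS formulation; the equivalency factors shown are illustrative placeholders, not ALSSAT values:

```python
# Sketch of the Equivalent System Mass (ESM) metric used in ALS trade studies.
# ESM converts volume, power, cooling, and crew-time requirements into a
# mass-equivalent total. The equivalency factors below are illustrative only.

def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crewtime_hr_per_yr, duration_yr,
                           v_eq=66.7,    # kg per m^3 of pressurized volume
                           p_eq=237.0,   # kg per kW of power generation
                           c_eq=60.0,    # kg per kW of heat rejection
                           ct_eq=0.5):   # kg per crew-hour
    return (mass_kg
            + volume_m3 * v_eq
            + power_kw * p_eq
            + cooling_kw * c_eq
            + crewtime_hr_per_yr * duration_yr * ct_eq)

print(f"ESM: {equivalent_system_mass(1200.0, 5.0, 3.5, 3.5, 100.0, 2.0):.0f} kg")
```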

  17. Stakeholder Analysis of an Executable Achitecture Systems Engineering (EASE) Tool

    Science.gov (United States)

    2013-06-21

    regression representations of more complex M&S tools.4 C2WindTunnel. C2 WindTunnel is a software test bed developed by George Mason for Command and...Technology Project, U.S. Marine Corps Systems Command.    5  Roth , Karen; Barrett, Shelby. 2009 (July). Command and Control Wind Tunnel Integration

  18. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in the Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
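
    One distortion measure such a tool can report is total harmonic distortion (THD). The sketch below is not code from the toolkit; the tanh clipper stands in for a distorting device. It estimates THD from the FFT of a distorted test tone:

```python
# Minimal sketch of one distortion measurement: total harmonic distortion (THD)
# of a nonlinearly distorted sine, estimated from its FFT. This illustrates the
# kind of analysis the toolkit automates; it is not the toolkit's own code.
import numpy as np

fs, f0, n = 48000, 1000, 48000          # sample rate, test tone, one second
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * f0 * t)
distorted = np.tanh(3.0 * clean)        # simple memoryless "tube-like" clipper

spectrum = np.abs(np.fft.rfft(distorted * np.hanning(n)))
fundamental = spectrum[f0]              # bin spacing is exactly 1 Hz here
harmonics = spectrum[[k * f0 for k in range(2, 6)]]
thd = np.sqrt(np.sum(harmonics**2)) / fundamental
print(f"THD (harmonics 2-5): {100 * thd:.1f}%")
```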

  19. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  20. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Recently, intelligent functions have become one of the major development items for machine tools. One crucial technology is machinery status monitoring, which is required for abnormal-condition warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit determines the most frequent and important operations, such as automatic tool changes. The vibration detection system includes the development of hardware and software, such as the vibration meter, signal acquisition card, data processing platform, and machine control program. Meanwhile, because of differences between mechanical configurations and the desired characteristics, it is difficult for a vibration detection system to be assembled directly from commercially available kits. For this reason, it was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine characteristics and states. The functional parts of the system were developed simultaneously. Finally, the conditions and parameters generated from both the states and the characteristics were entered into the developed system to verify its feasibility.

  1. Development of the Expert System Domain Advisor and Analysis Tool

    Science.gov (United States)

    1991-09-01

    analysis. Typical of the current methods in use at this time is the "TAROT metric". This method defines a decision rule whose output is whether to go...B - TAROT METRIC. B. INTRODUCTION. The system chart of ESEM, Figure 1, shows the following three risk-based decision points: 1. At project initiation...decisions. B-1. Evaluation Factors for ES Development: FACTORS / POSSIBLE VALUE RATINGS; TAROT metric (overall suitability): Poor, Fair

  2. Betweenness as a Tool of Vulnerability Analysis of Power System

    Science.gov (United States)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-12-01

    Complex network theory finds application in the analysis of power grids, as both share some common characteristics. Using this theory, critical elements in a power network can be identified. Because the vulnerability of the network's elements determines the vulnerability of the total network, this paper studies the vulnerability of each element using two complex network measures: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines, and the PTDF matrix are included. The standard IEEE 57-bus system has been studied using the above-mentioned indices, and conclusions are discussed.
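
    The topological half of this analysis is straightforward to reproduce. A minimal sketch using networkx follows, with a small invented graph standing in for the IEEE 57-bus system; extended betweenness would additionally require electrical distances, line limits, and the PTDF matrix:

```python
# Sketch of the purely topological part of the analysis: ranking power-network
# elements by betweenness centrality with networkx. The small graph below is
# illustrative, not IEEE 57-bus data.
import networkx as nx

G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 5), (2, 4), (5, 6), (3, 6)])

node_bc = nx.betweenness_centrality(G)        # bus criticality
edge_bc = nx.edge_betweenness_centrality(G)   # line criticality

for bus, score in sorted(node_bc.items(), key=lambda kv: -kv[1]):
    print(f"bus {bus}: betweenness = {score:.3f}")
```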

  3. Benchmarking expert system tools

    Science.gov (United States)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object-oriented, frame-based expert system tool. The benchmarks used for testing are studied.

  4. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external...

  5. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
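
    The levelized-cost core of such a model reduces to annualizing capital and dividing by energy delivered. A minimal sketch follows, with illustrative inputs rather than NETL or Sandia figures:

```python
# Sketch of a levelized cost of electricity (LCOE) calculation of the kind the
# model performs. All input values are illustrative, not NETL/Sandia figures.

def lcoe(capital_cost, fixed_om, fuel_cost_per_mwh, capacity_mw,
         capacity_factor, discount_rate, lifetime_yr):
    """Levelized cost in $/MWh: annualized capital plus O&M and fuel."""
    # Capital recovery factor annualizes the overnight capital cost.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr /
           ((1 + discount_rate) ** lifetime_yr - 1))
    annual_mwh = capacity_mw * capacity_factor * 8760
    return (capital_cost * crf + fixed_om) / annual_mwh + fuel_cost_per_mwh

print(f"{lcoe(1.2e9, 4.0e7, 25.0, 500, 0.85, 0.07, 30):.1f} $/MWh")
```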

  6. Cellular barcoding tool for clonal analysis in the hematopoietic system

    NARCIS (Netherlands)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J.; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J. C.; de Haan, Gerald; Bystrykh, Leonid V.

    2010-01-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blott

  7. Design and Analysis Tools for Deployable Solar Array Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  8. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  9. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
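
    Poiseuille flow is a useful benchmark precisely because it has a closed-form solution. The sketch below evaluates the analytic plane Poiseuille profile (with illustrative parameters) against which a numerical scheme's output can be checked:

```python
# The classic benchmarks mentioned above have closed-form solutions; this
# sketch evaluates the analytic plane Poiseuille profile that a numerical
# scheme's output would be compared against. Parameter values are illustrative.
import numpy as np

def poiseuille_velocity(y, dpdx, mu, h):
    """Analytic velocity for pressure-driven flow between parallel plates at
    y = 0 and y = h: u(y) = (-dp/dx) * y * (h - y) / (2 * mu)."""
    return (-dpdx) * y * (h - y) / (2.0 * mu)

h, mu, dpdx = 0.01, 1.0e-3, -100.0   # channel height (m), viscosity, gradient
y = np.linspace(0.0, h, 11)
u = poiseuille_velocity(y, dpdx, mu, h)
print(f"max velocity (channel center): {u.max():.3f} m/s")
# A numerical solution would be verified by its maximum error against u(y).
```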

  10. ProbFAST: Probabilistic Functional Analysis System Tool

    Directory of Open Access Journals (Sweden)

    Oliveira Thiago YK

    2010-03-01

    Background: The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression initially designed for paired analysis. Results: We propose a web platform named ProbFAST for MDE analysis, which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives better performance when compared to other approaches, and when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions: ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off in relation to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  11. Child Language Data Exchange System Tools for Clinical Analysis.

    Science.gov (United States)

    MacWhinney, Brian; Fromm, Davida

    2016-05-01

    The Child Language Data Exchange System Project has developed methods for analyzing many aspects of child language development, including grammar, lexicon, discourse, gesture, phonology, and fluency. This article will describe the methods available for each of these six fields, and how they can be used for assessment in the clinical setting.

  12. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  13. Transient analysis of power systems solution techniques, tools and applications

    CERN Document Server

    Martinez-Velasco, J

    2014-01-01

    A comprehensive introduction and up-to-date reference to SiC power semiconductor devices covering topics from material properties to applications. Based on a number of breakthroughs in SiC material science and fabrication technology in the 1980s and 1990s, the first SiC Schottky barrier diodes (SBDs) were released as commercial products in 2001. The SiC SBD market has grown significantly since that time, and SBDs are now used in a variety of power systems, particularly switch-mode power supplies and motor controls. SiC power MOSFETs entered commercial production in 2011, providing rugged, hig

  14. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools......, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one...... procedures has been developed to retrieve the data/information stored in the knowledge base....

  15. Modeling and Analysis Tools for Linear and Nonlinear Mechanical Systems Subjected to Extreme Impulsive Loading

    Science.gov (United States)

    2015-03-23

    Modeling and Analysis Tools for Linear and Nonlinear Mechanical Systems Subjected to Extreme Impulsive Loading, AFOSR Grant FA9550-11-1-0108, A. J. Dick, Department of Mechanical Engineering...For linear systems, the convolution integral of a system's impulse response and the applied force results in the response of the system. This
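
    The convolution relation quoted in the fragment above is simple to demonstrate numerically. A minimal sketch follows; the oscillator and half-sine pulse are invented examples, not from the report:

```python
# For a linear system, the response equals the impulse response convolved with
# the applied force (Duhamel's integral), evaluated here in discrete time.
import numpy as np

dt = 1e-3
t = np.arange(0, 1.0, dt)

# Impulse response of a damped single-degree-of-freedom oscillator.
wn, zeta = 2 * np.pi * 10, 0.05          # natural frequency (rad/s), damping
wd = wn * np.sqrt(1 - zeta**2)
h = np.exp(-zeta * wn * t) * np.sin(wd * t) / wd

# Short impulsive force: a half-sine pulse over the first 5 ms.
f = np.zeros_like(t)
f[:5] = np.sin(np.pi * np.arange(5) / 5)

# Discrete convolution approximates the convolution integral.
x = np.convolve(f, h)[:len(t)] * dt
print(f"peak response: {x.max():.2e}")
```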

  16. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already available in the public literature, as well as from NIST and other highly reputable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.

  17. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  18. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  19. LCA of waste management systems: Development of tools for modeling and uncertainty analysis

    DEFF Research Database (Denmark)

    Clavreul, Julie

    to be modelled rather than monitored as in classical LCA (e.g. landfilling or the application of processed waste on agricultural land). Therefore LCA-tools are needed which specifically address these issues and enable practitioners to model their systems properly. In this thesis several pieces of work...... are presented. First, a review was carried out on all LCA studies of waste management systems published before mid-2012. This provided a global overview of the technologies and waste fractions which have attracted focus within LCA while enabling an analysis of methodological tendencies, the use of tools...... and databases and the application of uncertainty analysis methods. The major outcome of this thesis was the development of a new LCA model, called EASETECH, building on the experience with previous LCA-tools, in particular the EASEWASTE model. Before the actual implementation phase, a design phase involved

  20. On sustainability assessment of technical systems. Experience from systems analysis with the ORWARE and Ecoeffect tools

    Energy Technology Data Exchange (ETDEWEB)

    Assefa, Getachew [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Chemical Engineering

    2006-06-15

    Engineering research and development work is undergoing a reorientation from focusing on specific parts of different systems to a broader systems-level perspective, albeit at a slower pace. This reorientation should be further developed and enhanced with the aim of organizing and structuring our technical systems to meet sustainability requirements in the face of global ecological threats with far-reaching social and economic implications, which can no longer be captured using conventional research approaches. Until a list of universally acceptable, clear, and measurable indicators of sustainable development is developed, the work with sustainability metrics should continue to evolve as a relative measure of the ecological, economic, and social performance of human activities in general, and technical systems in particular. This work can be done by comparing the relative performance of alternative technologies providing the same well-defined function or service, or by characterizing technologies that enjoy different levels of societal priority using relevant performance indicators. In both cases, concepts and methods of industrial ecology play a vital role. This thesis is about the development and application of a systematic approach for the assessment of the performance of technical systems from the perspective of systems analysis, sustainability, sustainability assessment, and industrial ecology. The systematic approach developed and characterized in this thesis advocates a simultaneous assessment of the ecological, economic, and social dimensions of the performance of technologies, avoiding sub-optimization and problem shifting between dimensions. It gives a holistic picture by taking a life cycle perspective of all important aspects. The systematic assessment of technical systems provides an even-handed assessment resulting in cumulative knowledge. A modular structure of the approach makes it flexible enough in terms of comparing a number of

  1. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  2. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    Science.gov (United States)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is a multidisciplinary activity by nature. SAPE improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.

  3. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    Science.gov (United States)

    Samareh, Jamshid A.

    2009-01-01

    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language-a platform-independent open-source software for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.

  4. Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool

    Science.gov (United States)

    Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark

    2011-01-01

    A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) was developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to follow guidance commands as closely as possible while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with the required design specs. "Tower Clearance" and "Load Relief" designs have been achieved for the liftoff and maximum dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in the frequency-domain Monte Carlo analysis using ASAT.

  5. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments of baseline research and the advancements in design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  6. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072 - "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
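
    As an illustration of the mode quantities OBAT reports, frequency and damping can be estimated from a ringdown using peak spacing and the logarithmic decrement. The sketch below uses a synthetic signal and is not OBAT's algorithm:

```python
# Sketch of one way to estimate an oscillation mode's frequency and damping
# from a measured ringdown, via peak spacing and the logarithmic decrement.
# This illustrates the kind of quantities OBAT reports; it is not OBAT code.
import numpy as np
from scipy.signal import find_peaks

fs = 30.0                                   # typical PMU reporting rate (Hz)
t = np.arange(0, 20, 1 / fs)
f_mode, zeta = 0.7, 0.03                    # synthetic inter-area mode
y = np.exp(-zeta * 2 * np.pi * f_mode * t) * np.cos(2 * np.pi * f_mode * t)

peaks, _ = find_peaks(y)
freq_est = 1.0 / np.diff(t[peaks]).mean()   # mode frequency from peak spacing

# Logarithmic decrement between successive peaks gives the damping ratio.
delta = np.mean(np.log(y[peaks][:-1] / y[peaks][1:]))
zeta_est = delta / np.sqrt(4 * np.pi**2 + delta**2)
print(f"frequency ~ {freq_est:.2f} Hz, damping ratio ~ {zeta_est:.3f}")
```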

  7. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design...... management techniques and a vast array of computer aided techniques are applied during design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying

  8. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
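
    The two design artifacts TTECTrA produces, a power-management controller and a transient limiter, can be caricatured in a few lines. The sketch below is a conceptual stand-in with an invented first-order engine model and surge-margin surrogate, not the TTECTrA implementation:

```python
# Conceptual sketch of a PI power-management controller regulating thrust,
# plus a limiter that overrides the fuel-flow command when a surrogate surge
# margin gets too low. The plant model and all gains are invented here.

def simulate(setpoint=1.0, kp=0.8, ki=2.0, dt=0.01, steps=500):
    thrust, integ = 0.0, 0.0
    for _ in range(steps):
        err = setpoint - thrust
        integ += err * dt
        wf_cmd = kp * err + ki * integ            # PI fuel-flow command

        # Transient limiter: cap fuel flow if surge margin falls below 10%.
        surge_margin = 0.5 - 0.35 * wf_cmd        # crude surrogate model
        if surge_margin < 0.10:
            wf_cmd = (0.5 - 0.10) / 0.35
            integ -= err * dt                     # anti-windup: undo integration

        thrust += dt * (wf_cmd - thrust) / 0.5    # first-order engine response
    return thrust

print(f"final thrust: {simulate():.3f}")
```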

  9. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one o

  10. The integrated microbial genomes (IMG) system in 2007: data content and analysis tool extensions

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M.; Szeto, Ernest; Palaniappan, Krishna; Grechkin, Yuri; Chu, Ken; Chen, I-Min A.; Dubchak, Inna; Anderson, Iain; Lykidis, Athanasios; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2007-08-01

    The Integrated Microbial Genomes (IMG) system is a data management, analysis and annotation platform for all publicly available genomes. IMG contains both draft and complete JGI microbial genomes integrated with all other publicly available genomes from all three domains of life, together with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and annotating genomes, genes and functions, individually or in a comparative context. Since its first release in 2005, IMG's data content and analytical capabilities have been constantly expanded through quarterly releases. IMG is provided by the DOE-Joint Genome Institute (JGI) and is available from http://img.jgi.doe.gov.

  11. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-27

    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conferences foster a stronger climate and weather data science community and facilitate a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  12. Instantaneous Purified Orbit: A New Tool for Analysis of Nonstationary Vibration of Rotor System

    Directory of Open Access Journals (Sweden)

    Shi Dongfeng

    2001-01-01

    In some circumstances, vibration signals of large rotating machinery possess time-varying characteristics to some extent. Traditional diagnosis methods, such as the FFT spectrum and the orbit diagram, are confronted with a huge challenge in dealing with this problem. This work studies four intrinsic drawbacks of conventional vibration signal processing methods and presents the instantaneous purified orbit (IPO), based on the improved Fourier spectrum (IFS), to analyze nonstationary vibration. By integrating the benefits of the short period Fourier transform (SPFT) and the regular holospectrum, this method can intuitively reflect the vibration characteristics of a rotor system by means of parameter analysis of the corresponding frequency ellipses. Practical examples, such as transient vibration in run-up stages and the bistable condition of a rotor, show that IPO is a powerful tool for the diagnosis and analysis of the vibration behavior of rotor systems.

  13. Dynamic simulation tools for the analysis and optimization of novel collection, filtration and sample preparation systems

    Energy Technology Data Exchange (ETDEWEB)

    Clague, D; Weisgraber, T; Rockway, J; McBride, K

    2006-02-12

    The focus of the research effort described here is to develop novel simulation tools to address design and optimization needs in the general class of problems that involve species and fluid (liquid and gas phase) transport through sieving media. The work was primarily motivated by the heightened attention on Chem/Bio early detection systems, which, among other needs, require high-efficiency filtration, collection, and sample preparation systems. Hence, the goal was to develop the computational analysis tools necessary to optimize these critical operations. This new capability is designed to characterize system efficiencies based on the details of the microstructure and environmental effects. To accomplish this, new lattice Boltzmann simulation capabilities were developed to include detailed microstructure descriptions, the relevant surface forces that mediate species capture and release, and temperature effects for both liquid and gas phase systems. While developing the capability, actual demonstration and model systems (and subsystems) of national and programmatic interest were targeted to demonstrate the capability. As a result, where possible, experimental verification of the computational capability was performed, either directly using Digital Particle Image Velocimetry or against published results.
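
    The record mentions lattice Boltzmann simulation of fluid transport through sieving media. As a minimal, hypothetical illustration of that numerical scheme (not the authors' code), the Python sketch below advances a D2Q9 lattice Boltzmann fluid with BGK collisions past a solid obstacle; the grid size, relaxation time, and inflow speed are arbitrary assumptions.

```python
# Minimal D2Q9 lattice Boltzmann (BGK) flow past a solid block; illustrative
# sketch only -- grid size, relaxation time, and inflow speed are arbitrary.
import numpy as np

nx, ny, tau, u0 = 200, 80, 0.6, 0.05
c = np.array([(0,0),(1,0),(0,1),(-1,0),(0,-1),(1,1),(-1,1),(-1,-1),(1,-1)])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)      # D2Q9 lattice weights
opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]             # opposite-direction indices

solid = np.zeros((nx, ny), dtype=bool)
solid[40:50, 30:50] = True                    # rectangular obstacle ("sieve" element)

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution for all 9 lattice directions."""
    usq = ux**2 + uy**2
    feq = np.empty((9,) + np.shape(rho))
    for k in range(9):
        cu = c[k, 0]*ux + c[k, 1]*uy
        feq[k] = w[k]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    return feq

rho, ux, uy = np.ones((nx, ny)), u0*np.ones((nx, ny)), np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)
for step in range(1000):
    for k in range(9):                        # streaming step
        f[k] = np.roll(np.roll(f[k], c[k, 0], axis=0), c[k, 1], axis=1)
    bounced = f[:, solid].copy()              # bounce-back at solid nodes
    for k in range(9):
        f[k, solid] = bounced[opp[k]]
    rho = f.sum(axis=0)                       # macroscopic density and velocity
    ux = np.tensordot(c[:, 0], f, axes=1) / rho
    uy = np.tensordot(c[:, 1], f, axes=1) / rho
    f -= (f - equilibrium(rho, ux, uy)) / tau # BGK collision
    f[:, 0, :] = equilibrium(np.ones(ny), u0*np.ones(ny), np.zeros(ny))  # inflow

print("mean speed:", float(np.hypot(ux, uy)[~solid].mean()))
```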

  14. Tav4SB: integrating tools for analysis of kinetic models of biological systems

    Directory of Open Access Journals (Sweden)

    Rybiński Mikołaj

    2012-04-01

    Full Text Available Abstract Background Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. Results The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow users to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by the project, we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project’s Web page: http://bioputer.mimuw.edu.pl/tav4sb/. Conclusions The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as accessibility and usability of remote services.

  15. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it covers only company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted five major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool.
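
    The analysis pipeline described (principal component analysis of dietary data, multiple linear regression on BMI, leave-one-out cross-validation) can be sketched generically. In the hedged example below, the synthetic data and feature dimensions are illustrative assumptions, not the study's dataset.

```python
# Illustrative PCA + linear-regression + leave-one-out pipeline on synthetic
# data; the real study used cafeteria purchase records, not reproduced here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
X_diet = rng.normal(size=(899, 20))               # stand-in dietary features
bmi = 22 + X_diet[:, :3].sum(axis=1) + rng.normal(scale=1.5, size=899)

pca = PCA(n_components=5)                         # extract 5 "dietary patterns"
patterns = pca.fit_transform(X_diet)

model = LinearRegression()
pred = cross_val_predict(model, patterns, bmi, cv=LeaveOneOut())
accuracy = np.mean((pred >= 23) == (bmi >= 23))   # "would-be obese" hit rate
print(f"LOO accuracy for BMI >= 23: {accuracy:.3f}")
```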

  16. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    Directory of Open Access Journals (Sweden)

    Mari-Carmen Mochón

    2016-03-01

    Full Text Available After the financial crisis that began in 2008, the international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information needed to identify the potential risks hidden in the market. Big Data solutions currently applied to Social Network Analysis (SNA) can be successfully applied to systemic risk supervision. This case study proposes how the relations established between financial market participants could be analyzed in order to identify propagation risk and market behavior, without the need for expensive and demanding technical architectures.
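
    As a hedged illustration of the network approach described, the sketch below builds a directed exposure graph between hypothetical market participants and ranks them by centrality to flag potential propagation hubs; the participants and exposure values are invented for the example.

```python
# Toy trade-exposure network: nodes are market participants, edge weights are
# notional exposures. Centrality highlights potential propagation hubs.
import networkx as nx

exposures = [("BankA", "BankB", 120.0), ("BankB", "FundC", 80.0),
             ("FundC", "BankA", 60.0), ("BankB", "InsurerD", 200.0),
             ("InsurerD", "FundC", 40.0)]
g = nx.DiGraph()
g.add_weighted_edges_from(exposures)

# PageRank on the weighted graph as a simple, robust importance measure.
rank = nx.pagerank(g, weight="weight")
for node, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```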

  17. Systems Analysis – a new paradigm and decision support tools for the water framework directive

    Directory of Open Access Journals (Sweden)

    M. Bruen

    2008-05-01

    Full Text Available In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning, and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytic Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness.
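
    For readers unfamiliar with AHP, the following sketch derives priority weights from a pairwise comparison matrix via its principal eigenvector, which is the standard AHP procedure; the three example criteria and the comparison values are invented.

```python
# AHP priority weights from a pairwise comparison matrix (Saaty scale).
# Example matrix compares three hypothetical criteria: cost, ecology, amenity.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                   # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # normalised priority vector

# Consistency ratio: CR < 0.1 is conventionally acceptable (RI = 0.58 for n = 3).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))
```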

  18. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures) merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  19. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities.

  20. The Axial Nonlinear Vibration Analysis of Ball-screw about Machine Tool Feeding System

    Institute of Scientific and Technical Information of China (English)

    ZENG Hao-ran; LIU Nian-cong; YANG Jia-rui; CHEN Jian-long; GENG Wei-tao

    2016-01-01

    The forced state of the ball-screw of a machine tool feeding system is analyzed. The ball-screw is simplified as a Timoshenko beam and the differential equation of motion for the ball-screw is built. To obtain the axial vibration equation, the differential equation of motion is simplified using the assumed mode method. The axial vibration equation takes the form of a Duffing equation and is characteristically nonlinear. The Duffing equation is simulated numerically in MATLAB/Simulink. The effects of screw length, exciting force and damping coefficient are studied, and the axial vibration phase track diagram and Poincare section are obtained. The stability and period of the axial vibration are analyzed. The limit cycle of the phase track diagram is closed. The axial vibration has two center-type singularity distributions on both sides of the origin. The singularity attracts the vibration towards a stable state, and the Poincare section shows that the axial vibration exhibits chaotic motion as well as quasi-periodic or periodic motion. The singularity position changes with the vibration system parameters, while the distribution does not change. The period of the vibration increases with increasing frequency and damping coefficient. Tests of the feeding system confirm that chaotic motion exists in the ball-screw axial vibration. This paper provides a theoretical basis for the dynamic characteristic analysis of the machine feeding system ball-screw and the optimization of structural parameters.
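
    The Duffing-type axial vibration described can be reproduced generically. The sketch below integrates a standard forced Duffing oscillator with SciPy and samples a Poincare section once per forcing period; the coefficient values are illustrative assumptions, not the paper's identified parameters.

```python
# Forced Duffing oscillator x'' + d*x' + a*x + b*x**3 = F*cos(w*t),
# integrated with SciPy; Poincare section sampled once per forcing period.
import numpy as np
from scipy.integrate import solve_ivp

d, a, b, F, w = 0.2, -1.0, 1.0, 0.3, 1.2       # illustrative coefficients

def duffing(t, y):
    x, v = y
    return [v, -d*v - a*x - b*x**3 + F*np.cos(w*t)]

T = 2*np.pi/w
t_eval = np.arange(0, 500*T, T)                # one sample per forcing period
sol = solve_ivp(duffing, (0, 500*T), [0.1, 0.0], t_eval=t_eval,
                rtol=1e-8, atol=1e-10)
poincare = sol.y[:, 100:]                      # discard transient periods
print(poincare.T[:5])                          # (x, v) points of the Poincare map
```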

  1. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    Science.gov (United States)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994 the Data Centre of the Spanish Oceanographic Institute has been developing systems for archiving and quality control of oceanographic data. The work started in the framework of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean Data Centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now covers not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. New powerful routines for analysis and graphic visualization were added. Data presented originally in ASCII format were recently organized in an open-source MySQL database. Nowadays the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS Format and Quality Control - QCDAMAR: Quality Control of Marine Data, the main set of tools for working with data presented as text files. It includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values, ...) and input/output filters. - QCMareas: a set of procedures for the quality control of tide gauge data according to the standard international Sea Level Observing System. These procedures include checking for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: a relational database (MySQL) designed to

  2. Analysis tools for simulation of hybrid systems; Herramientas de analisis para simulacion de sistemas hibridos

    Energy Technology Data Exchange (ETDEWEB)

    Guillen S, Omar; Mejia N, Fortino [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2005-07-01

    In order to facilitate and simplify the development and analysis of a hybrid system with respect to its design, construction, operation and maintenance, it is best to simulate the system in software, which yields a significant reduction in investment costs. Given the mix of electrical generation technologies involved in a hybrid system, it is very important to have a tool integrated with specialized calculation packages (software) that allows simulation of the operational behaviour of these systems. In addition, one must not fail to consider the operating characteristics, the facilities offered to the user, the clarity of the results obtained, and the possibility of their validation against prototypes deployed in the field. Equally, it is necessary to identify the tasks involved in relation to the installation site of this electrification technology. At present, hybrid systems technology is still at a development stage internationally, and important limitations exist in the availability of methodologies and engineering tools for the optimum design of these systems. This paper intends to contribute to the advance of the technology and to provide in-house tools to solve the series of problems described. The article describes the activities that have the greatest impact on the design and development of hybrid systems, as well as the identification of variables, basic characteristics and forms of validation of tools in the integration of a methodology for the simulation of these systems, facilitating their design and development.
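
    As a hedged, minimal illustration of what such a simulation involves (not the IIE tool itself), the sketch below runs an hourly energy balance for a hypothetical PV-wind-battery-backup hybrid system; all capacities, profiles and limits are invented.

```python
# Hourly energy balance for a toy PV + wind + battery + backup hybrid system.
# Profiles and capacities are invented; a real tool would use measured data.
import numpy as np

hours = np.arange(24)
pv = 5.0 * np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None)   # kW, daylight bell
wind = 2.0 + 1.5 * np.random.default_rng(1).random(24)          # kW, noisy
load = 4.0 + 2.0 * (hours > 17)                                 # kW, evening peak

soc, cap, batt_kw = 5.0, 10.0, 3.0   # battery state/capacity (kWh), power limit (kW)
backup = 0.0
for h in hours:
    net = pv[h] + wind[h] - load[h]            # surplus (+) or deficit (-)
    flow = np.clip(net, -batt_kw, batt_kw)     # respect battery power limit
    flow = np.clip(flow, -soc, cap - soc)      # respect state of charge
    soc += flow
    deficit = -(net - flow)
    if deficit > 0:
        backup += deficit                      # energy the backup must supply
print(f"daily backup energy: {backup:.1f} kWh, final SOC: {soc:.1f} kWh")
```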

  3. Experimental Analysis of Browser based Novel Anti-Phishing System Tool at Educational Level

    Directory of Open Access Journals (Sweden)

    Rajendra Gupta

    2016-02-01

    Full Text Available In a phishing attack, users send their confidential information to mimic websites and face financial harm, so users should be informed immediately about the nature of the website they are visiting. According to the Third Quarter Phishing Activity Trends Report, 55,282 new phishing websites were detected in the month of July 2014. To address the phishing problem, a browser-based add-on system may be one of the best solutions for making the user aware of the type of website. In this paper, a novel browser-based add-on system is proposed and its performance is compared with existing anti-phishing tools. The proposed anti-phishing tool, 'ePhish', is compared with existing browser-based anti-phishing toolbars. All the anti-phishing tools were installed on computer systems at an autonomous college to check their performance. The results obtained show that if the task is divided across a group of systems, better results can be achieved. For different phishing features, the add-on system shows around 97 percent success under different case conditions. The present study would be very helpful in countering phishing attacks, and the proposed system is able to protect users from them. Since the system tool is capable of handling and managing phishing website details, it is also helpful in identifying the category of a website.
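
    Browser add-ons of this kind typically score URLs against lexical features. The sketch below implements a few widely used heuristic checks (IP-address host, '@' in the URL, excessive length, deep subdomain nesting); the feature set and flagging threshold are generic assumptions, not the ePhish rule set.

```python
# Simple lexical URL checks of the kind used by heuristic anti-phishing tools.
# The feature weights and flagging threshold are illustrative, not ePhish's.
import re
from urllib.parse import urlparse

def phishing_score(url: str) -> int:
    host = urlparse(url).hostname or ""
    score = 0
    score += 2 * bool(re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host))  # raw IP host
    score += "@" in url                     # credentials trick before real host
    score += len(url) > 75                  # unusually long URL
    score += host.count(".") > 3            # deep subdomain nesting
    score += "-" in host                    # hyphenated look-alike domain
    return score

for u in ["http://192.168.10.5/login", "https://example.com",
          "http://secure-paypal.com.verify.accounts.example.tk/a@b"]:
    print(u, "->", "suspicious" if phishing_score(u) >= 2 else "ok")
```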

  4. The analysis of the functionality of modern systems, methods and scheduling tools

    Directory of Open Access Journals (Sweden)

    Abramov Ivan

    2016-01-01

    Full Text Available Calendar planning is a key tool for efficient management in many industries: power, oil & gas, metallurgy, and construction. As a result of the growing complexity of projects and the need to improve their efficiency, a large number of software tools for high-quality calendar planning have appeared. Construction companies face the challenge of optimum selection of such tools (programs for distributing limited resources over time). The article provides an analysis of the main software packages and the capabilities they offer for improving project implementation efficiency.
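
    At the core of most calendar-planning packages is a critical path computation. The sketch below performs the standard forward and backward pass over a small invented activity network to find each task's float and the critical path.

```python
# Critical path method (CPM): forward and backward pass over a toy activity
# network. Task durations and dependencies are invented for illustration.
tasks = {  # name: (duration_days, predecessors); listed in topological order
    "excavate": (3, []),
    "foundation": (5, ["excavate"]),
    "frame": (7, ["foundation"]),
    "roof": (4, ["frame"]),
    "wiring": (6, ["frame"]),
    "finish": (2, ["roof", "wiring"]),
}

order = list(tasks)
early = {}                              # earliest start of each task
for t in order:
    dur, preds = tasks[t]
    early[t] = max((early[p] + tasks[p][0] for p in preds), default=0)
end = max(early[t] + tasks[t][0] for t in order)   # project duration

late = {}                               # latest start without delaying the project
for t in reversed(order):
    succs = [s for s in order if t in tasks[s][1]]
    finish = min((late[s] for s in succs), default=end)
    late[t] = finish - tasks[t][0]

critical = [t for t in order if early[t] == late[t]]   # zero-float tasks
print(f"duration: {end} days; critical path: {' -> '.join(critical)}")
```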

  5. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.

    2016-01-01

    Economic analysis, allied to process systems engineering tools, can provide useful insights about process techno-economic feasibility. More interestingly, rather than being used to evaluate specific process conditions, this techno-economic analysis can be turned upside down to achieve target values...
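
    The "upside down" use of techno-economic analysis amounts to solving for the process variable that attains a given economic target. As a hedged sketch of that idea, the code below uses root finding to ask what conversion yield makes a toy process break even; the cash-flow model is entirely invented.

```python
# Retro-style techno-economic sketch: instead of evaluating NPV at a given
# yield, solve for the yield that makes the toy process break even (NPV = 0).
from scipy.optimize import brentq

def npv(yield_frac, price=600.0, feed_cost=150.0, capex_annualised=80.0):
    """Invented annual cash model, $/tonne of feed: revenue scales with yield."""
    revenue = price * yield_frac
    return revenue - feed_cost - capex_annualised

target_yield = brentq(npv, 0.01, 1.0)   # root of NPV between 1% and 100% yield
print(f"break-even yield: {target_yield:.1%}")
```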

  6. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
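
    A common way to address component-property variability in circuit simulation is Monte Carlo sampling over tolerances. The sketch below is a generic illustration, not the Sandia toolset: it propagates 5% resistor and 10% capacitor tolerances through the cutoff frequency of an RC low-pass filter.

```python
# Monte Carlo variability sketch: propagate component tolerances through a
# simple model (RC low-pass cutoff f_c = 1 / (2*pi*R*C)). Generic illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
R = rng.normal(1e3, 0.05 * 1e3 / 3, n)        # 1 kOhm, 5% tolerance as 3-sigma
C = rng.normal(100e-9, 0.10 * 100e-9 / 3, n)  # 100 nF, 10% tolerance as 3-sigma

fc = 1.0 / (2 * np.pi * R * C)                # cutoff frequency of each sample
lo, hi = np.percentile(fc, [0.5, 99.5])       # 99% design-margin interval
print(f"nominal {1/(2*np.pi*1e3*100e-9):.0f} Hz, "
      f"99% interval [{lo:.0f}, {hi:.0f}] Hz")
```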

  7. A Conceptual Wing Flutter Analysis Tool for Systems Analysis and Parametric Design Study

    Science.gov (United States)

    Mukhopadhyay, Vivek

    2003-01-01

    An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate flutter instability boundaries of a typical wing when detailed structural and aerodynamic data are not available. Effects of change in key flutter parameters can also be estimated in order to guide the conceptual design. This user-friendly software was developed using MathCad and Matlab codes. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely the Regier number and the Flutter number, with normalization factors based on wing torsion stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch-inertia radius of gyration. These parametric plots were compiled in a Chance-Vought Corporation report from a database of past experiments and wind tunnel test results. An example was presented for conceptual flutter analysis of the outer wing of a Blended-Wing-Body aircraft.

  8. Mathematical and computer tools of discrete dynamic modeling and analysis of complex systems in control loop

    CERN Document Server

    Bagdasaryan, Armen

    2008-01-01

    We present a method for discrete modeling and analysis of the multilevel dynamics of complex large-scale hierarchical dynamic systems subject to an external dynamic control mechanism. An architectural model of an information system supporting simulation and analysis of dynamic processes and development scenarios (strategies) of complex large-scale hierarchical systems is also proposed.

  9. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  11. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel;

    2014-01-01

    As governments, citizens and organizations have moved online there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract and analyze web data in the process of investigating substantive questions, and to collect, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks...

  12. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of each talk and its key figures, together with the chairmen's summaries, is presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  13. Energy-Sustainable Framework and Performance Analysis of Power Scheme for Operating Systems: A Tool

    Directory of Open Access Journals (Sweden)

    G. Singh

    2012-12-01

    Full Text Available Recently, Information and Communications Technology (ICT) devices have become more user-friendly, which has raised the problem of power dissipation across the globe, and computer systems are among the contributors. This emerging issue of power dissipation has imposed a very significant constraint on system and software design. The concept of 'green computing' is gaining popularity and is considered one of the most promising technologies by designers in the Information Technology (IT) industry, as it demonstrates an environmentally responsible way to reduce power consumption and maximize energy efficiency. In this paper, we propose an energy-sustainable framework of power schemes for operating systems to reduce the power consumed by computer systems, and present a Green Power tool (GP tool). This tool is designed using JAVA technology and requires minimal configuration to make decisions for reducing power consumption. The proposed Swift mode algorithm allows users to input a working time of their choice; after this time has elapsed, the algorithm starts detecting human activity on the computer system. We also compared the Swift mode algorithm with existing power schemes in the operating system and found that it provides up to 66% power savings. Finally, we profiled the proposed framework to analyze memory and Central Processing Unit (CPU) performance, which demonstrated that there is no memory leakage or CPU degradation, and that the framework's behavior remains constant under various memory and CPU overhead scenarios. The proposed framework requires 3–7 MB of memory space during its execution.

  14. Motion analysis systems as optimization training tools in combat sports and martial arts

    Directory of Open Access Journals (Sweden)

    Ewa Polak

    2016-01-01

    Full Text Available Introduction: Over the past years, a few review papers about the possibilities of using motion analysis systems in sport have been published, but there are no articles that discuss this problem in the field of combat sports and martial arts. Aim: This study presents the diversity of contemporary motion analysis systems, both those that are used in scientific research and those that can be applied in the daily work of coaches and athletes in combat sports and martial arts. An additional aim is to indicate example applications in scientific research and the range of applications in optimizing the training process. The study presents a brief description of each type of system currently used in sport, specific examples of systems, and the main advantages and disadvantages of using them. The presentation and discussion take place in the following sections: motion analysis utility for combat sports and martial arts, systems using digital video, and systems using markers, sensors or transmitters. Conclusions: Not all types of motion analysis systems used in sport are suitable for combat sports and martial arts. Scientific studies conducted so far have shown the usefulness of video-based, optical and electromechanical systems. The use of research results obtained with complex motion analysis systems, or with simple systems offering local application and immediate visualization, is important for the preparation of training and its optimization. It may lead to technical and tactical improvement in athletes as well as the prevention of injuries in combat sports and martial arts.

  15. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS Computing Management Board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  16. Bifurcation Tools for Flight Dynamics Analysis and Control System Design Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Modern bifurcation analysis methods have been proposed for investigating flight dynamics and control system design in highly nonlinear regimes and also for the...

  18. General Mission Analysis Tool (GMAT) Mathematical Specifications

    Science.gov (United States)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  19. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  20. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  1. MOSES – A modelling tool for the analysis of scenarios of the European electricity supply system

    Directory of Open Access Journals (Sweden)

    Agert C.

    2012-10-01

    Full Text Available Recent studies have shown that a transition of the current power supply system in Europe to a system almost entirely based on fluctuating Renewable Energy Sources (RES) by mid-century is possible. However, most of these scenarios require a significant amount of back-up power capacity to ensure the security of electricity supply. This would imply high additional investments and operating costs. Hence, alternative options should be investigated first. Here we present a first outlook of our simulation model MOSES, which will be able to analyse different target states of the European electricity system in 2050. In this model, long-term meteorological data series are used to optimise the capacity mix of RES in Europe. One of the main elements of our tool is a simplified electricity network. In addition, alternative options for reducing the need for additional back-up power, such as the expansion of the transmission grid, the use of demand-side management and/or the installation of over-capacities, will be implemented. The results will be used to provide scientifically grounded recommendations to policy makers for a reliable energy supply system in Europe based on Renewable Energy Sources.
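
    As a hedged sketch of the kind of optimisation described: given hourly wind and solar availability series, search over capacity mixes for the one that minimises the back-up energy needed to cover demand. The time series below are synthetic stand-ins for the long-term meteorological data the model uses.

```python
# Grid search over wind/solar capacity mixes to minimise back-up energy.
# Synthetic availability and demand series stand in for meteorological data.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(8760)                                   # one year, hourly
solar_cf = np.clip(np.sin((t % 24 - 6) / 12 * np.pi), 0, None)  # capacity factor
wind_cf = np.clip(rng.normal(0.35, 0.2, t.size), 0, 1)
demand = 70 + 10 * np.sin(t / 8760 * 2 * np.pi)       # GW, mild seasonal cycle

best = None
for solar_gw in range(0, 401, 20):                    # candidate capacities, GW
    for wind_gw in range(0, 401, 20):
        if solar_gw + wind_gw == 0:
            continue
        res = solar_gw * solar_cf + wind_gw * wind_cf
        backup = np.clip(demand - res, 0, None).sum() # GWh of unmet demand
        if best is None or backup < best[0]:
            best = (backup, solar_gw, wind_gw)

print(f"least back-up: {best[0]:.0f} GWh with {best[1]} GW solar, {best[2]} GW wind")
```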

  2. ANALYSIS OF ELECTROMAGNETIC PROCESS IN COMBINED INDUCTOR SYSTEMS, AS TOOLS FOR STRAIGHTENING OF MODERN CAR

    Directory of Open Access Journals (Sweden)

    D.O. Voloncevich

    2015-08-01

    Full Text Available In combined inductor systems the field exists only above the workpiece and is excited only by a low-frequency circular magnetic field. In this paper, the electromagnetic processes in tools for magnetic-pulse attraction (combined inductor systems) are analyzed. The investigation is based on numerical estimates using previously obtained analytical relations for the excited fields and forces. Such calculations are necessary for the successful implementation of the straightening of metal panels of car bodies. The distribution of the relative intensity of the magnetic fields at the surfaces of the sheet workpiece in the centre of the working zone is obtained, as is the distribution of the amplitude of the tangential component of the resulting magnetic field intensity at the surface of the sheet metal along the centre of a rectangular coil. A feature of the combined inductor system is the non-uniform distribution of the attraction forces in the working area. The results show that in the working area on the outer surface of the sheet workpiece the magnetic field strength is less than 5% of the field strength of the circular low-frequency source. The calculations demonstrate the effectiveness of the proposed magnetic-pulse straightening tool for car body panels. The actual amplitude of the distributed attraction forces is ~7.7 MPa.

  3. Material Flow Analysis as a Tool to improve Waste Management Systems: The Case of Austria.

    Science.gov (United States)

    Allesch, Astrid; Brunner, Paul H

    2017-01-03

    This paper demonstrates the power of material flow analysis (MFA) for designing waste management (WM) systems and for supporting decisions with regard to given environmental and resource goals. Based on a comprehensive case study of a nationwide WM system, the advantages and drawbacks of a mass balance approach are discussed. Using the software STAN, a material flow system comprising all relevant inputs, stocks and outputs of wastes, products, residues, and emissions is established and quantified. Material balances on the level of goods and selected substances (C, Cd, Cr, Cu, Fe, Hg, N, Ni, P, Pb, Zn) are developed to characterize this WM system. The MFA results serve well as a basis for further assessments. Based on given goals, stakeholders engaged in this study selected the following seven criteria for evaluating their WM system: (i) waste input into the system, (ii) export of waste, (iii) gaseous emissions from waste treatment plants, (iv) long-term gaseous and liquid emissions from landfills, (v) waste being recycled, (vi) waste for energy recovery, and (vii) total waste landfilled. By scenario analysis, the strengths and weaknesses of different measures were identified. The results reveal the benefits of a mass balance approach, owing to redundancy, data consistency, and transparency, for optimization, design, and decision making in WM.
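
    The mass-balance principle behind MFA is simple to state in code: at every process node, input must equal output plus exports plus stock change. The toy process network below, with invented flows in kilotonnes per year, checks that each node balances.

```python
# Toy material flow analysis: at every process node, input must equal output
# plus exports plus stock change. Flow values (kt/yr) are invented.
flows = {  # (source, destination): mass
    ("collection", "sorting"): 500.0,
    ("sorting", "recycling"): 200.0,
    ("sorting", "incineration"): 250.0,
    ("sorting", "landfill"): 50.0,
    ("incineration", "landfill"): 60.0,      # bottom ash to landfill
}
imports = {"collection": 500.0}              # waste entering the system
exports = {"recycling": 200.0,               # secondary raw materials
           "incineration": 190.0}            # flue gas and energy carriers
stock_change = {"landfill": 110.0}           # landfill stock grows

nodes = {n for pair in flows for n in pair}
for node in sorted(nodes):
    inflow = imports.get(node, 0.0) + sum(m for (a, b), m in flows.items() if b == node)
    outflow = sum(m for (a, b), m in flows.items() if a == node)
    residual = inflow - outflow - exports.get(node, 0.0) - stock_change.get(node, 0.0)
    print(f"{node:12s} in={inflow:6.1f} out={outflow:6.1f} residual={residual:5.1f}")
```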

  4. Tools for Distributed Systems Monitoring

    Directory of Open Access Journals (Sweden)

    Kufel Łukasz

    2016-11-01

    Full Text Available The management of distributed systems infrastructure requires a dedicated set of tools. Tools that help visualize the current operational state of all systems and send notifications when failures occur are available within monitoring solutions. This paper provides an overview of monitoring approaches for gathering data from distributed systems and of the major factors to consider when choosing a monitoring solution. Finally, we discuss the tools currently available on the market.

  5. Workload Characterization an Essential Step in Computer Systems Performance Analysis - Methodology and Tools

    Directory of Open Access Journals (Sweden)

    CHEVERESAN, R.T.

    2009-10-01

    Full Text Available Computer system performance analysis is a very complex process in which hardware and software manufacturers invest substantial human and financial resources. Workload characterization represents an essential component of performance analysis. This paper presents a trace-based methodology for the evaluation of software applications. It introduces a new analysis concept designed to significantly ease this process and presents a set of experimental data collected, using the new analysis structure, on a representative set of scientific and commercial applications. Several important conclusions are drawn regarding workload characteristics, classifications and runtime behavior. This type of data is used by computer architects in their efforts to maximize the performance of the hardware platforms these applications are going to execute on.

  6. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  7. Shot Planning and Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  8. The revised NEUROGES-ELAN system: An objective and reliable interdisciplinary analysis tool for nonverbal behavior and gesture.

    Science.gov (United States)

    Lausberg, Hedda; Sloetjes, Han

    2016-09-01

    As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.

  9. State-of-the-art Tools and Techniques for Quantitative Modeling and Analysis of Embedded Systems

    DEFF Research Database (Denmark)

    Bozga, Marius; David, Alexandre; Hartmanns, Arnd;

    2012-01-01

    This paper surveys well-established/recent tools and techniques developed for the design of rigorous embedded systems. We will first survey UPPAAL and MODEST, two tools capable of dealing with both timed and stochastic aspects. Then, we will overview the BIP framework for modular design...

  10. On the use of financial analysis tools for the study of Dst time series in the frame of complex systems

    CERN Document Server

    Potirakis, Stelios M; Balasis, Georgios; Eftaxias, Konstantinos

    2016-01-01

    Technical analysis is considered the oldest, and currently omnipresent, method for financial market analysis, which uses past prices with the aim of short-term forecasting of future prices. In the frame of complex systems, methods used to quantitatively analyze specific dynamic phenomena are often used to analyze phenomena from other disciplines on the grounds that they are governed by similar dynamics. An interesting task is the forecast of a magnetic storm. The hourly Dst is used as a global index for the monitoring of Earth's magnetosphere, which could be either in a quiet (normal) or in a magnetic storm (pathological) state. This work is the first attempt to apply technical analysis tools to Dst time series, aiming at the identification of indications which could be used for the study of the temporal evolution of Earth's magnetosphere state. We focus on the analysis of Dst time series around the occurrence of magnetic storms, discussing the possible use of the resulting information in the frame of multidisciplina...
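
    One of the simplest technical-analysis indicators that could be transplanted to a geomagnetic index is a pair of moving averages whose crossover flags a change of regime. The sketch below applies this to a synthetic Dst-like series; the window lengths and the series itself are illustrative assumptions, not the paper's method.

```python
# Moving-average crossover, a basic technical-analysis indicator, applied to a
# synthetic Dst-like series (nT). Windows and data are illustrative only.
import numpy as np

rng = np.random.default_rng(3)
hours = 500
dst = -20 + 5 * rng.standard_normal(hours)
dst[200:260] -= np.linspace(0, 150, 60)        # synthetic storm main phase
dst[260:400] += np.linspace(-150, 0, 140)      # recovery phase

def sma(x, w):
    """Trailing simple moving average with window w."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

fast, slow = sma(dst, 6), sma(dst, 48)
n = min(fast.size, slow.size)
fast, slow = fast[-n:], slow[-n:]              # align trailing ends
cross = np.where(np.diff(np.sign(fast - slow)) < 0)[0]  # fast drops below slow
print("bearish crossovers (possible storm onsets) at hours:", cross[:5])
```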

  11. Udder Hygiene Analysis tool

    OpenAIRE

    2013-01-01

    In this report, the pilot of UHC is described. The main objective of the pilot is to make farmers more aware of how to improve udder health in dairy herds, which is achieved by changing management aspects related to hygiene. This report firstly provides general information about antibiotics and the processes that influence udder health. Secondly, six subjects related to udder health are described. Thirdly, the tools (checklists and roadmap) are shown, and fourthly, the advice written by UH...

  12. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  13. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    2000-01-01

    The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progress variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid-phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the AIAA 38th Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the AIAA 38th Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time for simulating (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor will also be reported.

  14. Interval Analysis: A New Tool for the Characterization of an Epoxy-Amine/Aluminum System

    Directory of Open Access Journals (Sweden)

    Mickaël Pomes-Hadda

    2015-04-01

    Full Text Available Epoxy-amine/aluminum chemical systems or sandwich structures, with and without interphase formation, are prepared using two different curing cycles and are characterized with dielectric spectroscopy. The sample without interphase formation is obtained when the reaction between epoxy and amine groups is favored, which occurs at high temperature. The interphase formation results from the reaction between the amine group and the aluminum surface at room temperature. Dielectric spectra are fitted using the Set Inversion Via Interval Analysis (SIVIA) algorithm applied to DiElectric spectroscopy (SADE), developed using the methods of interval analysis. A new approach is implemented using a sum of Debye relaxations to optimize and guarantee the fitting. The results achieved show a distribution of relaxation times which, as demonstrated, always occur together. In this study, five Debye relaxations were found which fit the β-relaxation with our model. Finally, we showed that the most intense of our five β-relaxations follows the Arrhenius law.
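
    For reference, the standard forms assumed in such fits are the sum-of-Debye-relaxations model of the complex permittivity and the Arrhenius temperature dependence of a relaxation time. The symbols below follow common dielectric-spectroscopy usage rather than the paper's own notation.

```latex
% Complex permittivity as a sum of N Debye relaxations, and the Arrhenius law
% for the temperature dependence of a relaxation time (standard forms).
\varepsilon^{*}(\omega) = \varepsilon_{\infty}
  + \sum_{k=1}^{N} \frac{\Delta\varepsilon_{k}}{1 + i\omega\tau_{k}},
\qquad
\tau(T) = \tau_{0}\,\exp\!\left(\frac{E_{a}}{k_{B}T}\right)
```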

  15. AN ANALYSIS OF THE ELECTROMAGNETIC PROCESSES IN THE INDUCTOR SYSTEMTOOL OF THE STRAIGHTENING OF CAR BODIES

    Directory of Open Access Journals (Sweden)

    Yu. V. Batygin

    2015-04-01

    Full Text Available Introduction. One of the promising directions of electromagnetic forming (EMF) is the contactless magnetic-pulse straightening of automobile bodies. The efficiency and quality of the straightening depend on the design and operating principle of the straightening tool. Modern EMF techniques use a large number of tools - inductor systems (IS) - in different configurations, with a non-uniform distribution of forces on the treated object, which does not meet the needs of an effective straightening process. There is thus an urgent need to create an IS with high uniformity of the induced field and a high concentration of attracting forces in the working area of the tool. The most effective IS are Inductor Systems with an Attracting Screen (ISAS). One of the most important considerations when choosing a particular ISAS design is the study of the electrodynamic processes, with definition of the excited loads. The nature and course of the electrodynamic processes, in accordance with the design features, determine the effectiveness and efficiency of the ISAS. Therefore, an additional coil for the concentration of attracting forces in the working area should be introduced into the ISAS. Purpose. A numerical analysis of the induced fields and currents in experimental models of an ISAS with an additional coil was made. Methodology. In the idealization of "extremely low" frequencies of the acting fields, calculated dependences were obtained for the density of the induced currents and the distributed attracting force in the ISAS and the external additional coil, using a calculation model in a cylindrical coordinate system. Results. The insertion of an additional coil placed over the auxiliary screen makes it possible to concentrate and increase the amplitude of the attracting forces in the central part of the working area of the inductor system. Practical value. 1. Numerical analysis of fields and currents in experimental models of Induction

  16. Understanding Earthquake Fault Systems Using QuakeSim Analysis and Data Assimilation Tools

    Science.gov (United States)

    Donnellan, Andrea; Parker, Jay; Glasscoe, Margaret; Granat, Robert; Rundle, John; McLeod, Dennis; Al-Ghanmi, Rami; Grant, Lisa

    2008-01-01

    We are using the QuakeSim environment to model interacting fault systems. One goal of QuakeSim is to prepare for the large volumes of data that spaceborne missions such as DESDynI will produce. QuakeSim has the ability to ingest distributed heterogeneous data in the form of InSAR, GPS, seismicity, and fault data into various earthquake modeling applications, automating the analysis when possible. Virtual California simulates interacting faults in California. We can compare output from long time history Virtual California runs with the current state of strain and the strain history in California. In addition to spaceborne data, we will begin assimilating data from UAVSAR airborne flights over the San Francisco Bay Area, the Transverse Ranges, and the Salton Trough. Results of the models are important for understanding future earthquake risk and for providing decision support following earthquakes. Improved models require this sensor web of different data sources, and a modeling environment for understanding the combined data.

  17. OCAM - A CELSS modeling tool: Description and results. [Object-oriented Controlled Ecological Life Support System Analysis and Modeling

    Science.gov (United States)

    Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray

    1992-01-01

    Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
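
    As an aside for readers unfamiliar with this style of model, a minimal sketch of the container/converter/gate idea is given below; the class names, the carbon-only balance, and all rates are illustrative assumptions, not the actual OCAM implementation.

```python
# Hypothetical sketch of a container/converter/gate mass balance, not actual OCAM code.

class Container:
    """Stores a mass inventory of a single element (e.g., carbon, in kg)."""
    def __init__(self, name, mass):
        self.name, self.mass = name, mass

class Converter:
    """Moves mass from one container to another at a daily rate (kg/day)."""
    def __init__(self, source, sink, rate):
        self.source, self.sink, self.rate = source, sink, rate

    def step(self):
        moved = min(self.rate, self.source.mass)  # a gate: cannot move more than is stored
        self.source.mass -= moved
        self.sink.mass += moved

# Illustrative daily carbon loop: crew respires biomass carbon to CO2, plants fix it back.
co2 = Container("cabin CO2 (C)", 5.0)
biomass = Container("plant biomass (C)", 50.0)
crew = Converter(biomass, co2, rate=1.0)     # food consumed and respired
plants = Converter(co2, biomass, rate=1.0)   # photosynthetic fixation

for day in range(30):
    crew.step()
    plants.step()

print(co2.mass, biomass.mass)  # closed loop: total carbon is conserved
```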

  18. Sandia PUF Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
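
    A minimal sketch of the inter-chip Hamming distance metric mentioned above is given below, using synthetic binary signatures; it illustrates the metric only and is not the Sandia tool's code.

```python
# Illustrative inter-chip Hamming distance for binary PUF signatures (synthetic data,
# not the Sandia tool's implementation).
import itertools
import numpy as np

rng = np.random.default_rng(0)
signatures = rng.integers(0, 2, size=(5, 128))  # 5 chips, 128-bit signatures

def hamming(a, b):
    """Fraction of differing bits between two equal-length binary signatures."""
    return np.mean(a != b)

# Ideal PUFs show ~0.5 fractional inter-chip distance (signatures look independent).
for i, j in itertools.combinations(range(len(signatures)), 2):
    print(f"chip {i} vs chip {j}: {hamming(signatures[i], signatures[j]):.3f}")
```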

  19. Bottom-Up Energy Analysis System (BUENAS). An international appliance efficiency policy tool

    Energy Technology Data Exchange (ETDEWEB)

    McNeil, M.A.; Letschert, V.E.; De la Rue du Can, S.; Ke, Jing [Lawrence Berkeley National Laboratory LBNL, 1 Cyclotron Rd, Berkeley, CA (United States)

    2013-02-15

    The Bottom-Up Energy Analysis System (BUENAS) calculates potential energy and greenhouse gas emission impacts of efficiency policies for lighting, heating, ventilation, and air conditioning, appliances, and industrial equipment through 2030. The model includes 16 end use categories and covers 11 individual countries plus the European Union. BUENAS is a bottom-up stock accounting model that predicts energy consumption for each type of equipment in each country according to engineering-based estimates of annual unit energy consumption, scaled by projections of equipment stock. Energy demand in each scenario is determined by equipment stock, usage, intensity, and efficiency. When available, BUENAS uses sales forecasts taken from country studies to project equipment stock. Otherwise, BUENAS uses an econometric model of household appliance uptake developed by the authors. Once the business as usual scenario is established, a high-efficiency policy scenario is constructed that includes an improvement in the efficiency of equipment installed in 2015 or later. Policy case efficiency targets represent current 'best practice' and include standards already established in a major economy or well-defined levels known to enjoy a significant market share in a major economy. BUENAS calculates energy savings according to the difference in energy demand in the two scenarios. Greenhouse gas emission mitigation is then calculated using a forecast of electricity carbon factor. We find that mitigation of 1075 mt annual CO2 emissions is possible by 2030 from adopting current best practices of appliance efficiency policies. This represents a 17 % reduction in emissions in the business as usual case in that year.
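
    The core stock-accounting arithmetic described above can be sketched in a few lines; all numbers below are hypothetical placeholders, not BUENAS inputs or results.

```python
# Minimal sketch of bottom-up stock accounting as described above (all numbers are
# hypothetical; BUENAS itself uses country-specific stock and efficiency projections).

stock = 2_000_000           # units of equipment installed in a given year
uec_bau = 350.0             # unit energy consumption, business-as-usual (kWh/yr)
uec_policy = 280.0          # unit energy consumption under the efficiency policy (kWh/yr)
carbon_factor = 0.5e-3      # electricity carbon factor (t CO2 per kWh)

demand_bau = stock * uec_bau
demand_policy = stock * uec_policy
savings_kwh = demand_bau - demand_policy            # energy savings
mitigation_t = savings_kwh * carbon_factor          # CO2 mitigation (t/yr)

print(f"savings: {savings_kwh / 1e6:.0f} GWh/yr, mitigation: {mitigation_t / 1e3:.0f} kt CO2/yr")
```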

  20. A Multi-disciplinary Tool for Space Launch Systems Propulsion Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — An accurate predictive capability of coupled fluid-structure interaction in propulsion system is crucial in the development of NASA's new Space Launch System (SLS)....

  1. Passivity and Dissipativity as Design and Analysis Tools for Networked Control Systems

    Science.gov (United States)

    Yu, Han

    2012-01-01

    In this dissertation, several control problems are studied that arise when passive or dissipative systems are interconnected and controlled over a communication network. Since communication networks can impact the systems' stability and performance, there is a need to extend the results on control of passive or dissipative systems to networked…

  2. Network Analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish Water Management System.

    Science.gov (United States)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-03-30

    New insights into the sustainable use of natural resources in human systems can be gained through comparison with ecosystems via common indices. In both kinds of system, resources are processed by a number of users within a network, but we consider ecosystems as the only ones displaying sustainable patterns of growth and development. This study aims at using Network Analysis (NA) to move this "ecosystem perspective" from theory into practice. A Danish municipal Water Management System (WMS) is used as a case study to test the NA methodology and to discuss its generic applicability. We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices of system-level performance for seven different network configurations illustrating past conditions (2004-2008) and future scenarios (2015 and 2020). We also computed the same indices for 24 other human systems and for 12 ecosystems, using information from the existing scientific literature on NA. The comparison of these results reveals that the WMS is similar to the other human systems and that human systems generally differ from ecosystems. The WMS is highly efficient at processing the water resource, but its rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future performance of the WMS are marginal. We argue that future interventions should create alternative pathways for reusing rainwater within the WMS, increasing its potential to withstand the occurrence of flooding. We discuss advantages, limitations, and general applicability of NA as a tool for assessing environmental sustainability in human systems.
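
    For readers unfamiliar with NA indices, the sketch below computes one extensive index (total system throughput) and one intensive index (average mutual information) from a small synthetic flow matrix; the paper's exact index set may differ.

```python
# Sketch of two common Network Analysis indices computed from a flow matrix T,
# where T[i, j] is the flow from user i to user j (synthetic water flows).
import numpy as np

T = np.array([[0.0, 8.0, 0.0],
              [0.0, 0.0, 6.0],
              [1.0, 0.0, 0.0]])      # e.g., waterworks -> households -> treatment

tst = T.sum()                        # Total System Throughput (extensive index)
row = T.sum(axis=1, keepdims=True)   # total outflow of each node
col = T.sum(axis=0, keepdims=True)   # total inflow of each node

# Average Mutual Information (intensive index): how constrained the flow structure is.
with np.errstate(divide="ignore", invalid="ignore"):
    terms = (T / tst) * np.log2(T * tst / (row * col))
ami = np.nansum(terms)               # zero flows contribute nothing

print(f"TST = {tst:.1f}, AMI = {ami:.2f} bits")
```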

  3. Real-time systems design and analysis tools for the practitioner

    CERN Document Server

    Laplante, Phillip A

    2012-01-01

    An important resource, this book offers an introduction and overview of real-time systems: systems where timeliness is a crucial part of the correctness of the system. It contains a pragmatic overview of key topics (computer architecture and organization, operating systems, software engineering, programming languages, and compiler theory) from the perspective of the real-time systems designer and is organized into chapters that are essentially self-contained. In addition, each chapter contains both basic and more challenging exercises that will help the reader to confront actual problems.

  4. The Isotropic Fractionator as a Tool for Quantitative Analysis in Central Nervous System Diseases.

    Science.gov (United States)

    Repetto, Ivan E; Monti, Riccardo; Tropiano, Marta; Tomasi, Simone; Arbini, Alessia; Andrade-Moraes, Carlos-Humberto; Lent, Roberto; Vercelli, Alessandro

    2016-01-01

    A major aim in quantitative and translational neuroscience is a precise and fast neuronal counting method that works at high-throughput scale and yields reliable results. Here, we tested the isotropic fractionator (IF) method for evaluating neuronal and non-neuronal cell loss in different models of central nervous system (CNS) pathologies. Sprague-Dawley rats underwent: (i) ischemic brain damage; (ii) intraperitoneal injection with kainic acid (KA) to induce epileptic seizures; and (iii) monolateral striatal injection with quinolinic acid (QA) mimicking human Huntington's disease. All specimens were processed with the IF method and cell loss was assessed. The hippocampus from KA-treated rats and the striatum from QA-treated rats were carefully dissected using a dissection microscope and a rat brain matrix. Ischemic rat brain slices were first processed for TTC staining and then for IF. In the ischemic group the cell loss corresponded to the neuronal loss, suggesting that hypoxia primarily affects neurons. Combining IF with TTC staining, we could correlate the volume of the lesion to the neuronal loss; by IF, we could establish that neuronal loss also occurs contralaterally to the ischemic side. In the epileptic group we observed a reduction of neuronal cells in treated rats, but also evaluated the changes in the number of non-neuronal cells in response to the hippocampal damage. In the QA model, there was a robust reduction of neuronal cells in the ipsilateral striatum. This neuronal cell loss was not accompanied by a drastic change in the total number of cells, being offset by the increase in non-neuronal cells, thus suggesting that excitotoxic damage in the striatum strongly activates inflammation and glial proliferation. We concluded that the IF method could represent a simple and reliable quantitative technique to evaluate the effects of experimental lesions mimicking human diseases, and to consider the neuroprotective/anti-inflammatory effects of different treatments in the whole

  5. Wavelet Transform - A New Tool for Analysis of Harmonics in Power Systems

    Directory of Open Access Journals (Sweden)

    Osipov D. S.

    2016-01-01

    Full Text Available This paper presents a review of the application of the discrete wavelet transform to power system transient analysis. Based on a discrete time-domain approximation, system components such as resistors and inductors are modeled in the discrete wavelet domain for the purposes of transient and steady-state analysis. Numerical results are given for the transient inductor model, which can be applied to any kind of power system, covering both normal and emergency operating modes.
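
    A minimal illustration of the underlying technique, assuming the PyWavelets library and an arbitrary choice of wavelet and decomposition depth (neither is taken from the paper):

```python
# Illustrative multi-level DWT of a distorted power-system waveform using PyWavelets
# (the choice of db4 and 4 levels is an assumption, not taken from the paper).
import numpy as np
import pywt

fs = 3200                      # sampling rate (Hz), 64 samples per 50 Hz cycle
t = np.arange(0, 0.2, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 250 * t)  # 5th harmonic
signal[320:] += 0.5 * np.exp(-(t[320:] - t[320]) / 0.01)                 # switching transient

coeffs = pywt.wavedec(signal, "db4", level=4)  # [cA4, cD4, cD3, cD2, cD1]
for name, c in zip(["A4", "D4", "D3", "D2", "D1"], coeffs):
    print(f"{name}: {len(c)} coeffs, energy = {np.sum(c**2):.2f}")
```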

  6. Bifurcation Tools for Flight Dynamics Analysis and Control System Design Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the project is the development of a computational package for bifurcation analysis and advanced flight control of aircraft. The development of...

  7. MacCAD (Macintosh Computer Aided Design), Computer Aided Design Tool for System Analysis.

    Science.gov (United States)

    1987-12-01

    [Abstract not recoverable: only fragments of the MacCAD user interface survive in the source record, including the controls "Change from System Group", "Forward Path", "Feedback Path", "Motor Circuit", "Velocity", "Load", and "Type of feedback".]

  8. Advanced Numerical Tools for Design and Analysis of In-Space, Valve and Feed Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In-space valves for the main fuel and oxidizer feed systems are required to provide precise control, wide throttling range and handle rapid on-off control. These...

  9. Dynamic Analysis of Condenser Assembly of Automobile Air Conditioning System Using CAE Tools

    Science.gov (United States)

    Singh, M.; Singh, D.; Saini, J. S.

    2013-04-01

    With the automotive air-conditioning industry aiming at higher levels of quality, cost-effectiveness and a short time to market, the need for simulation is at an all-time high. In the present work, the use of dynamic analysis is proposed for simulating the automobile air-conditioning condenser assembly under vibration loads. The condenser assembly was analyzed using the standard testing conditions. The dynamic analysis revealed that components of the condenser assembly may fail due to resonance. Thereafter, the condenser assembly was optimized, resulting in a 2 % reduction in mass.

  10. Analysis techniques for multivariate root loci. [a tool in linear control systems

    Science.gov (United States)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

    Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
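
    The flavor of such a locus computation can be sketched numerically; the sweep below uses a simple closed-loop matrix A - kBC for an invented system, whereas the paper's algorithm uses a generalized eigenvalue formulation to obtain angles and sensitivities as well.

```python
# Minimal multivariable root-locus sweep: closed-loop poles are the eigenvalues of
# A - k*B*C as the scalar loop gain k increases (illustrative system only).
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # open-loop dynamics
B = np.array([[0.0], [1.0]])               # input matrix
C = np.array([[1.0, 0.0]])                 # output matrix

for k in [0.0, 1.0, 10.0, 100.0]:
    poles = np.linalg.eigvals(A - k * (B @ C))
    print(f"k = {k:6.1f}: poles = {np.round(poles, 3)}")
```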

  11. SymGF: a symbolic tool for quantum transport analysis and its application to a double quantum dot system.

    Science.gov (United States)

    Feng, Zimin; Sun, Qing-feng; Wan, Langhui; Guo, Hong

    2011-10-19

    We report the development and an application of a symbolic tool, called SymGF, for analytical derivations of quantum transport properties using the Keldysh nonequilibrium Green's function (NEGF) formalism. The inputs to SymGF are the device Hamiltonian in the second quantized form, the commutation relation of the operators and the truncation rules of the correlators. The outputs of SymGF are the desired NEGF that appear in the transport formula, in terms of the unperturbed Green's function of the device scattering region and its coupling to the device electrodes. For complicated transport analysis involving strong interactions and correlations, SymGF provides significant assistance in analytical derivations. Using this tool, we investigate coherent quantum transport in a double quantum dot system where strong on-site interaction exists in the side-coupled quantum dot. Results obtained by the higher-order approximation and Hartree-Fock approximation are compared. The higher-order approximation reveals Kondo resonance features in the density of states and conductances. Results are compared both qualitatively and quantitatively to the experimental data reported in the literature.
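
    The kind of symbolic operator algebra that SymGF automates can be illustrated with SymPy's quantum module; the toy Hamiltonian below is an invented example and does not reproduce SymGF itself or its syntax.

```python
# Illustration of the kind of symbolic operator algebra SymGF automates, using SymPy's
# quantum module (not SymGF itself, just the same flavor of manipulation).
from sympy.physics.quantum import Commutator, Dagger, Operator

d = Operator("d")            # annihilation operator of a dot level (symbolic)
H = Dagger(d) * d            # a toy one-level Hamiltonian, H = d†d

# Expand the commutator [d, H] that appears in equation-of-motion derivations.
comm = Commutator(d, H)
print(comm.doit().expand())  # -> d*Dagger(d)*d - Dagger(d)*d*d
```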

  12. Microfluidic housing system: a useful tool for the analysis of dye-sensitized solar cell components

    Science.gov (United States)

    Sacco, A.; Lamberti, A.; Pugliese, D.; Chiodoni, A.; Shahzad, N.; Bianco, S.; Quaglio, M.; Gazia, R.; Tresso, E.; Pirri, C. F.

    2012-11-01

    To understand the behavior of the different dye-sensitized solar cell (DSC) components, an in-situ analysis would be of fundamental help, but it cannot be performed without compromising the integrity of the cell. Our recently proposed microfluidic approach for the fabrication of DSCs is based on a reversible sealing of the two transparent electrodes, allowing easy assembly and disassembly of the cell and thus making an analysis of the components over time possible. The aim of this work is not to investigate the different degradation mechanisms of a standard DSC: we want to show that, by using a microfluidic architecture, it is possible to perform a non-destructive analysis and to monitor the photoanode and counter-electrode properties during their lifetime. Morphological (field emission scanning electron microscopy), wetting (contact angle), optical (UV-visible spectroscopy) and electrical (current-voltage and electrochemical impedance spectroscopy measurements under standard AM1.5G illumination) characterizations have been performed over a period of three weeks. The results show how the variation of the wetting and morphological properties at the counter electrode, and of the dye absorbance at the photoanode, are strongly related to the decrease in cell performance evidenced by the electrical characterization, thus demonstrating the effectiveness of our structure in this kind of study.

  13. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  14. Design and analysis tool validation

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.

    1981-07-01

    The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy that could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts and have been implemented on a sampling of the major BEAS; results have shown major problems in one of the BEAS tested. Furthermore, when one building design was run using several of the BEAS, large differences were found in the predicted annual cooling and heating loads. The empirical validation procedure has been developed, and five two-zone test cells have been constructed for validation; a summer validation run will take place as soon as the data acquisition system is completed. Additionally, a test validation exercise is now in progress using the low-cal house to fine-tune the empirical validation procedure and better define monitoring data requirements.

  15. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  16. Geographic Information System and tools of spatial analysis in a pneumococcal vaccine trial

    Directory of Open Access Journals (Sweden)

    Tanskanen Antti

    2012-01-01

    Full Text Available Abstract Background The goal of this Geographic Information System (GIS) study was to obtain accurate information on the locations of study subjects, the road network and services for research purposes, so that the clinical outcomes of interest (e.g., vaccine efficacy, burden of disease, nasopharyngeal colonization and its reduction) could be linked to and analyzed with respect to distance from health centers, hospitals, doctors and other important services. The location information can also be used to investigate crowding, herd immunity and/or transmission patterns more accurately. Method A randomized, placebo-controlled, double-blind trial of an 11-valent pneumococcal conjugate vaccine (11PCV) was conducted in Bohol Province in the central Philippines, from July 2000 to December 2004. We collected information on the geographic location of the households (N = 13,208) of the study subjects. We also collected a total of 1982 locations of health and other services in the six municipalities and comprehensive GIS data on the road network in the area. Results We calculated the numbers of other study subjects (vaccine and placebo recipients, respectively) within the neighborhood of each study subject. We calculated distances to different services and identified the subjects sharing the same services (determined by distance). This article shows how to collect a complete GIS data set for a human-to-human transmitted vaccine study in developing-country settings in an efficient and economical way. Conclusions The collection of geographic locations in intervention trials should become a routine task. The results of public health research may depend strongly on spatial relationships among the study subjects and between the study subjects and the environment, both natural and infrastructural. Trial registration number ISRCTN: ISRCTN62323832
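
    One elementary building block of such analyses, the distance from each household to the nearest service, can be sketched as below with synthetic coordinates; the study itself used road-network distances rather than straight-line ones.

```python
# Sketch of a basic GIS building block: great-circle (haversine) distance from each
# household to the nearest health center (coordinates are synthetic).
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

households = [(9.85, 124.14), (9.65, 124.37)]
health_centers = [(9.80, 124.20), (9.60, 124.40)]

for h in households:
    d = min(haversine_km(*h, *c) for c in health_centers)
    print(f"household {h}: nearest center {d:.1f} km")
```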

  17. Performance analysis of GYRO: a tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Candy, J [General Atomics, PO Box 85608, San Diego, CA 92186-5608 (United States); Carrington, L [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Huck, K [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Kaiser, T [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Mahinthakumar, G [Department of Civil Engineering, North Carolina State University, Raleigh, NC 27695-7908 (United States); Malony, A [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Moore, S [Innovative Computing Laboratory, University of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Reed, D [Renaissance Computing Institute, University of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States); Roth, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Shan, H [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Shende, S [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Snavely, A [San Diego Supercomputer Center, Univ. of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Sreepathi, S [Dept. of Computer Science, North Carolina State Univ., Raleigh, NC 27695-7908 (United States); Wolf, F [Innovative Computing Lab., Univ. of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Zhang, Y [Renaissance Computing Inst., Univ. of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States)

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  18. Performance Analysis of GYRO: A Tool Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  19. Graphical Multiprocessing Analysis Tool (GMAT)

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M.K.; Campbell, S.; Sikora, S.; Strout, R.; Zosel, M.

    1988-03-01

    The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development we have developed two methodologies for the graphical display of the execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states, and timeline, which represents tasks as a flowing sequence of events. Information about the code can be displayed as the application runs (dynamic mode) or played back with time under user control (static mode). This document discusses the design and user interface issues involved in developing the GMAT family of parallel application displays. Also, we present an introductory user's guide for both tools. 4 figs.

  20. User's Guide and Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information

    Science.gov (United States)

    ABSTRACT: User's Guide & Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information (EPA/601/B-15/001, 2015, 123 pages). Henry Lee II, U.S. EPA, Western Ecology Division; Katharine Marko, U.S. EPA,...

  1. A Simulation of Energy Storage System for Improving the Power System Stability with Grid-Connected PV using MCA Analysis and LabVIEW Tool

    Directory of Open Access Journals (Sweden)

    Jindrich Stuchly

    2015-01-01

    Full Text Available The large-scale penetration of distributed renewable power plants requires the transfer of large amounts of energy, which puts a high strain on the energy delivery infrastructure. In particular, photovoltaic power plants supply energy with high intermittency, possibly affecting the stability of the grid by changing the voltage at the plant connection point. In this contribution, we summarize the main negative effects of a selected, real grid-connected photovoltaic plant in operation. A review of energy storage systems suitable for mitigating these negative effects was then carried out, and the options were compared and evaluated using multi-criteria analysis. Based on this analysis, data collected at the plant and the grid are used to design the energy storage systems that support the connection of the plant to the grid. The cooperation of these systems is then analysed and evaluated using simulation tools created in LabVIEW for this purpose. The simulation results demonstrate the capability of energy storage system solutions to significantly reduce the negative feedback effects of the photovoltaic power plant on the low-voltage grid.
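
    A weighted-sum multi-criteria analysis of storage options, in the spirit described above, can be sketched as follows; the criteria, weights, and scores are invented for illustration and do not reproduce the paper's evaluation.

```python
# Hypothetical weighted-sum multi-criteria analysis (MCA) for ranking energy storage
# options (criteria, weights, and scores are invented for illustration).
criteria_weights = {"efficiency": 0.3, "cost": 0.3, "lifetime": 0.2, "response": 0.2}

# Scores normalized to [0, 1], higher is better (cost already inverted).
options = {
    "Li-ion battery": {"efficiency": 0.9, "cost": 0.5, "lifetime": 0.6, "response": 0.9},
    "Lead-acid":      {"efficiency": 0.7, "cost": 0.8, "lifetime": 0.4, "response": 0.8},
    "Flywheel":       {"efficiency": 0.8, "cost": 0.4, "lifetime": 0.9, "response": 1.0},
}

ranking = sorted(
    ((sum(w * options[o][c] for c, w in criteria_weights.items()), o) for o in options),
    reverse=True,
)
for score, name in ranking:
    print(f"{name}: {score:.2f}")
```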

  2. A Tool and Methodology for AC-Stability Analysis of Continuous-Time Closed-Loop Systems

    CERN Document Server

    Milev, Momchil

    2011-01-01

    Presented are a methodology and a DFII-based tool for AC-stability analysis of a wide variety of closed-loop continuous-time circuits (operational amplifiers and other linear circuits). The methodology allows for easy identification and diagnosis of AC-stability problems, including not only main-loop effects but also local-instability loops in current mirrors, bias circuits and emitter or source followers, without breaking the loop. The results of the analysis are easy to interpret. The estimated phase margin is readily available. Instability nodes and loops, along with their respective oscillation frequencies, are immediately identified and mapped to the existing circuit nodes, thus offering significant advantages compared to traditional "black-box" methods of stability analysis (transient overshoot, Bode and phase margin plots, etc.). The tool for AC-stability analysis is written in SKILL and is fully integrated in the DFII environment. Its "push-button" graphical user interface (GUI) is easy to use and understand. The t...

  3. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  4. Design Tool for Cryogenic Thermal Insulation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Demko, Jonathan A [ORNL; Fesmire, J. E. [NASA Kennedy Space Center, Kennedy Space Center, Florida; Augustynowicz, S. D. [Sierra Lobo Inc., Kennedy Space Center, Florida

    2008-01-01

    Thermal isolation of low-temperature systems from ambient environments is a constant issue faced by practitioners of cryogenics. For energy-efficient systems and processes to be realized, thermal insulation must be considered as an integrated system, not merely an add-on element. A design tool to determine the performance of insulation systems for comparative trade-off studies of different available material options was developed. The approach is to apply thermal analysis to standard shapes (plane walls, cylinders, spheres) that are relatively simple to characterize with a one-dimensional analytical or numerical model. The user describes the system hot and cold boundary geometry and the operating environment. Basic outputs such as heat load and temperature profiles are determined. The user can select from a built-in insulation material database or input user defined materials. Existing information has been combined with the new experimental thermal conductivity data produced by the Cryogenics Test Laboratory for cryogenic and vacuum environments, including high vacuum, soft vacuum, and no vacuum. Materials in the design tool include multilayer insulation, aerogel blankets, aerogel bulk-fill, foams, powders, composites, and other insulation system constructions. A comparison of the design tool to a specific composite thermal insulation system is given.
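
    The kind of standard-shape model the tool evaluates can be illustrated with the textbook formula for radial conduction through a cylindrical shell; the conductivity and geometry below are placeholders, not Cryogenics Test Laboratory data.

```python
# One-dimensional heat leak through a cylindrical insulation layer, the kind of
# standard-shape model the design tool evaluates (values below are placeholders).
import math

def cylinder_heat_leak(k, length, r_inner, r_outer, t_cold, t_hot):
    """Radial conduction through a cylindrical shell: Q = 2*pi*k*L*(Th - Tc)/ln(ro/ri)."""
    return 2.0 * math.pi * k * length * (t_hot - t_cold) / math.log(r_outer / r_inner)

# Example: 1 m of pipe at 77 K in a 300 K ambient, 25 mm insulation blanket.
q = cylinder_heat_leak(k=1e-3,         # effective thermal conductivity (W/m-K), assumed
                       length=1.0,     # m
                       r_inner=0.05,   # m
                       r_outer=0.075,  # m
                       t_cold=77.0, t_hot=300.0)
print(f"heat leak: {q:.2f} W per meter of pipe")
```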

  5. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different levels of detail, from simple back-of-the-envelope calculations to more complex numerical analysis, are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.

  6. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that merges the data from each cube is as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used; when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of the cubes in the accident area, so that individual cubes would provide local photogrammetry calibration for objects near a cube.
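
    The merging step can be posed as a rigid-body fit between corresponding points; the sketch below uses the standard Kabsch/SVD method on synthetic points, as one plausible realization of the procedure (the actual software may differ).

```python
# Sketch of the coordinate-system merging step: fit the rigid transform (rotation R,
# translation t) mapping points in cube 2's frame into cube 1's frame, given
# corresponding marked points (Kabsch/SVD method; data here is synthetic).
import numpy as np

def rigid_fit(p_src, p_dst):
    """Return R, t minimizing ||R @ p_src + t - p_dst|| over corresponding points."""
    c_src, c_dst = p_src.mean(axis=0), p_dst.mean(axis=0)
    h = (p_src - c_src).T @ (p_dst - c_dst)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, c_dst - r @ c_src

rng = np.random.default_rng(1)
pts_cube2 = rng.uniform(0, 1, size=(4, 3))       # points in cube 2's frame
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
true_r = q if np.linalg.det(q) > 0 else -q       # a proper rotation
pts_cube1 = pts_cube2 @ true_r.T + np.array([2.0, 0.5, 1.0])

R, t = rigid_fit(pts_cube2, pts_cube1)
print(np.allclose(pts_cube2 @ R.T + t, pts_cube1))  # True
```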

  7. Flow Injection/Sequential Injection Analysis Systems: Potential Use as Tools for Rapid Liver Diseases Biomarker Study

    Directory of Open Access Journals (Sweden)

    Supaporn Kradtap Hartwell

    2012-01-01

    Full Text Available Flow injection/sequential injection analysis (FIA/SIA) systems are suitable for carrying out automated wet chemical/biochemical reactions with reduced volume and time consumption. Various parts of the system, such as the pump, valve, and reactor, may be built or adapted from available materials, so the systems can be assembled at lower cost than other instrumentation-based analysis systems. Their applications for the determination of biomarkers for liver diseases have been demonstrated in various formats of operation, but only a few and limited types of biomarkers have been used as model analytes. This paper summarizes these applications for different types of reactions as a guide for using flow-based systems in more biomarker and/or multi-biomarker studies.

  8. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments can pose great hazard and risk to the ground and to people. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on the topic of debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. object-oriented and spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools included objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the set of object shapes to 15 types, with 51 predefined motions and the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.

  9. Simulating the Farm Production System Using the MONARC Simulation Tool

    Institute of Scientific and Technical Information of China (English)

    Y. Wu; I. C. Legrand; et al.

    2001-01-01

    The simulation program developed by the "Models of Networked Analysis at Regional Centers" (MONARC) project is a powerful and flexible tool for simulating the behavior of large-scale distributed computing systems. In this study, we further validate this simulation tool in a large-scale distributed farm computing system. We also report the use of this simulation tool to identify the bottlenecks and limitations of our farm system.

  10. System of Objectified Judgement Analysis (SOJA) as a tool in rational and transparent drug-decision making.

    Science.gov (United States)

    Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob

    2007-10-01

    Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug-selection was developed. This article describes how the information on which the SOJA process is based, was researched and processed.

  11. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  12. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  13. Functional analysis, a resilience improvement tool applied to a waste management system – application to the "household waste management chain"

    Directory of Open Access Journals (Sweden)

    H. Beraud

    2012-12-01

    Full Text Available A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving its resilience, i.e., enabling it to maintain or recover acceptable operating levels after flooding, is therefore essential. To achieve this, we must understand how the system works in order to bring any potential dysfunctions to light and take preventive measures. Functional analysis has been used for understanding the complexity of this type of system. The purpose of this article is to show the value of this type of method, and the limits of its use, for improving the resilience of waste management systems as well as other urban technical systems1, by means of theoretical modelling and its application to a study site.


    1 In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  14. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters, in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool, with dry abrasive at its front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
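
    As a point of reference, a standard form for such removal models is Preston's equation; the sketch below uses it with invented process values and is not the paper's fitted empirical model.

```python
# Illustrative material-removal estimate using Preston's equation, dz/dt = Kp * p * v,
# a standard form for abrasive polishing (coefficient and process values are invented).
preston_k = 5e-13        # Preston coefficient (m^2/N), material/abrasive dependent
pressure = 2.0e4         # contact pressure (Pa)
velocity = 0.8           # relative tool-part speed (m/s)
dwell_time = 30.0        # time the tool footprint dwells on the spot (s)

removal_depth = preston_k * pressure * velocity * dwell_time   # meters
print(f"removed depth: {removal_depth * 1e6:.2f} um")
```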

  15. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
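
    Once the attack graph is built and its edges weighted, finding a high-risk path reduces to a shortest-path query; the sketch below shows that step on an invented graph using networkx, not the patented generator itself.

```python
# Sketch of the path-finding step on a weighted attack graph: nodes are attack states,
# edge weights model attacker effort, and the lowest-effort path is a shortest-path
# query (graph and weights are invented).
import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([
    ("outside", "dmz_web_shell", 3.0),
    ("outside", "phished_user", 1.5),
    ("dmz_web_shell", "internal_host", 4.0),
    ("phished_user", "internal_host", 2.0),
    ("internal_host", "domain_admin", 5.0),
])

path = nx.dijkstra_path(g, "outside", "domain_admin")
effort = nx.dijkstra_path_length(g, "outside", "domain_admin")
print(" -> ".join(path), f"(total effort {effort})")
```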

  16. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques should be applied, while multiprocessing simultaneous users' tasks. Future development includes expanding the types of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System reanalysis data (CFSR) and NOAA model output data, including output from the National Multi-Model Ensemble Prediction System (NMME) and longer-term projection models, as well as plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
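
    One of the named capabilities, trend analysis, can be illustrated in a few lines; the series below is synthetic, whereas LCAT itself draws on station and climate-division records.

```python
# Minimal example of an LCAT-style analysis: a linear trend fit to a local annual
# temperature series (synthetic data).
import numpy as np
from scipy import stats

years = np.arange(1980, 2011)
rng = np.random.default_rng(42)
temps = 12.0 + 0.02 * (years - years[0]) + rng.normal(0, 0.3, years.size)  # deg C

fit = stats.linregress(years, temps)
print(f"trend: {fit.slope * 10:.2f} deg C/decade (p = {fit.pvalue:.3f})")
```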

  17. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices......: it is highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future...

  18. Remote Sensing Image Analysis Without Expert Knowledge - A Web-Based Classification Tool On Top of Taverna Workflow Management System

    Science.gov (United States)

    Selsam, Peter; Schwartze, Christian

    2016-10-01

    Providing software solutions via the internet has been known for quite some time and is now an increasing trend marketed as "software as a service". Many business units embrace the new methods and streamlined IT strategies by offering web-based infrastructures for external software usage - but geospatial applications featuring very specialized services or functionalities on demand are still rare. Originally applied in desktop environments, the ILMSimage tool for remote sensing image analysis and classification was modified in its communication structures and enabled to run on a high-power server, benefiting from Taverna software. On top, a GIS-like, web-based user interface guides the user through the different steps in ILMSimage. ILMSimage combines object-oriented image segmentation with pattern recognition features. Basic image elements form a construction set for modeling large image objects with diverse and complex appearance. There is no need for the user to set up detailed object definitions. Training is done by delineating one or more typical examples (templates) of the desired object using a simple vector polygon. The template can be large and does not need to be homogeneous. The template is completely independent of the segmentation. The object definition is done entirely by the software.

  19. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    Science.gov (United States)

    Chin, Jeffrey C.; Csank, Jeffrey T.

    2016-01-01

    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on transient engine performance earlier in the design cycle.

  20. Using the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk is a combination of existing presentations and material: the system user guide and technical documentation; a GMAT basics and overview; and technical presentations from the TESS project on its application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.

  1. General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Overview: GMAT is a feature-rich system containing high-fidelity space system models, optimization and targeting, built-in scripting and programming infrastructure, and...

  2. Rotary fast tool servo system and methods

    Science.gov (United States)

    Montesanti, Richard C.; Trumper, David L.

    2007-10-02

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. A pair of position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  3. FAILURE MODE EFFECTS AND CRITICALITY ANALYSIS (FMECA) AS A QUALITY TOOL TO PLAN IMPROVEMENTS IN ULTRASONIC MOULD CLEANING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Cristiano Fragassa

    2016-12-01

    Full Text Available Within the complex process used for tire production, ultrasonic cleaning probably represents the best solution for preserving the functionality of tire moulds, removing residuals from the moulds while keeping the quality of their surfaces unaltered. An Ultrasonic Mould Cleaning System (UMCS) is, however, a complicated technology that combines ultrasonic waves, high temperature and a succession of acid and basic attacks. At the same time, a UMCS plant, as part of a long production chain, has to guarantee the highest productivity while reducing failures and maintenance. This article describes the use of Failure Mode, Effects and Criticality Analysis (FMECA) as a methodology for improving quality in the cleaning process. In particular, FMECA was used to identify potential defects in the original plant design, to recognize the root causes of failures that actually occurred during operation and, finally, to suggest definitive re-design actions. The changes were implemented, and the new UMCS offers better quality in terms of higher availability and productivity.
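
    The core FMECA ranking step can be sketched as follows; the failure modes and ratings are invented for illustration and are not taken from the UMCS study.

```python
# Core FMECA ranking step: each failure mode gets a Risk Priority Number,
# RPN = severity x occurrence x detection (1-10 scales). The failure modes and
# ratings below are invented.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("ultrasonic transducer drift", 6, 4, 7),
    ("acid bath concentration out of range", 8, 3, 5),
    ("heater thermostat failure", 7, 2, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN {s * o * d:4d}  {desc}")
```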

  4. Educational Software Tool for Protection System Engineers. Distance Relay

    Directory of Open Access Journals (Sweden)

    Trujillo-Guajardo L.A.

    2012-04-01

    Full Text Available In this article, a graphical software tool aimed at the education of protection system engineers is presented. The theoretical fundamentals used for the design of the operating characteristics of distance relays, and their algorithms, are presented. The software allows the evaluation and analysis of real-time or simulated events at every stage of the design of the distance relay. Some example cases are presented to illustrate the activities that can be carried out with the graphical software tool developed.
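
    The operating decision such a tool visualizes can be sketched numerically: a mho distance element trips when the apparent impedance falls inside a circle set by the zone reach. The settings and measurements below are invented.

```python
# Sketch of a distance (mho) relay operating decision: the relay trips when the
# apparent impedance Z = V/I falls inside a circle through the origin whose diameter
# is the zone reach Zr (settings and measurements are invented).
v = 63.5e3 * complex(0.9, 0.0)           # measured phase voltage (V)
i = 1.2e3 * complex(0.8, -0.5)           # measured phase current (A)
zr = complex(8.0, 40.0)                  # zone-1 reach impedance (ohms)

z = v / i                                # apparent impedance seen by the relay
inside = abs(z - zr / 2) <= abs(zr / 2)  # mho circle: center Zr/2, radius |Zr|/2
print(f"Z = {z:.1f} ohms -> {'TRIP' if inside else 'restrain'}")
```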

  5. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (Python function)
    - define sample sets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using the internal Levenberg-Marquardt algorithm
    - model calibration using the lmfit package
    - model calibration using the levmar package
    - Markov Chain Monte Carlo using the pymc package
    MATK facilitates model analysis using:
    - scipy - calibration (scipy.optimize)
    - rpy2 - Python interface to R
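
    As an illustration of one listed capability, Latin-Hypercube sampling of a parameter space can be sketched with SciPy's qmc module (shown here instead of MATK's own API; the model and ranges are invented).

```python
# Sketch of Latin-Hypercube parameter sampling, using SciPy's qmc module rather than
# MATK's own API (the forward model and parameter ranges are invented).
import numpy as np
from scipy.stats import qmc

def model(k, s):
    """A stand-in forward model: some scalar response of two parameters."""
    return np.exp(-k) * s

sampler = qmc.LatinHypercube(d=2, seed=7)
unit = sampler.random(n=10)                          # 10 samples in [0, 1)^2
samples = qmc.scale(unit, l_bounds=[0.1, 1.0], u_bounds=[2.0, 5.0])

responses = [model(k, s) for k, s in samples]
print(np.round(responses, 3))
```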

  6. System Identification Tools for Control Structure Interaction

    Science.gov (United States)

    1990-01-01

    [Scanned-form residue; the abstract is not recoverable. Legible fragments identify report AL-TR-89-054, a final report for the period September 1988 to May (year illegible), titled "System Identification Tools for Control Structure Interaction," by Kosut, Robert L. and Kabuli, Guntekin M., with subject terms: system identification, robustness, dynamic modeling, robust control design, control design procedure.]

  7. 2010 Solar Market Transformation Analysis and Tools

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  8. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  9. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful to the polar science community and for constraining the viability of ice resource deposits for human exploration.

  10. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats, the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools.
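
    For a flavour of what SBML-based model exchange looks like in practice, the sketch below loads a model with the python-libsbml bindings and lists its species; the file name is a placeholder, and the snippet is illustrative rather than taken from the chapter.

    ```python
    # Minimal SBML round-trip check with python-libsbml
    # (install with `pip install python-libsbml`). "model.xml" is a placeholder.
    import libsbml

    document = libsbml.readSBML("model.xml")
    if document.getNumErrors() > 0:
        # Report parse/consistency problems instead of proceeding
        document.printErrors()
    else:
        model = document.getModel()
        print(f"model: {model.getId()}")
        print(f"species: {model.getNumSpecies()}, reactions: {model.getNumReactions()}")
        for i in range(model.getNumSpecies()):
            print(" ", model.getSpecies(i).getId())
    ```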

  11. New Tools for Hybrid Systems

    Science.gov (United States)

    2007-05-02

    [Scanned-page residue; the abstract is not recoverable. The legible fragments are reference-list entries from IEEE Transactions on Automatic Control, including A. Megretski on robustness (vol. 48, no. 1, pp. 2-17, January 2003), "Output stabilizability of discrete-event dynamic systems" (vol. 36, no. 8, pp. 925-935, August 1991), and K. Passino, A. Michel and P. Antsaklis, "Lyapunov stability of a class of discrete event systems" (vol. 39).]

  12. Geographical information system (GIS) as a new tool to evaluate epidemiology based on spatial analysis and clinical outcomes in acromegaly.

    Science.gov (United States)

    Naves, Luciana Ansaneli; Porto, Lara Benigno; Rosa, João Willy Corrêa; Casulari, Luiz Augusto; Rosa, José Wilson Corrêa

    2015-02-01

    Geographical information systems (GIS) have emerged as a group of innovative software components useful for projects in epidemiology and planning in health care systems. This is an original study investigating environmental and geographical influences on the epidemiology of acromegaly in Brazil. We aimed to validate a method to link an acromegaly registry with a GIS mapping program, to describe the spatial distribution of patients, to identify disease clusters, and to evaluate whether access to health care could influence the outcome of the disease. Clinical data from 112 consecutive patients were collected, and home addresses were plotted in the GIS software for spatial analysis. The buffer spatial distribution of patients living in Brasilia showed that 38.1% lived 0.33 to 8.66 km, 17.7% 8.67 to 18.06 km, 22.2% 18.07 to 25.67 km, and 22% 25.68 to 36.70 km from the Reference Medical Center (RMC), and no unexpected clusters were identified. Migration of 26 patients from 11 other cities in different regions of the country was observed. Most patients (64%) with adenomas larger than 25 mm lived more than 20 km from the RMC, but no significant correlation was found between the distance from a patient's home to the RMC and tumor diameter (r = 0.45, p = 0.20) or delay in diagnosis (r = 0.43, p = 0.30). The geographical distribution of diagnosed cases did not affect the latency of diagnosis or tumor size, but the recognition of significant migration indicates that improvements in the medical assistance network are needed.

  13. The V-transform: A tool for analysis of control systems robustness with respect to disturbance model uncertainty

    Directory of Open Access Journals (Sweden)

    D. E. Davison

    1998-01-01

    ..., where G(s) is the closed-loop transfer function from disturbance to output. Properties of V-transforms are investigated, and the notions of disturbance gain margin and disturbance bandwidth margin, both measures of robustness with respect to disturbance model uncertainty, are introduced. Using these new tools, it is shown that there is a fundamental robustness performance limitation if the plant has nonminimum-phase zeros, but no such limitation in the minimum-phase case.

  14. Tools for Basic Statistical Analysis

    Science.gov (United States)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive Statistics computes summary statistics for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates calculates the statistical value that corresponds to a cumulative probability, given the sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points extends and generates a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA), either with no replication or as a generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA fits data to the linear equation y = f(x) and performs an ANOVA to check the significance of the fit.
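
    By way of illustration, the sketch below reproduces two of these calculations in Python with scipy; the data values, the mean and standard deviation, and the 0.95 probability level are all invented for the example.

    ```python
    # Two of the toolset's calculations: a normal-distribution estimate (value
    # at a given cumulative probability) and a linear regression with an
    # attached significance test.
    from scipy import stats

    # Normal Distribution Estimate: value x such that P(X <= x) = 0.95
    mean, std = 100.0, 15.0
    x95 = stats.norm.ppf(0.95, loc=mean, scale=std)
    print(f"95th percentile: {x95:.2f}")

    # Linear regression y = f(x) with a p-value for the slope's significance
    x = [1.0, 2.0, 3.0, 4.0, 5.0]
    y = [2.1, 3.9, 6.2, 7.8, 10.1]
    fit = stats.linregress(x, y)
    print(f"slope={fit.slope:.3f}, intercept={fit.intercept:.3f}, p={fit.pvalue:.4f}")
    ```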

  15. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    FaridMheir-El-Saadi; BozenaKaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes impossible to complete. Hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis that solves these problems. Under the proposed framework, circuit analysis is automatic. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that the CPU time and memory requirements of hierarchical analysis are low if heuristics are used during the abstraction process.

  16. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Background: MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein-coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for the analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results: BRM v2.3 can query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 µM nicotine from 6 to 48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets, with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated [...] Conclusions: BRM provides the ability to mine complex data for the identification of candidate miRNAs or pathways that drive phenotypic outcome and is, therefore, a useful hypothesis-generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single...

  17. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  18. Stochastic Simulation Tool for Aerospace Structural Analysis

    Science.gov (United States)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
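
    The Monte Carlo idea described here can be sketched in a few lines: perturb the design inputs according to their scatter and observe the spread in, and the inputs' influence on, a response output. The toy cantilever-deflection response below is invented for illustration and is unrelated to the MSC tool itself.

    ```python
    # Monte Carlo scatter study: propagate input variability to an output and
    # rank the inputs by their correlation with the response.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Input variables with scatter (nominal value, standard deviation)
    load = rng.normal(1000.0, 50.0, n)        # N
    modulus = rng.normal(70e9, 2e9, n)        # Pa
    thickness = rng.normal(0.005, 0.0002, n)  # m

    # Toy response: cantilever tip deflection, delta = P L^3 / (3 E I)
    length, width = 0.5, 0.04
    inertia = width * thickness**3 / 12.0
    deflection = load * length**3 / (3.0 * modulus * inertia)

    print(f"mean deflection: {deflection.mean()*1e3:.2f} mm")
    print(f"std deviation:   {deflection.std()*1e3:.2f} mm")
    for name, var in [("load", load), ("modulus", modulus), ("thickness", thickness)]:
        r = np.corrcoef(var, deflection)[0, 1]
        print(f"corr({name}, deflection) = {r:+.2f}")
    ```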

  19. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  20. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  1. Timeline analysis tools for law enforcement

    Science.gov (United States)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User-funded enhancements and Rome Lab-funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, the results of the initial NYSP evaluation, and the plan for a more comprehensive NYSP evaluation.

  2. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  3. Harnessing VLSI System Design with EDA Tools

    CERN Document Server

    Kamat, Rajanish K; Gaikwad, Pawan K; Guhilot, Hansraj

    2012-01-01

    This book explores various dimensions of EDA technologies for achieving different goals in VLSI system design. Although the scope of EDA is very broad and comprises diversified hardware and software tools to accomplish different phases of VLSI system design, such as design, layout, simulation, testability, prototyping and implementation, this book focuses only on demystifying the code, a.k.a. firmware development and its implementation with FPGAs. Since there are a variety of languages for system design, this book covers various issues related to VHDL, Verilog and System C synergized with EDA tools, using a variety of case studies such as testability, verification and power consumption. * Covers aspects of VHDL, Verilog and Handel C in one text; * Enables designers to judge the appropriateness of each EDA tool for relevant applications; * Omits discussion of design platforms and focuses on design case studies; * Uses design case studies from diversified application domains such as network on chip, hospital on...

  4. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to the mathematical modeling of physical engineering systems. In this paper, connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  5. Geographical information system (GIS) as a new tool to evaluate epidemiology based on spatial analysis and clinical outcomes in acromegaly

    OpenAIRE

    Naves, Luciana Ansaneli; Porto, Lara Benigno; Rosa, João Willy Corrêa; Casulari, Luiz Augusto; Rosa, José Wilson Corrêa

    2013-01-01

    Geographical information systems (GIS) have emerged as a group of innovative software components useful for projects in epidemiology and planning in Health Care System. This is an original study to investigate environmental and geographical influences on epidemiology of acromegaly in Brazil. We aimed to validate a method to link an acromegaly registry with a GIS mapping program, to describe the spatial distribution of patients, to identify disease clusters and to evaluate if the access to Hea...

  6. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe a formalised variant of this language extended to support the addition of intention-preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  7. Systems Thinking Tools for Improving Evidence-Based Practice: A Cross-Case Analysis of Two High School Leadership Teams

    Science.gov (United States)

    Kensler, Lisa A. W.; Reames, Ellen; Murray, John; Patrick, Lynne

    2012-01-01

    Teachers and administrators have access to large volumes of data but research suggests that they lack the skills to use data effectively for continuous school improvement. This study involved a cross-case analysis of two high school leadership teams' early stages of evidence-based practice development; differing forms of external support were…

  8. Software for systems biology: from tools to integrated platforms.

    Science.gov (United States)

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2011-11-03

    Understanding complex biological systems requires extensive support from software tools. Such tools are needed at each step of a systems biology computational workflow, which typically consists of data handling, network inference, deep curation, dynamical simulation and model analysis. In addition, there are now efforts to develop integrated software platforms, so that tools that are used at different stages of the workflow and by different researchers can easily be used together. This Review describes the types of software tools that are required at different stages of systems biology research and the current options that are available for systems biology researchers. We also discuss the challenges and prospects for modelling the effects of genetic changes on physiology and the concept of an integrated platform.

  9. Analysis on the multi-dimensional spectrum of the thrust force for the linear motor feed drive system in machine tools

    Science.gov (United States)

    Yang, Xiaojun; Lu, Dun; Ma, Chengfang; Zhang, Jun; Zhao, Wanhua

    2017-01-01

    The motor thrust force has many harmonic components due to the nonlinearity of the drive circuit and of the motor itself in a linear motor feed drive system. Moreover, during motion these thrust force harmonics may vary with position, velocity, acceleration and load, which affects the displacement fluctuation of the feed drive system. Therefore, in this paper, on the basis of the thrust force spectrum obtained from the Maxwell equations and the electromagnetic energy method, the multi-dimensional variation of each thrust harmonic is analyzed under different motion parameters. A model of the servo system is then established, oriented to dynamic precision, and the influence of the variation of the thrust force spectrum on the displacement fluctuation is discussed. Finally, experiments are carried out to verify the theoretical analysis. It is found that the thrust harmonics show multi-dimensional spectral characteristics under different motion parameters and loads, which should be considered when choosing motion parameters and optimizing servo control parameters in high-speed, high-precision machine tools equipped with linear motor feed drive systems.
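
    As a toy illustration of this kind of spectrum analysis, the sketch below extracts thrust-force harmonics from a synthetic signal with an FFT; the sampling rate, fundamental frequency and harmonic amplitudes are all invented, unlike the analytical spectrum derived in the paper.

    ```python
    # Extract the dominant harmonics of a synthetic thrust-force signal.
    import numpy as np

    fs = 10_000.0                       # sampling rate, Hz
    t = np.arange(0, 1.0, 1.0 / fs)
    f1 = 25.0                           # fundamental thrust-ripple frequency, Hz

    # Synthetic thrust force: mean thrust plus 1st, 2nd and 6th harmonics
    force = (200.0 + 8.0 * np.sin(2 * np.pi * f1 * t)
             + 3.0 * np.sin(2 * np.pi * 2 * f1 * t)
             + 1.5 * np.sin(2 * np.pi * 6 * f1 * t))

    spectrum = np.abs(np.fft.rfft(force)) * 2.0 / len(force)
    freqs = np.fft.rfftfreq(len(force), d=1.0 / fs)

    # Report the three strongest harmonic components above the DC term
    for k in np.argsort(spectrum[1:])[::-1][:3] + 1:
        print(f"{freqs[k]:6.1f} Hz: amplitude {spectrum[k]:.2f} N")
    ```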

  10. General Analysis Tool Box for Controlled Perturbation

    CERN Document Server

    Osbild, Ralf

    2012-01-01

    The implementation of reliable and efficient geometric algorithms is a challenging task. The reason is the following conflict: on the one hand, computing with rounded arithmetic may question the reliability of programs while, on the other hand, computing with exact arithmetic may be too expensive and hence inefficient. One solution is the implementation of controlled perturbation algorithms, which combine the speed of floating-point arithmetic with a protection mechanism that nonetheless guarantees reliability. This paper is concerned with the theoretical performance analysis of controlled perturbation algorithms, a question we answer with the presentation of a general analysis tool box. This tool box is separated into independent components, which are presented individually with their interfaces. In this way, the tool box supports alternative approaches for the derivation of the most crucial bounds; we present three approaches for this task. Furthermore, we have thoroughly reworked the concept of controlled perturbation...

  11. A Survey of Expert System Development Tools

    Science.gov (United States)

    1988-06-01

    [Scanned-table residue; the abstract is not recoverable. Legible fragments name the surveyed tools, including GESBT 4.0 (Generic Expert System Building Tool), GETREE, GLIB, GPSI, GUESS/I, GURU, Hearsay-3 and HPRL, and knowledge-acquisition features such as conflict detection, explicit rule entry, and fact/control entry; Appendix B lists the expert system development tools.]

  12. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  13. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.
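
    A toy version of such a "degree of association" is sketched below as the peak normalized cross-correlation between two simulated surface profiles; the synthetic profiles and the statistic itself are simplifications for illustration, not the project's actual method.

    ```python
    # Compare toolmark profiles via peak normalized cross-correlation.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500

    # Profile from tool A, and a second mark by the same tool (shifted + noisy)
    profile_a = np.cumsum(rng.normal(0, 1, n))
    profile_b = np.roll(profile_a, 20) + rng.normal(0, 0.5, n)

    def degree_of_association(p, q):
        """Peak of the normalized cross-correlation over all lags."""
        p = (p - p.mean()) / p.std()
        q = (q - q.mean()) / q.std()
        xcorr = np.correlate(p, q, mode="full") / len(p)
        return xcorr.max()

    print(f"same tool:      {degree_of_association(profile_a, profile_b):.2f}")
    print(f"different tool: {degree_of_association(profile_a, np.cumsum(rng.normal(0, 1, n))):.2f}")
    ```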

  14. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
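
    To make the BBN idea concrete, the sketch below performs the single evidence update that such a network chains together many times over its monitoring inputs; every probability in it is invented for illustration.

    ```python
    # One Bayesian update: revise P(escalating unrest) after observing one
    # piece of monitoring evidence. All numbers are invented.
    prior = 0.10                   # P(escalating unrest)
    p_evidence_given_h = 0.70      # P(seismic swarm | escalating)
    p_evidence_given_not_h = 0.15  # P(seismic swarm | not escalating)

    # Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1.0 - prior))
    posterior = p_evidence_given_h * prior / p_evidence

    print(f"P(escalating | seismic swarm) = {posterior:.2f}")  # ~0.34
    ```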

  15. Cutting tool form compensation system and method

    Science.gov (United States)

    Barkman, W.E.; Babelay, E.F. Jr.; Klages, E.J.

    1993-10-19

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed. 9 figures.

  16. Cutting tool form compensation system and method

    Science.gov (United States)

    Barkman, William E.; Babelay, Jr., Edwin F.; Klages, Edward J.

    1993-01-01

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed.

  17. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh;

    ..., called ForSyDe. ForSyDe is available under an open-source approach, which allows small and medium enterprises (SMEs) easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  18. [SIGAPS, a tool for the analysis of scientific publications].

    Science.gov (United States)

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. It is based on the analysis of articles indexed in Medline, with a score calculated by taking into account the author's position and the journal's ranking within its disciplinary field. It also offers tools for the bibliometric analysis of scientific production.

  19. Design tools for complex dynamic security systems.

    Energy Technology Data Exchange (ETDEWEB)

    Byrne, Raymond Harry; Rigdon, James Brian; Rohrer, Brandon Robinson; Laguna, Glenn A.; Robinett, Rush D. III (.; ); Groom, Kenneth Neal; Wilson, David Gerald; Bickerstaff, Robert J.; Harrington, John J.

    2007-01-01

    The development of tools for complex dynamic security systems is not a straightforward engineering task but, rather, a scientific task where discovery of new scientific principles and mathematics is necessary. For years, scientists have observed complex behavior but have had difficulty understanding it. Prominent examples include insect colony organization, the stock market, molecular interactions, fractals, and emergent behavior. Engineering such systems will be an even greater challenge. This report explores four tools for engineered complex dynamic security systems: Partially Observable Markov Decision Processes, Percolation Theory, Graph Theory, and Exergy/Entropy Theory. Additionally, enabling hardware technologies for next-generation security systems are described: a 100-node wireless sensor network, an unmanned ground vehicle, and an unmanned aerial vehicle.

  20. MESS (Multi-purpose Exoplanet Simulation System): A Monte Carlo tool for the statistical analysis and prediction of exoplanets search results

    CERN Document Server

    Bonavita, M; Desidera, S; Gratton, R; Janson, M; Beuzit, J L; Kasper, M; Mordasini, C

    2011-01-01

    The high number of planet discoveries made in recent years provides a good sample for statistical analysis, leading to some clues about the distributions of planet parameters, like masses and periods, at least in close proximity to the host star. We likely need to wait for the extremely large telescopes (ELTs) to have an overall view of extrasolar planetary systems. In this context it would be useful to have a tool that can be used for the interpretation of present results, and also to predict the outcomes of future instruments. For this reason we built MESS: a Monte Carlo simulation code which uses either the results of the statistical analysis of the properties of discovered planets, or the results of planet formation theories, to build synthetic planet populations fully described in terms of frequency, orbital elements and physical properties. They can then be used either to test the consistency of their properties with the observed population of planets given different detection...
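
    A stripped-down sketch in the spirit of MESS follows: it draws a synthetic planet population from assumed log-uniform mass and period distributions and counts how many pass a crude radial-velocity detectability proxy. The distributions, the proxy and the threshold are all invented for illustration.

    ```python
    # Toy synthetic planet population with a crude detectability cut.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    mass = 10 ** rng.uniform(-0.5, 1.5, n)     # planet mass, Jupiter masses
    period = 10 ** rng.uniform(0.0, 4.0, n)    # orbital period, days

    # Radial-velocity proxy: semi-amplitude scales like M / P^(1/3)
    k_proxy = mass / period ** (1.0 / 3.0)
    detectable = k_proxy > 0.5

    print(f"detectable fraction: {detectable.mean():.1%}")
    ```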

  1. Decision Making Support in Wastewater Management: Comparative analysis of techniques and tools used in centralized and decentralized system layouts UDK 628.2

    Directory of Open Access Journals (Sweden)

    Harmony Musiyarira

    2012-02-01

    Wastewater management has been seen primarily as a technical and economic issue, but it is now recognised that these are only some of the elements in an array of factors that affect the sustainability of wastewater systems. Literature studies point out that municipal authorities have a general and long-standing tradition of using indicators for monitoring performance, reviewing progress and reporting the state of the environment as part of regulatory compliance. However, they have neglected other critical uses of these indicators, such as their input into the planning and decision-making process. This research advocates the use of sustainability indicators in a context-based planning approach and the utilisation of Multi-Criteria Decision Aid (MCDA) in a two-step approach for comparative analysis and assessment of the sustainability of wastewater systems. The overall objective was to develop a methodology for wastewater system selection and to produce a practical planning tool to aid decision making by municipalities. Another objective was to provide recommendations for improving wastewater and sanitation management in the case study area. The methodology consisted of a comprehensive literature review, case study analysis, a review of the Decision Support Systems (DSS) in use, and the development of a DSS for Gauteng Province. The full spectrum of viable wastewater and sanitation options was incorporated into the DSS. From the sustainability assessments carried out using multi-criteria decision analysis, one result showed that varying degrees of sustainability are obtainable with each treatment technology involved, and decentralised technologies appear more sustainable. Based on the local context and the indicators used in this research, the DSS results suggest that land treatment systems, stabilisation ponds and ecological treatment methods are more sustainable. One major finding from the literature is that no technology is...
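
    The core of a weighted-sum MCDA scoring step can be sketched in a few lines; the options, criteria, weights and scores below are invented examples, not values from the study.

    ```python
    # Weighted-sum MCDA: score each option against weighted indicators.
    criteria_weights = {"cost": 0.3, "energy": 0.2, "water_reuse": 0.3, "land_use": 0.2}

    # Normalized indicator scores in [0, 1], higher = more sustainable
    options = {
        "centralized plant":    {"cost": 0.6, "energy": 0.4, "water_reuse": 0.5, "land_use": 0.9},
        "stabilisation ponds":  {"cost": 0.9, "energy": 0.9, "water_reuse": 0.6, "land_use": 0.3},
        "ecological treatment": {"cost": 0.7, "energy": 0.8, "water_reuse": 0.8, "land_use": 0.4},
    }

    for name, scores in options.items():
        total = sum(criteria_weights[c] * scores[c] for c in criteria_weights)
        print(f"{name:22s} score = {total:.2f}")
    ```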

  2. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
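
    The heart of such a tool is the comparison of measured against expected performance with a pass/fail tolerance; a minimal sketch follows, with invented air-handling-unit test points and tolerances.

    ```python
    # Flag functional-test measurements that deviate beyond a tolerance.
    def check_test_point(name, measured, expected, tolerance):
        """Return (passed, message) for one functional test measurement."""
        deviation = measured - expected
        passed = abs(deviation) <= tolerance
        verdict = "PASS" if passed else "FAIL"
        return passed, f"{verdict} {name}: measured={measured}, expected={expected}, dev={deviation:+.1f}"

    # Air-handling-unit test points, degrees C (illustrative values)
    tests = [
        ("supply air temp (cooling)", 13.8, 13.0, 1.0),
        ("mixed air temp",            22.5, 18.0, 1.5),  # likely a damper fault
    ]
    for name, measured, expected, tol in tests:
        _, msg = check_test_point(name, measured, expected, tol)
        print(msg)
    ```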

  3. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that, the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well-theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  4. Ethics Auditing and Conflict Analysis as Management Tools

    OpenAIRE

    2008-01-01

    This paper deals with management tools like conflict analysis and ethics auditing. Ethics auditing is understood as an opportunity and agreement to devise a system to inform on ethical corporate behaviour. This system essentially aims to increase the transparency and credibility of a company's commitment to ethics. At the same time, the process of elaborating this system allows us to introduce the moral dimension into the company's actions and decisions, thereby completing a key dimension of ...

  5. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    Science.gov (United States)

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  6. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessment tends to cover potential candidate software and subsequently c...

  7. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: - evaluation of significant risk groups associated with logistics process implementation, - composition of integrated strategies of risk management, - composition of tools for risk analysis in logistics processes.

  8. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  9. DNA – A General Energy System Simulation Tool

    DEFF Research Database (Denmark)

    Elmegaard, Brian; Houbak, Niels

    2005-01-01

    The paper reviews the development of the energy system simulation tool DNA (Dynamic Network Analysis). DNA has been developed since 1989 to handle models of any kind of energy system based on the control volume approach, usually systems of lumped-parameter components. DNA has proven to be a useful tool in the analysis and optimization of several types of thermal systems: steam turbines, gas turbines, fuel cells, gasification, refrigeration and heat pumps, for both conventional fossil fuels and different types of biomass. DNA is applicable to models of both steady-state and dynamic operation. The program decides at runtime to apply the DAE solver if the system contains differential equations, which makes it easy to extend an existing steady-state model to simulate dynamic operation of the plant. The use of the program is illustrated by examples of gas turbine models. The paper also...
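
    The control-volume approach amounts to assembling component mass and energy balances into one equation system and solving them simultaneously; the sketch below does this for a single adiabatic mixer using scipy, as a stand-in for DNA's own component models and input language.

    ```python
    # A component as a set of residual equations (mass and energy balances)
    # solved simultaneously, in the control-volume style DNA is built on.
    from scipy.optimize import fsolve

    # Two inlet streams mix adiabatically: unknowns are outlet flow and enthalpy
    m1, h1 = 2.0, 100.0   # kg/s, kJ/kg
    m2, h2 = 3.0, 300.0

    def residuals(x):
        m3, h3 = x
        mass_balance = m1 + m2 - m3
        energy_balance = m1 * h1 + m2 * h2 - m3 * h3
        return [mass_balance, energy_balance]

    m3, h3 = fsolve(residuals, x0=[1.0, 100.0])
    print(f"outlet: {m3:.1f} kg/s at {h3:.1f} kJ/kg")  # 5.0 kg/s, 220.0 kJ/kg
    ```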

  10. A Collaborative Analysis Tool for Integrated Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Science.gov (United States)

    Stanley, Thomas Troy; Alexander, Reginald; Landrum, Brian

    2000-01-01

    ...the process may be repeated, altering the trajectory or some other input to reduce the TPS mass. E-PSURBCC is an "engine performance" model: it requires the specification of inlet air static temperature and pressure as well as Mach number (which it pulls from the HYFIM and POST trajectory files) and calculates the corresponding stagnation properties. The engine air flow path geometry includes an inlet, a constant-area section where the rocket is positioned, a subsonic diffuser, a constant-area afterburner, and either a converging nozzle or a converging-diverging nozzle. The current capabilities of the E-PSURBCC ejector and ramjet mode treatment indicate that various complex flow phenomena, including multiple choking and internal shocks, can occur for combinations of geometry/flow conditions. For a given input deck defining geometry/flow conditions, the program first goes through a series of checks to establish whether the input parameters are sound in terms of a solution path. If the vehicle/engine performance fails mission goals, the engineer is able to collaboratively alter the vehicle moldline to change the aerodynamics, the trajectory, or some other input to achieve orbit. The problem described is an example of the need for collaborative design and analysis. RECIPE is a cross-platform application capable of hosting a number of engineers and designers across the Internet for distributed and collaborative engineering environments. Such integrated system design environments allow collaborative design teams to perform individual or reduced-team studies. To facilitate the larger number of potential runs that may need to be made, RECIPE connects the computer codes that calculate the trajectory data, aerodynamic data based on vehicle geometry, heat rate data, TPS masses, and vehicle and engine performance, so that the output from each tool is easily transferred to the model input files that need it.

  11. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  12. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  13. RSAT 2015: Regulatory Sequence Analysis Tools.

    Science.gov (United States)

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  14. Spatial Analysis in Educational Administration: Exploring the Role of G.I.S. (Geographical Information Systems) as an Evaluative Tool in the Public School Board Setting.

    Science.gov (United States)

    Brown, Robert S.; Baird, William; Rosolen, Lisa

    In January 1998, seven school boards amalgamated to form the Toronto District School Board, a board responsible for 600 schools. To deal with the complexities of the new entity, researchers have been using geographical information systems (GIS). GIS are computer-based tools for mapping. They store information as a collection of thematic layers or…

  15. Anvil Tool in the Advanced Weather Interactive Processing System

    Science.gov (United States)

    Barrett, Joe, III; Bauman, William, III; Keen, Jeremy

    2007-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. In order for the Anvil Tool to remain available to the meteorologists, the AMU was tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). This report describes the work done by the AMU to develop the Anvil Tool for AWIPS to create a graphical overlay depicting the threat from thunderstorm anvil clouds. The AWIPS Anvil Tool is based on the previously deployed AMU MIDDS Anvil Tool. SMG and 45 WS forecasters have used the MIDDS Anvil Tool during launch and landing operations. SMG's primary weather analysis and display system is now AWIPS, and the 45 WS has plans to replace MIDDS with AWIPS. The Anvil Tool creates a graphic that users can overlay on satellite or radar imagery to depict the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on an average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 nm standoff circles centered at the location of interest, in addition to one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30 degree sector width, based on a previous AMU study which determined that thunderstorm anvils move in a direction within plus or minus 15 degrees of the upper-level (300- to 150-mb) wind direction. This report briefly describes the history of the MIDDS Anvil Tool and then explains how the initial development of the AWIPS Anvil Tool was carried out. After testing was...

  16. Waste flow analysis and life cycle assessment of integrated waste management systems as planning tools: Application to optimise the system of the City of Bologna.

    Science.gov (United States)

    Tunesi, Simonetta; Baroni, Sergio; Boarini, Sandro

    2016-09-01

    The results of this case study are used to argue that waste management planning should follow a detailed process, adequately confronting the complexity of waste management problems and the specificity of each urban area and of regional/national situations. To support the development or completion of integrated waste management systems, this article proposes a planning method based on: (1) the detailed analysis of waste flows and (2) the application of a life cycle assessment to compare alternative scenarios and optimise solutions. The evolution of the City of Bologna waste management system is used to show how this approach can be applied to assess which elements improve environmental performance. The assessment of the contribution of each waste management phase in the Bologna integrated waste management system has proven that the changes applied from 2013 to 2017 result in a significant improvement of the environmental performance, mainly as a consequence of the optimised integration between materials and energy recovery: Global Warming Potential at 100 years (GWP100) diminishes from 21,949 to -11,169 t CO2-eq y^-1, and abiotic resource depletion from -403 to -520 t antimony-eq y^-1. This study analyses the collection phase in great detail. The outcomes provide specific operational recommendations to policy makers, showing: (a) the relevance of the choice of the materials forming the bags for 'door to door' collection (22 kg CO2-eq per tonne of waste for non-recycled low-density polyethylene bags); (b) the relatively low environmental impacts associated with underground tanks (3.9 kg CO2-eq per tonne of waste); (c) the relatively low impact of big street containers compared with plastic bags (2.6 kg CO2-eq per tonne of waste).

  17. Performance Analysis of Anti-Phishing Tools and Study of Classification Data Mining Algorithms for a Novel Anti-Phishing System

    Directory of Open Access Journals (Sweden)

    Rajendra Gupta

    2015-11-01

    Full Text Available Phishing is a kind of website spoofing used to steal sensitive and important information from web users, such as online banking passwords and credit card information. In a phishing attack, the attacker generates warning messages to the user about supposed security issues, asks for confidential information through phishing emails, or asks the user to update account information. Several experimental designs have been proposed earlier to counter phishing attacks. The earlier systems do not give more than 90 percent success rates; in some cases, a tool gives only a 50-60 percent success rate. In this paper, a novel algorithm is developed to check the performance of the anti-phishing system, and the resulting data set is compared with the data sets of existing anti-phishing tools. The performance of the novel anti-phishing system is evaluated with four different classification data mining algorithms: Class Imbalance Problem (CIP), Rule-based Classifier (Sequential Covering Algorithm, SCA), Nearest Neighbour Classification (NNC), and Bayesian Classifier (BC), on a data set of phishing and legitimate websites. The proposed system shows a lower error rate and better performance compared to the existing tools.
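
    As an illustration of the kind of classifier comparison described above, the sketch below trains a nearest-neighbour classifier on synthetic URL features. The feature names and data are invented for the example and are not the paper's data set; scikit-learn's KNeighborsClassifier stands in for the NNC algorithm.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Hypothetical URL features: [url_length, num_dots, has_ip_address, uses_https]
X_legit = np.column_stack([rng.normal(40, 10, 200), rng.integers(1, 4, 200),
                           np.zeros(200), np.ones(200)])
X_phish = np.column_stack([rng.normal(75, 15, 200), rng.integers(3, 8, 200),
                           rng.integers(0, 2, 200), rng.integers(0, 2, 200)])
X = np.vstack([X_legit, X_phish])
y = np.array([0] * 200 + [1] * 200)          # 0 = legitimate, 1 = phishing

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print(f"error rate: {1 - clf.score(X_te, y_te):.3f}")
```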

  18. Comparative guide to emerging diagnostic tools for large commercial HVAC systems

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, Hannah; Piette, Mary Ann

    2001-05-01

    This guide compares emerging diagnostic software tools that aid detection and diagnosis of operational problems for large HVAC systems. We have evaluated six tools for use with energy management control system (EMCS) or other monitoring data. The diagnostic tools summarize relevant performance metrics, display plots for manual analysis, and perform automated diagnostic procedures. Our comparative analysis presents nine summary tables with supporting explanatory text and includes sample diagnostic screens for each tool.

  19. 3rd Annual Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Face-to-Face Meeting Report December 2013

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-02-21

    The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access and analysis to observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.

  20. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical performance and its ability to directly access the real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end user interface was implemented because of Excel's convenient portability among a wide variety of potential users and its ability to host a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
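
    For a sense of the underlying physics, the sketch below estimates steady-state heat leak through a cylindrical insulation layer with the textbook radial-conduction formula Q = 2*pi*k*L*(T_hot - T_cold)/ln(r_outer/r_inner). This is not PFSAT's model (which also accounts for supports, penetrations, and instrumentation); all numbers are illustrative.

```python
import math

def heat_leak_per_line(k, length, r_inner, r_outer, t_hot, t_cold):
    """Steady radial conduction through a cylindrical insulation layer [W]."""
    return 2.0 * math.pi * k * length * (t_hot - t_cold) / math.log(r_outer / r_inner)

# Illustrative numbers: 5 m line, 25 mm OD, 25 mm foam, foam k ~ 0.02 W/m-K,
# warm boundary at 300 K, liquid-oxygen-like temperature of 90 K.
q = heat_leak_per_line(k=0.02, length=5.0, r_inner=0.0125, r_outer=0.0375,
                       t_hot=300.0, t_cold=90.0)
print(f"heat leak ~ {q:.1f} W")
```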

  1. Tool, weapon, or white elephant? A realist analysis of the five phases of a twenty-year programme of occupational health information system implementation in the health sector

    Directory of Open Access Journals (Sweden)

    Spiegel Jerry M

    2012-08-01

    Full Text Available Abstract Background Although information systems (IS) have been extensively applied in the health sector worldwide, few initiatives have addressed the health and safety of health workers, a group acknowledged to be at high risk of injury and illness, as well as in great shortage globally, particularly in low- and middle-income countries. Methods Adapting a context-mechanism-outcome case study design, we analyze our team's own experience over two decades to address this gap, in two different Canadian provinces and in two distinct South African settings. Applying a realist analysis within an adapted structuration theory framing sensitive to power relations, we explore contextual (socio-political and technological) characteristics and mechanisms affecting outcomes at micro, meso and macro levels. Results Technological limitations hindered IS usefulness in the initial Canadian locale, while staffing inadequacies amid pronounced power imbalances affecting governance restricted IS usefulness in the subsequent Canadian application. Implementation in South Africa highlighted the special care needed to address power dynamics regarding both worker-employer relations (relevant to all occupational health settings) and North-South imbalances (common to all international interactions). Researchers, managers and front-line workers all view IS implementation differently; relationships amongst the workplace parties and between community and academic partners have been pivotal in determining outcome in all circumstances. Capacity building and applying creative commons and open source solutions are showing promise, as is international collaboration. Conclusions There is worldwide consensus on the need for IS use to protect the health workforce. However, IS implementation is a resource-intensive undertaking; regardless of how carefully designed the software, contextual factors and the mechanisms adopted to address these are critical to mitigate threats and achieve

  2. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services that allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte Carlo simulations (MC). Several tools are considered: ROOT files filter, JavaScript Multivariable Cross-Filter, JavaScript ROOT browser and JavaScript Scatter-Matrix Libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.
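
    A cut-based pre-analysis is essentially a sequence of boolean selections over event variables. The sketch below shows the idea in Python/NumPy rather than the project's JavaScript stack; the variable names and cut values are illustrative only.

```python
import numpy as np

# Hypothetical event table: transverse momentum (pt), pseudorapidity (eta),
# and an isolation variable, stored as a structured array.
events = np.array([(45.2, 0.8, 0.05), (12.1, 2.9, 0.30), (63.7, -1.1, 0.02)],
                  dtype=[("pt", "f8"), ("eta", "f8"), ("iso", "f8")])

# Cuts: pt > 20, |eta| < 2.5, iso < 0.1 -- values are illustrative only.
mask = (events["pt"] > 20.0) & (np.abs(events["eta"]) < 2.5) & (events["iso"] < 0.1)
selected = events[mask]
print(f"{len(selected)} of {len(events)} events pass the cuts")
```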

  3. SABRE: A Tool for Stochastic Analysis of Biochemical Reaction Networks

    CERN Document Server

    Didier, Frederic; Mateescu, Maria; Wolf, Verena

    2010-01-01

    The importance of stochasticity within biological systems has been shown repeatedly in recent years and has raised the need for efficient stochastic tools. We present SABRE, a tool for stochastic analysis of biochemical reaction networks. SABRE implements fast adaptive uniformization (FAU), a direct numerical approximation algorithm for computing transient solutions of biochemical reaction networks. Biochemical reaction networks represent biological systems studied at a molecular level, and their reactions can be modeled as transitions of a Markov chain. SABRE accepts as input the formalism of guarded commands, which it interprets either as continuous-time or as discrete-time Markov chains. Besides operating in a stochastic mode, SABRE may also perform a deterministic analysis by directly computing a mean-field approximation of the system under study. We illustrate the different functionalities of SABRE by means of biological case studies.
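
    The sketch below implements plain (non-adaptive) uniformization for the transient solution of a small continuous-time Markov chain; SABRE's fast adaptive uniformization adds on-the-fly state-space truncation on top of this basic scheme. The two-state generator is a toy example.

```python
import numpy as np
from math import exp

def uniformization(Q, p0, t, eps=1e-10):
    """Transient distribution of a CTMC by (non-adaptive) uniformization.

    Q: generator matrix; p0: initial row distribution; t: time horizon.
    """
    lam = max(-Q.diagonal())                # uniformization rate
    P = np.eye(len(p0)) + Q / lam           # embedded DTMC transition matrix
    term = p0.copy()
    poisson = exp(-lam * t)                 # Poisson weight for k = 0
    result = poisson * term
    k, acc = 0, poisson
    while 1.0 - acc > eps:                  # accumulate until mass ~ 1
        k += 1
        term = term @ P
        poisson *= lam * t / k
        acc += poisson
        result = result + poisson * term
    return result

# Toy two-state system switching on/off at rates 1.0 and 0.5 (illustrative).
Q = np.array([[-1.0, 1.0], [0.5, -0.5]])
print(uniformization(Q, np.array([1.0, 0.0]), t=2.0))
```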

  4. Software Tools to Support the Assessment of System Health

    Science.gov (United States)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints, and it identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of

  5. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

    SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high performance computing. A performance tool, which usually collects and analyzes performance data, is an effective means of improving performance. This paper explores both the collection and the analysis of performance data, and two innovative ideas are proposed: both types of runtime performance data, concerning system load and application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; and the performance analysis should be Web-oriented, to exploit the excellent portability and usability brought by the Internet. This paper presents a Web-oriented HPC (high performance computing) performance tool, which can collect information about both resource utilization, including the utilization ratio of CPU and memory, and program behavior during runtime, including statuses such as sending and computing, and visualize the information in the user's browser window with Java applets in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependency between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program progress and system load, find the performance bottleneck, and ultimately improve the performance of the application.

  6. Information Systems as Dialectic, Tool - Mediated Activity

    Directory of Open Access Journals (Sweden)

    Helen Hasan

    2002-11-01

    Full Text Available Information Systems (IS) draws its significance from the uniqueness of computer-based information and communications tools and their place in shaping recent human history. Advances in the field come from a better understanding of how to develop and use these tools and what impact they have on the way we work and live. As IS is still an evolving field of study, two views, the objective and the subjective, are in constant tension and, though these may be considered complementary, it is rare that they come together as a unified whole. A more balanced and integrated foundation for IS may be found in the subject-object dialectic arising out of the German philosophical tradition. An extension of this approach from Cultural-Historical Activity Theory is presented in this paper. This theory views all human endeavour as a purposeful, dynamic, dialectic relationship between subject and object, mediated by tools, such as technology and information, and by the social environment or community. An adaptation of this holistic theory, to incorporate the best of other theoretical approaches used by IS researchers, could span and integrate the breadth of the IS field, providing it with unity and identity.

  7. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  8. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
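
    The recommendation step described above can be pictured as a lookup into a curated failure-mode library keyed on the user's pick-list selections. The sketch below is a minimal illustration; the library entries and keys are invented, not drawn from any NASA library.

```python
# Hypothetical failure-mode library keyed on (component type, function).
FAILURE_LIBRARY = {
    ("valve", "control flow"): [
        ("fails closed", "stuck actuator", "loss of downstream flow"),
        ("external leakage", "seal degradation", "loss of fluid inventory"),
    ],
    ("sensor", "measure pressure"): [
        ("erroneous output", "calibration drift", "incorrect control action"),
    ],
}

def recommend_failure_modes(component_type, function):
    """Return (mode, cause, effect) recommendations for the user's selections."""
    return FAILURE_LIBRARY.get((component_type, function), [])

for mode, cause, effect in recommend_failure_modes("valve", "control flow"):
    print(f"mode: {mode:18s} cause: {cause:18s} effect: {effect}")
```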

  9. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation of the power tools market, both in Russia and in the world. It presents a comparative analysis of competitors, an analysis of the structure of the power tools market, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by companies selling tools, including a competitive analysis of the product range of Bosch, the leader in its segment of the power tools available on the Russian market.

  10. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis, which can be implemented mainly in small and medium enterprises which are not yet convinced of the value of setup improvement. The methodology was developed after research which identified the problem: companies still have difficulties with long setup times, and many of them do nothing to decrease them. A long setup alone is not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to carry out analyses of changeovers in order to uncover problems. The proposed methodology can encourage the management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; the goal is to convince the management to begin actions concerning setup improvement. The last three steps are related to a particular setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others, were used.

  11. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Directory of Open Access Journals (Sweden)

    Mihaela-Laura IVAN

    2016-05-01

    Full Text Available The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population-growth studies and many other aspects. The Business Intelligence solutions applied in the current analysis demonstrate the benefits brought by the new tools, such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. The birth rate is analysed in detail, together with the contribution of different factors worldwide.

  12. Statistical spectroscopic tools for biomarker discovery and systems medicine.

    Science.gov (United States)

    Robinette, Steven L; Lindon, John C; Nicholson, Jeremy K

    2013-06-04

    Metabolic profiling based on comparative, statistical analysis of NMR spectroscopic and mass spectrometric data from complex biological samples has contributed to increased understanding of the role of small molecules in affecting and indicating biological processes. To enable this research, the development of statistical spectroscopy has been marked by early beginnings in applying pattern recognition to nuclear magnetic resonance data and the introduction of statistical total correlation spectroscopy (STOCSY) as a tool for biomarker identification in the past decade. Extensions of statistical spectroscopy now compose a family of related tools used for compound identification, data preprocessing, and metabolic pathway analysis. In this Perspective, we review the theory and current state of research in statistical spectroscopy and discuss the growing applications of these tools to medicine and systems biology. We also provide perspectives on how recent institutional initiatives are providing new platforms for the development and application of statistical spectroscopy tools and driving the development of integrated "systems medicine" approaches in which clinical decision making is supported by statistical and computational analysis of metabolic, phenotypic, and physiological data.
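
    At its core, STOCSY correlates the intensity of one spectral variable (a "driver" peak) with every other variable across a set of spectra, so that peaks from the same molecule light up with high correlation. A minimal sketch, using synthetic data:

```python
import numpy as np

def stocsy(X, driver_idx):
    """Pearson correlation of one spectral variable (the 'driver' peak) with
    every variable across samples, as in statistical total correlation
    spectroscopy.

    X: (n_samples, n_variables) matrix of spectral intensities.
    """
    Xc = X - X.mean(axis=0)
    driver = Xc[:, driver_idx]
    cov = Xc.T @ driver / (len(X) - 1)
    return cov / (Xc.std(axis=0, ddof=1) * driver.std(ddof=1))

# Toy data: 20 'spectra' of 100 variables with two peaks from the same molecule.
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 100))
X[:, 60] = X[:, 10] * 0.9 + rng.normal(scale=0.1, size=20)
r = stocsy(X, driver_idx=10)
print(f"correlation of variable 60 with the driver: {r[60]:.2f}")
```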

  13. Bio-TDS: bioscience query tool discovery system.

    Science.gov (United States)

    Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M

    2017-01-04

    Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12,000 analytic tool descriptions integrated from well-established community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process.
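
    A minimal sketch of free-text tool retrieval, using TF-IDF and cosine similarity as a simple stand-in for Bio-TDS's ontology and NLP workflow; the tool descriptions and query are invented for the example.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical tool descriptions standing in for the integrated repositories.
tools = {
    "bwa":      "fast alignment of short genomic sequencing reads to a reference",
    "samtools": "manipulate and index alignments stored in SAM and BAM files",
    "ImageJ":   "interactive processing and analysis of biological images",
}

query = ["align short reads to a reference genome"]
vec = TfidfVectorizer(stop_words="english")
doc_matrix = vec.fit_transform(tools.values())
scores = cosine_similarity(vec.transform(query), doc_matrix).ravel()
for name, score in sorted(zip(tools, scores), key=lambda p: -p[1]):
    print(f"{score:.2f}  {name}")
```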

  14. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer-controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  15. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    Science.gov (United States)

    Clough, D.; Fletcher, S.; Longstaff, A. P.; Willoughby, P.

    2012-05-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources, and factors affecting asset availability. Continuous usage of a machine tool during production causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low-thermal-expansion materials, and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high-value parts, for manufacturers to obtain a thermal profile of their machine to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting the practical implementation of condition monitoring of thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
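
    The link between easily measured temperatures and hard-to-measure displacement errors is often captured with a simple regression model fitted to data from a thermal test cycle. A minimal sketch with invented numbers (real models typically use more sensors and richer model forms):

```python
import numpy as np

# Hypothetical training data: spindle/ambient temperature rises [deg C] and the
# measured axial displacement error [um] recorded during a thermal test cycle.
temps = np.array([[0.0, 0.0], [4.1, 1.2], [8.3, 2.0], [12.6, 3.1], [15.9, 4.0]])
error_um = np.array([0.0, 6.2, 12.9, 19.5, 24.8])

# Least-squares fit: error ~ c0 + c1*T_spindle + c2*T_ambient.
A = np.column_stack([np.ones(len(temps)), temps])
coeffs, *_ = np.linalg.lstsq(A, error_um, rcond=None)

# Once fitted, easily measured temperatures predict the hard-to-measure error.
t_now = np.array([1.0, 10.0, 2.5])     # [1, T_spindle, T_ambient]
print(f"predicted thermal error: {t_now @ coeffs:.1f} um")
```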

  16. Healthcare BI: a tool for meaningful analysis.

    Science.gov (United States)

    Rohloff, Rose

    2011-05-01

    Implementing an effective business intelligence (BI) system requires organization-wide preparation and education to allow for meaningful analysis of information. Hospital executives should take steps to ensure that: staff entering data are proficient in how the data are to be used for decision making, and integration is based on clean data from primary sources of entry; managers have the business acumen required for effective data analysis; and decision makers understand how multidimensional BI offers new ways of analysis that represent significant improvements over historical approaches using static reporting.

  17. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today's markets. The authors address traditional machining topics, such as: single and multiple point cutting processes; grinding; components accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  18. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  19. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
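
    Maximum-value compositing is one of the simplest temporal processing techniques of this kind: since clouds depress vegetation-index values, taking the per-pixel maximum over a period approximates a clear-sky greenness map. A toy sketch with synthetic data (the real tool works on calibrated MODIS bands and quality metadata):

```python
import numpy as np

# Hypothetical 16-day stacks of daily 2x2 red and near-infrared reflectances;
# NaN marks pixels flagged as cloudy in the (simulated) quality metadata.
rng = np.random.default_rng(2)
red = rng.uniform(0.05, 0.15, size=(16, 2, 2))
nir = rng.uniform(0.30, 0.50, size=(16, 2, 2))
cloudy = rng.random(red.shape) < 0.4
red[cloudy] = np.nan
nir[cloudy] = np.nan

# Normalized Difference Vegetation Index for every scene in the stack.
ndvi = (nir - red) / (nir + red)

# Maximum-value compositing: clouds depress NDVI, so the per-pixel maximum
# over the period approximates clear-sky vegetation greenness.
composite = np.nanmax(ndvi, axis=0)
print(composite)
```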

  20. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened the way for implementing new features in existing tools and also for developing new tools for specific applications, like thermodynamic and economic optimization, prediction of remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability, and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts generated by the heat and mass balance programs can be accomplished by using pinch programs, exergy analysis, and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author
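
    Part-load calculation ultimately means solving a small nonlinear system of balance and component-characteristic equations, which local linearization (Newton-Raphson on a locally linearized model) handles well. The sketch below solves a toy two-equation balance; the component characteristics are invented for illustration and are not the thesis's models.

```python
import numpy as np

def newton_solve(residual, x0, tol=1e-8, max_iter=50):
    """Newton-Raphson with a finite-difference Jacobian (local linearization)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            return x
        J = np.empty((len(x), len(x)))
        h = 1e-6
        for j in range(len(x)):            # finite-difference Jacobian column
            xp = x.copy()
            xp[j] += h
            J[:, j] = (residual(xp) - r) / h
        x = x - np.linalg.solve(J, r)
    raise RuntimeError("did not converge")

# Toy part-load balance: find mass flow m and pressure ratio pr such that an
# (illustrative) compressor map and turbine swallowing line both hold.
def residual(v):
    m, pr = v
    return np.array([pr - (1.0 + 0.08 * m - 0.002 * m**2),
                     m - 5.0 * np.sqrt(max(pr - 1.0, 1e-9))])

print(newton_solve(residual, x0=[10.0, 1.5]))
```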

  1. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  2. ANN Based Tool Condition Monitoring System for CNC Milling Machines

    Directory of Open Access Journals (Sweden)

    Mota-Valtierra G.C.

    2011-10-01

    Full Text Available Most companies aim to manufacture high-quality products, which is possible by optimizing costs and by reducing and controlling the variations in their production processes. Within manufacturing industries, a very important issue is tool condition monitoring, since the tool state determines the quality of products. Besides, a good monitoring system will protect the machinery from severe damage. For determining the state of the cutting tools in a milling machine, there is a great variety of systems on the industrial market; however, these systems are not available to all companies because of their high costs and the requirement of modifying the machine tool in order to attach the system sensors. This paper presents an intelligent classification system which determines the status of cutters in a Computer Numerical Control (CNC) milling machine. The tool state is mainly detected through the analysis of the cutting forces drawn from the spindle motor currents. This monitoring system does not need additional sensors, so it is not necessary to modify the machine. The classification is made by advanced digital signal processing techniques: just after acquiring a signal, a FIR digital filter is applied to the data to eliminate undesired noise components and to extract the embedded force components; a wavelet transformation is applied to the filtered signal in order to compress the data and to optimize the classifier structure; then a multilayer perceptron-type neural network carries out the classification of the signal. Achieving a reliability of 95%, the system is capable of detecting breakage and worn cutters.
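
    The processing chain described above (filter, compress, classify) can be sketched compactly. Below, a moving-average FIR filter and a one-level Haar approximation stand in for the paper's filter and wavelet stages, and scikit-learn's MLPClassifier plays the multilayer perceptron; the spindle-current signals are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def features(current, taps=8):
    """FIR low-pass then one-level Haar approximation: a stand-in for the
    filter + wavelet compression stages described above."""
    fir = np.convolve(current, np.ones(taps) / taps, mode="valid")  # denoise
    fir = fir[: len(fir) // 2 * 2]                                  # even length
    return (fir[0::2] + fir[1::2]) / np.sqrt(2)                     # Haar approx

# Synthetic spindle-current windows: worn cutters draw larger, noisier current.
rng = np.random.default_rng(3)
sharp = [np.sin(np.linspace(0, 20, 256)) + rng.normal(0, 0.1, 256)
         for _ in range(40)]
worn = [1.5 * np.sin(np.linspace(0, 20, 256)) + rng.normal(0, 0.4, 256)
        for _ in range(40)]
X = np.array([features(w) for w in sharp + worn])
y = np.array([0] * 40 + [1] * 40)               # 0 = sharp, 1 = worn

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```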

  3. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    Science.gov (United States)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large-aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing, with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  4. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
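
    For orientation, solar-cell degradation versus particle fluence is often summarized with a semi-empirical curve of the form P/P0 = 1 - C*log10(1 + phi/phi_x) fitted to ground-test data. Note this is the classical equivalent-fluence formulation, not the displacement damage approach SAVANT adopts; the constants below are invented for illustration.

```python
import math

def remaining_power_fraction(fluence, c=0.21, phi_x=5e13):
    """Semi-empirical degradation curve P/P0 = 1 - C*log10(1 + phi/phi_x).

    c and phi_x are illustrative fit constants for a notional cell; real
    values come from ground-test data for the specific cell design.
    """
    return 1.0 - c * math.log10(1.0 + fluence / phi_x)

# 1-MeV-electron-equivalent fluences for a hypothetical mission duration.
for fluence in (1e13, 1e14, 1e15):
    print(f"phi = {fluence:.0e} e/cm^2 -> P/P0 = "
          f"{remaining_power_fraction(fluence):.3f}")
```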

  5. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  6. Systems biology: A tool for charting the antiviral landscape.

    Science.gov (United States)

    Bowen, James R; Ferris, Martin T; Suthar, Mehul S

    2016-06-15

    The host antiviral programs that are initiated following viral infection form a dynamic and complex web of responses that we have collectively termed as "the antiviral landscape". Conventional approaches to studying antiviral responses have primarily used reductionist systems to assess the function of a single or a limited subset of molecules. Systems biology is a holistic approach that considers the entire system as a whole, rather than individual components or molecules. Systems biology based approaches facilitate an unbiased and comprehensive analysis of the antiviral landscape, while allowing for the discovery of emergent properties that are missed by conventional approaches. The antiviral landscape can be viewed as a hierarchy of complexity, beginning at the whole organism level and progressing downward to isolated tissues, populations of cells, and single cells. In this review, we will discuss how systems biology has been applied to better understand the antiviral landscape at each of these layers. At the organismal level, the Collaborative Cross is an invaluable genetic resource for assessing how genetic diversity influences the antiviral response. Whole tissue and isolated bulk cell transcriptomics serves as a critical tool for the comprehensive analysis of antiviral responses at both the tissue and cellular levels of complexity. Finally, new techniques in single cell analysis are emerging tools that will revolutionize our understanding of how individual cells within a bulk infected cell population contribute to the overall antiviral landscape.

  7. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture runtime abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain-specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
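
    Conceptually, a rule in such an engine pairs a message pattern with a triggering condition. A minimal Python sketch (the rule set, patterns, and log lines are invented; the actual Message Analyzer uses its own domain-specific language and C++ plugins):

```python
import re

# Illustrative rule set: each rule pairs a regex with a triggering predicate.
RULES = [
    {"name": "buffer backlog",
     "pattern": re.compile(r"buffer occupancy (\d+)%"),
     "trigger": lambda m: int(m.group(1)) > 90},
    {"name": "node timeout",
     "pattern": re.compile(r"timeout from node (\w+)"),
     "trigger": lambda m: True},
]

def analyze(messages):
    """Match each log line against the rules; yield identified conditions."""
    for line in messages:
        for rule in RULES:
            m = rule["pattern"].search(line)
            if m and rule["trigger"](m):
                yield rule["name"], line

log = ["INFO buffer occupancy 95% on buffernode-03",
       "WARN timeout from node dcm-12 after 3 retries"]
for name, line in analyze(log):
    print(f"[{name}] {line}")
```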

  8. 3D-Aided-Analysis Tool for Lunar Rover

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; LI Guo-peng; REN Xin; LIU Jian-jun; GAO Xing-ye; ZOU Xiao-duan

    2013-01-01

    The 3D-Aided-Analysis Tool (3DAAT), a virtual reality system, is built up in this paper. 3DAAT is integrated with kinematics and dynamics models of the rover as well as a real lunar surface terrain model. The methods of modeling proposed in this paper include constructing the lunar surface, constructing 3D models of the lander and rover, and building up a kinematic model of the rover body. Photogrammetry techniques and remote sensing information are used to generate the terrain model of the lunar surface. According to the implementation results, 3DAAT is an effective assistance system for making exploration plans and analyzing the status of the rover.

  9. T4SP Database 2.0: An Improved Database for Type IV Secretion Systems in Bacterial Genomes with New Online Analysis Tools

    Science.gov (United States)

    Han, Na; Yu, Weiwen; Qiang, Yujun

    2016-01-01

    Type IV secretion systems (T4SS) can mediate the passage of macromolecules across cellular membranes and are essential for virulence and genetic material exchange among bacterial species. The Type IV Secretion Project 2.0 (T4SP 2.0) database is an improved and extended version of the platform released in 2013, aimed at assisting with the detection of Type IV secretion systems (T4SS) in bacterial genomes. This advanced version provides users with web server tools for detecting the existence and variations of T4SS genes online. The new interface for the genome browser provides user-friendly access to the most complete and accurate resource of T4SS gene information (e.g., gene number, name, type, position, sequence, related articles, and quick links to other websites). Currently, this online database includes T4SS information for 5239 bacterial strains. Conclusions. T4SS is one of the most versatile secretion systems, necessary for the virulence and survival of bacteria and the secretion of protein and/or DNA substrates from a donor to a recipient cell. This database on virB/D genes of the T4SS system will help scientists worldwide to improve their knowledge of secretion systems and also identify potential pathogenic mechanisms of various microbial species.

  10. A Collaborative Analysis Tool for Integrating Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Science.gov (United States)

    Stanley, Thomas Troy; Alexander, Reginald

    1999-01-01

    Presented is a computer-based tool that connects several disciplines that are needed in the complex and integrated design of high-performance reusable single stage to orbit (SSTO) vehicles. Every system is linked to every other system, as is the case for SSTO vehicles with air-breathing propulsion, which is currently being studied by NASA. The deficiencies in the scramjet-powered concept led to a revival of interest in Rocket-Based Combined-Cycle (RBCC) propulsion systems. An RBCC propulsion system integrates air-breathing and rocket propulsion into a single engine assembly enclosed within a cowl or duct. A typical RBCC propulsion system operates as a ducted rocket up to approximately Mach 3. At this point the engine transitions to a ramjet mode for supersonic-to-hypersonic acceleration. Around Mach 8 the engine transitions to a scramjet mode. During the ramjet and scramjet modes, the integral rockets operate as fuel injectors. Around Mach 10-12 (the actual value depends on vehicle and mission requirements), the inlet is physically closed and the engine transitions to an integral rocket mode for orbit insertion. A common feature of RBCC-propelled vehicles is the high degree of integration between the propulsion system and airframe. At high speeds the vehicle forebody is fundamentally part of the engine inlet, providing a compression surface for air flowing into the engine. The compressed air is mixed with fuel and burned. The combusted mixture must be expanded to an area larger than the incoming stream to provide thrust. Since a conventional nozzle would be too large, the entire lower afterbody of the vehicle is used as an expansion surface. Because of the high external temperatures seen during atmospheric flight, the design of an air-breathing SSTO vehicle requires delicate tradeoffs between engine design, vehicle shape, and thermal protection system (TPS) sizing in order to produce an optimum system in terms of weight (and cost) and maximum performance.
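
    The mode schedule described above maps directly to a simple dispatch on Mach number. A toy sketch (the exact scramjet-to-rocket transition point is left as a parameter, reflecting its vehicle and mission dependence):

```python
def rbcc_mode(mach, scramjet_to_rocket=11.0):
    """Return the notional RBCC operating mode for a flight Mach number.

    Transition points follow the description above (ducted rocket to ~Mach 3,
    ramjet to ~Mach 8, scramjet until inlet closure around Mach 10-12); the
    scramjet-to-rocket point is vehicle- and mission-dependent.
    """
    if mach < 3.0:
        return "ducted rocket"
    if mach < 8.0:
        return "ramjet"
    if mach < scramjet_to_rocket:
        return "scramjet"
    return "integral rocket (inlet closed)"

for m in (0.5, 2.0, 5.0, 9.0, 15.0):
    print(f"Mach {m:4.1f}: {rbcc_mode(m)}")
```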

  11. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    Full Text Available We present an "Integrated Protein Analysis Tool" (IPAT) that is able to perform the following tasks in segregating and annotating genomic data. Protein Editor: enables the entry of nucleotide/amino acid sequences. Utilities: IPAT enables conversion of a given nucleotide sequence to the equivalent amino acid sequence. Secondary Structure Prediction: possible using three algorithms (GOR-I, the Gibrat Method, and DPM, the Double Prediction Method) with graphical display. Profiles and properties: allows calculating eight physico-chemical profiles and properties, viz. hydrophobicity, hydrophilicity, antigenicity, transmembranous regions, solvent accessibility, molecular weight, absorption factor and amino acid content. IPAT has a provision for viewing a Helical-Wheel Projection of a selected region of a given protein sequence and a 2D representation of the alpha-carbon. IPAT was developed using the UML (Unified Modeling Language) for modeling the project elements, coded in Java, and subjected to unit testing, path testing, and integration testing. This project mainly concentrates on Butyrylcholinesterase to predict its secondary structure and its physico-chemical profiles and properties.
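
    Two of the listed functions, nucleotide-to-amino-acid conversion and hydrophobicity profiling, are easy to sketch. The example below uses Python with Biopython for translation and the standard Kyte-Doolittle scale for hydropathy, rather than IPAT's Java implementation; the sequence and window size are illustrative.

```python
from Bio.Seq import Seq

# Kyte-Doolittle hydropathy values (standard scale).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def hydropathy_profile(protein, window=9):
    """Sliding-window mean hydropathy; peaks suggest transmembrane regions."""
    vals = [KD[aa] for aa in protein]
    return [sum(vals[i:i + window]) / window
            for i in range(len(vals) - window + 1)]

dna = Seq("ATGGCTATTCTGCTGGTGGTTATCGCGCTGCTGTTCGCGAAA")   # toy sequence
protein = str(dna.translate())
print(protein)
print([round(v, 2) for v in hydropathy_profile(protein)])
```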

  12. IRSS: a web-based tool for automatic layout and analysis of IRES secondary structure prediction and searching system in silico

    Directory of Open Access Journals (Sweden)

    Hong Jun-Jie

    2009-05-01

    Full Text Available Abstract Background Internal ribosomal entry sites (IRESs) provide alternative, cap-independent translation initiation sites in eukaryotic cells. IRES elements are important factors in viral genomes and are also useful tools for bi-cistronic expression vectors. Most existing RNA structure prediction programs are unable to deal with IRES elements. Results We designed an IRES search system, named IRSS, to obtain better results for IRES prediction. RNA secondary structure prediction and comparison software programs were implemented to construct our two-stage strategy for IRSS. Two software programs formed the backbone of IRSS: the RNALfold program, used to predict local RNA secondary structures by the minimum free energy method, and the RNA Align program, used to compare predicted structures. After a complete search of a viral genome database, IRSS achieved a low error rate and up to 72.3% sensitivity with appropriate parameters. Conclusion IRSS is freely available at this website http://140.135.61.9/ires/. In addition, all source codes, precompiled binaries, examples and documentation are downloadable for local execution. This new search approach for IRES elements will provide a useful research tool for IRES-related studies.
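
    As a simplified stand-in for minimum-free-energy folding, the classic Nussinov algorithm below maximizes the number of complementary base pairs by dynamic programming; real tools like RNALfold minimize a thermodynamic energy model instead, so this is only a sketch of the idea.

```python
def nussinov(seq, min_loop=3):
    """Maximum base-pairing count by the Nussinov algorithm -- a simplified
    stand-in for the minimum-free-energy folding used by tools like RNALfold."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
             ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                          # i unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)   # i pairs with j
            for k in range(i + 1, j):                    # bifurcation
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCCC"))   # hairpin-like toy sequence
```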

  13. Chip breaking system for automated machine tool

    Science.gov (United States)

    Arehart, Theodore A.; Carey, Donald O.

    1987-01-01

    The invention is a rotary, selectively directional valve assembly for use in an automated turret lathe for directing a stream of high-pressure liquid machining coolant to the interface of a machine tool and workpiece, breaking up ribbon-shaped chips during their formation so as to inhibit scratching or other marring of the machined surfaces by these chips. The valve assembly is provided by a manifold arrangement having a plurality of circumferentially spaced-apart ports, each coupled to a machine tool. The manifold is rotatable with the turret when the turret is positioned to align a machine tool in a machining relationship with the workpiece. The manifold is connected to a non-rotational header having a single passageway therethrough, which conveys the high-pressure coolant only to the port in the manifold that is in registry with the tool disposed in a working relationship with the workpiece. To position the machine tools, the turret is rotated and one of the tools is placed in a material-removing relationship with the workpiece. The passageway in the header and one of the ports in the manifold arrangement are then automatically aligned to supply the machining coolant to the tool-workpiece interface for breaking up the chips as well as cooling the tool and workpiece during the machining operation.

  14. Analysis and use of OLAP tools in the corporative information system of CFE; Analisis y empleo de herramientas OLAP en el sistema de informacion corporativa de la CFE

    Energy Technology Data Exchange (ETDEWEB)

    Jacome G, Norma E; Argotte R, Liliana P; Mejia L, Manuel [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2003-07-01

    The commercial tools Oracle Express and Oracle Discoverer are presented, applied, and compared. Both were applied in the context of the data warehouse that the Comision Federal de Electricidad (CFE) is developing, called the Corporate Information System (SICORP), which involves the handling of large volumes of historical and current information. Given the current importance and relevance of the subject of data warehousing, the experiences described in this article are very useful for future developments.

  15. The Integrated Air Transportation System Evaluation Tool

    Science.gov (United States)

    Wingrove, Earl R., III; Hees, Jing; Villani, James A.; Yackovetsky, Robert E. (Technical Monitor)

    2002-01-01

    Throughout U.S. history, our nation has generally enjoyed exceptional economic growth, driven in part by transportation advancements. Looking forward 25 years, when the national highway and skyway systems are saturated, the nation faces new challenges in creating transportation-driven economic growth and wealth. To meet the national requirement for an improved air traffic management system, NASA developed the goal of tripling throughput over the next 20 years, in all weather conditions, while maintaining safety. Analysis of the throughput goal has primarily focused on major airline operations, primarily through the hub-and-spoke system. However, many suggested concepts to increase throughput may operate outside the hub-and-spoke system. Examples of such concepts include the Small Aircraft Transportation System, civil tiltrotor, and improved rotorcraft. Proper assessment of the potential contribution of these technologies to the domestic air transportation system requires a modeling capability that includes the country's numerous smaller airports, acting as a fundamental component of the National Airspace System, and the demand for such concepts and technologies. Under this task for NASA, the Logistics Management Institute developed higher-fidelity demand models that capture the interdependence of short-haul air travel with other transportation modes and explicitly consider the costs of commercial air and other transport modes. To accomplish this work, we generated forecasts of the distribution of general aviation (GA) based aircraft and GA itinerant operations at each of nearly 3,000 airports, based on changes in economic conditions and demographic trends. We also built modules that estimate the demand for travel by different modes, particularly auto, commercial air, and GA. We examined GA demand from two perspectives: top-down and bottom-up, described in detail.
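
    Mode-choice demand modules of this kind are commonly built around a multinomial logit model, where each mode's share follows from a utility combining cost and time. A toy sketch (the coefficients and trip values are placeholders, not LMI's calibrated model):

```python
import math

def mode_shares(utilities):
    """Multinomial logit: share_m = exp(V_m) / sum_k exp(V_k)."""
    exps = {m: math.exp(v) for m, v in utilities.items()}
    total = sum(exps.values())
    return {m: e / total for m, e in exps.items()}

# Illustrative utilities for a 500-mile trip combining cost and time penalties;
# the coefficients are placeholders, not a calibrated model.
cost_coef, time_coef = -0.004, -0.05            # per dollar, per minute
trips = {"auto":             cost_coef * 120 + time_coef * 480,
         "commercial air":   cost_coef * 250 + time_coef * 210,
         "general aviation": cost_coef * 400 + time_coef * 180}
for mode, share in mode_shares(trips).items():
    print(f"{mode:16s} {share:.2%}")
```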

  16. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  17. Tool management in manufacturing systems equipped with CNC machines

    Directory of Open Access Journals (Sweden)

    Giovanni Tani

    1997-12-01

    This work has been carried out for the purpose of realizing an automated system for the integrated management of tools within a company. By integrating planning, inspection, and tool-room functions, automated tool management can ensure optimum utilization of tools on the selected machines, guaranteeing their effective availability. The first stage of the work consisted of defining and developing a Tool Management System whose central nucleus is a unified database for all of the tools, forming part of the company's Technological Files (files on machines, materials, equipment, methods, etc.), interfaceable with all of the company departments that require information on tools. The system assigns code numbers to the individual components of the tools and files them on the basis of their morphological and functional characteristics. The system is also designed to effect assemblies of tools, from which are obtained the "Tool Cards" required for compiling working cycles (CAPP), for CAM programming, and for the tool-room where the tools are physically prepared. Methods for interfacing with suitable systems for the aforesaid functions have also been devised.

  18. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  19. Design package for fuel retrieval system fuel handling tool modification

    Energy Technology Data Exchange (ETDEWEB)

    TEDESCHI, D.J.

    1999-03-17

    This is a design package that contains the details for a modification to a tool used for moving fuel elements during loading of MCO Fuel Baskets for the Fuel Retrieval System. The tool is called the fuel handling tool (or stinger). This document contains requirements, development design information, tests, and test reports.

  20. Design package for fuel retrieval system fuel handling tool modification

    Energy Technology Data Exchange (ETDEWEB)

    TEDESCHI, D.J.

    1998-11-09

    This is a design package that contains the details for a modification to a tool used for moving fuel elements during loading of MCO Fuel Baskets for the Fuel Retrieval System. The tool is called the fuel handling tool (or stinger). This document contains requirements, development design information, tests, and test reports.

  1. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistleblowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, removing any contents that may have been hidden, and deleting any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and make them available for the general public. The results presented in this work can also be seen as a useful
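
    As background to the steganalytic results mentioned above, the following sketch shows one classic detector for naive LSB-replacement embedding, the chi-square test on pairs of values. It is a generic illustration on synthetic data under stated assumptions, not the analysis pipeline used in the paper.

```python
import numpy as np
from scipy.stats import chi2

def lsb_chi_square(samples: np.ndarray) -> float:
    """Westfeld-Pfitzmann style chi-square test on pairs of values.

    Sequential LSB embedding tends to equalize the histogram bins
    h[2k] and h[2k+1]; a p-value near 1 therefore suggests embedding.
    """
    hist = np.bincount(samples.ravel(), minlength=256).astype(float)
    even, odd = hist[0::2], hist[1::2]
    expected = (even + odd) / 2.0
    mask = expected > 0                      # skip empty bin pairs
    stat = np.sum((even[mask] - expected[mask]) ** 2 / expected[mask])
    return chi2.sf(stat, df=mask.sum() - 1)  # survival function = p-value

# Crude stand-in for a cover signal whose value histogram is uneven
# within (2k, 2k+1) pairs, as natural image/video data typically is.
rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=100_000)
cover = np.where(rng.random(cover.shape) < 0.3, cover & ~1, cover)
stego = (cover & ~1) | rng.integers(0, 2, size=cover.shape)  # full embedding

print("cover p-value:", lsb_chi_square(cover))   # near 0: no embedding
print("stego p-value:", lsb_chi_square(stego))   # near 1: embedding suspected
```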

  2. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of the investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for energy planners considering their electric power options in the future.
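
    The abstract does not state the accounting identities involved; the quantities usually reported in net energy analysis are the net energy ratio and the net energy balance, given here as standard background rather than the author's specific formulation:

```latex
\[
  \mathrm{NER} \;=\; \frac{E_{\mathrm{out}}}{E_{\mathrm{in}}},
  \qquad
  E_{\mathrm{net}} \;=\; E_{\mathrm{out}} - E_{\mathrm{in}},
\]
```

    where $E_{\mathrm{out}}$ is the lifetime energy delivered by the system and $E_{\mathrm{in}}$ is the total energy invested in constructing, fueling, operating, and decommissioning it. A system is a net energy producer only when $\mathrm{NER} > 1$, which is why the ratio serves as the screening indicator described above.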

  3. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
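
    GOMA's optimization model is not reproduced in the abstract, but the enrichment score underlying most such tools is a hypergeometric tail probability. A minimal sketch of that standard test, with invented toy counts, follows:

```python
from scipy.stats import hypergeom

def enrichment_p(n_genome: int, n_term: int, n_list: int, n_overlap: int) -> float:
    """One-sided hypergeometric p-value for GO-term over-representation:
    probability of observing >= n_overlap annotated genes in the study list."""
    return hypergeom.sf(n_overlap - 1, n_genome, n_term, n_list)

# Toy numbers: 20,000 genes in the genome, 150 annotated with the term,
# 400 genes in the experiment's hit list, 12 of them carrying the term.
print(enrichment_p(20_000, 150, 400, 12))  # tiny p-value: term over-represented
```

    Grouping functionally similar terms into modules, as GOMA does, then amounts to scoring and ranking sets of such terms together instead of reporting each redundant term separately.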

  4. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment. With the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but all boil down to the same idea: a process that extracts new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only answers the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  5. Shot planning and analysis tools on the NIF project

    Energy Technology Data Exchange (ETDEWEB)

    Beeler, R. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Casey, A., E-mail: casey20@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA (United States); Conder, A.; Fallejo, R.; Flegel, M.; Hutton, M.; Jancaitis, K.; Lakamsani, V.; Potter, D.; Reisdorf, S.; Tappero, J.; Whitman, P.; Carr, W.; Liao, Z. [Lawrence Livermore National Laboratory, Livermore, CA (United States)

    2012-12-15

    Highlights:
    - Target shots in NIF, dozens a month, vary widely in laser and target configuration.
    - A planning tool helps select shot sequences that optimize valuable facility time.
    - Fabrication and supply of targets, diagnostics, etc. are integrated into the plan.
    - Predictive modeling of aging parts (e.g., optics) aids maintenance decision support.
    - We describe the planning/analysis tool and its use in NIF experimental operations.

    Abstract: Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Campaign Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics

  6. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, judo competition is an intriguing, complex nonlinear system with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation, and technical work developed beforehand, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team), like judo, for usefully supporting the difficult task of the coach, particularly for national or Olympic coaching teams. In this paper a short summary of the most important methodological achievements in judo match analysis is presented. Also presented, in light of the latest technological improvements, is the first systematization toward new fiel...

  7. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  8. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
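
    NREL's degradation model is described here only at a high level, so it cannot be reproduced from the abstract. Purely to illustrate the kind of trade-off such a tool evaluates, here is a deliberately simple capacity-fade sketch with invented coefficients (square-root-in-time calendar fade plus linear cycle fade); it is not the BLAST model.

```python
import numpy as np

def capacity_fade(years: float, cycles_per_day: float,
                  temp_c: float = 25.0) -> float:
    """Illustrative capacity retention (NOT NREL's BLAST model):
    sqrt-in-time calendar fade, accelerated by temperature, plus
    fade proportional to cumulative full cycles. Coefficients invented."""
    days = years * 365.0
    # Arrhenius-like acceleration relative to 25 degC (assumed doubling/10 K).
    accel = 2.0 ** ((temp_c - 25.0) / 10.0)
    calendar_loss = 0.02 * accel * np.sqrt(years)   # 2 %/sqrt(yr) at 25 degC
    cycle_loss = 5e-5 * cycles_per_day * days       # 0.005 % per full cycle
    return max(0.0, 1.0 - calendar_loss - cycle_loss)

for yr in (1, 5, 10, 15):
    print(yr, "years:", f"{capacity_fade(yr, cycles_per_day=1):.1%}")
```

    Even this toy model shows why use strategies matter: halving daily cycling or lowering average temperature visibly extends the years until a given retention threshold is crossed, which is the comparison BLAST makes with far higher fidelity.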

  9. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general-purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  10. Development of database and searching system for tool grinding

    Directory of Open Access Journals (Sweden)

    J.Y. Chen

    2008-02-01

    Purpose: For achieving the goal of saving time on tool grinding and design, an efficient method of developing a data management and searching system for standard cutting tools is proposed in this study. Design/methodology/approach: At first, tool grinding software with an open architecture was employed to design and plan grinding processes for seven types of tools. According to the characteristics of the tools (e.g. type, diameter, radius and so on), 4,802 tool records were established in the relational database. Then, SQL syntax was utilized to write the searching algorithms, and the human-machine interfaces of the searching system for the tool database were developed in C++ Builder. Findings: For grinding a square end mill with two flutes, half of the time spent on tool design and on production-line changeover for grinding other types of tools can be saved by means of our system. More specifically, the efficiency in terms of the approach and retract time was improved by up to 40%, and an improvement of approximately 10.6% in the overall machining time can be achieved. Research limitations/implications: The tool database used in this study only includes some specific tools such as the square end mill. Step drills, taper tools, and special tools can also be taken into account in the database in future research. Practical implications: Most commercial tool grinding software is modular in design and uses tool shapes to construct the CAM interface. Some limitations on tool design are undesirable for customers. On the contrary, employing not only the grinding processes to construct the grinding paths of tools but also the searching system combined with the grinding software gives one more flexibility to design new tools. Originality/value: A novel tool database and searching system is presented for tool grinding. Using this system can save time and provide more convenience in designing and grinding tools. In other words, the
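
    The paper's searching algorithms are written in SQL against a relational tool database. The following minimal sketch shows the flavor of such a parameterized search; the schema and values are invented for illustration, and SQLite stands in for the authors' database platform.

```python
import sqlite3

# Invented schema: one row per tool, keyed on the morphological
# attributes the paper says the search uses (type, diameter, radius).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE tool (
    id INTEGER PRIMARY KEY,
    type TEXT, flutes INTEGER, diameter_mm REAL, radius_mm REAL)""")
con.executemany(
    "INSERT INTO tool (type, flutes, diameter_mm, radius_mm) VALUES (?,?,?,?)",
    [("square_end_mill", 2, 6.0, 0.0),
     ("square_end_mill", 4, 10.0, 0.0),
     ("ball_end_mill", 2, 6.0, 3.0)])

def find_tools(tool_type: str, min_d: float, max_d: float):
    """Parameterized search by tool type and diameter range."""
    cur = con.execute(
        "SELECT id, flutes, diameter_mm FROM tool "
        "WHERE type = ? AND diameter_mm BETWEEN ? AND ?",
        (tool_type, min_d, max_d))
    return cur.fetchall()

print(find_tools("square_end_mill", 5.0, 8.0))   # -> [(1, 2, 6.0)]
```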

  11. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
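
    For readers unfamiliar with the method, the core of any FDTD code is a leapfrog update of electric and magnetic fields on staggered grids. The sketch below is the textbook 1-D free-space scheme with a Gaussian (UWB-like) soft source, in normalized units; it is not the paper's spherical-coordinate formulation for the conical antenna.

```python
import numpy as np

# 1D free-space Yee scheme: E and H live on grids offset by half a cell
# and half a time step, and each is advanced from the other's curl.
nz, nt = 400, 1000
ez = np.zeros(nz)          # electric field at integer grid points
hy = np.zeros(nz - 1)      # magnetic field at half-integer points
courant = 0.5              # normalized time step (stable for <= 1 in 1D)

for n in range(nt):
    hy += courant * (ez[1:] - ez[:-1])              # update H from curl E
    ez[1:-1] += courant * (hy[1:] - hy[:-1])        # update E from curl H
    ez[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)  # soft Gaussian UWB pulse

print("peak |Ez| after propagation:", np.abs(ez).max())
```

    Replacing the pulse with a sinusoid gives the CW excitation case the paper compares against; the spherical-coordinate version differs only in the metric factors entering the curl updates.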

  12. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
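
    The likelihood ratio test referred to above compares samples of similarity scores from lab-to-field and lab-to-lab comparisons. As a simplified score-based illustration (not the authors' exact statistic), the sketch below computes a correlation similarity between profilometry traces and evaluates a normal-model likelihood ratio against synthetic reference samples.

```python
import numpy as np
from scipy.stats import norm

def similarity(profile_a: np.ndarray, profile_b: np.ndarray) -> float:
    """Correlation between two mean-centered depth profiles."""
    a = profile_a - profile_a.mean()
    b = profile_b - profile_b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def log_likelihood_ratio(score, match_scores, nonmatch_scores):
    """Score-based LR: density of the observed similarity under the
    'same tool' model vs. the 'different tools' model, each fitted as a
    normal distribution to reference comparisons made in the lab.
    (A simplification of the paper's test, for illustration only.)"""
    f_match = norm(match_scores.mean(), match_scores.std(ddof=1))
    f_non = norm(nonmatch_scores.mean(), nonmatch_scores.std(ddof=1))
    return f_match.logpdf(score) - f_non.logpdf(score)

# Synthetic reference score samples standing in for lab comparisons.
rng = np.random.default_rng(2)
match_ref = rng.normal(0.85, 0.05, 50)     # same suspect tool, lab vs. lab
nonmatch_ref = rng.normal(0.20, 0.10, 50)  # marks from different tools
print(log_likelihood_ratio(0.80, match_ref, nonmatch_ref))  # > 0: supports match
```

    Modeling the effect of tool angle, as the paper proposes, would add angle as a covariate of these score distributions rather than treating them as fixed.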

  13. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and

  14. Expert Systems as Tools for Technical Communicators.

    Science.gov (United States)

    Grider, Daryl A.

    1994-01-01

    Discusses expertise, what an expert system is, what an expert system shell is, what expert systems can and cannot do, knowledge engineering and technical communicators, and planning and managing expert system projects. (SR)

  15. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  16. GLIDER: Free tool imagery data visualization, analysis and mining

    Science.gov (United States)

    Ramachandran, R.; Graves, S. J.; Berendes, T.; Maskey, M.; Chidambaram, C.; Hogan, P.; Gaskin, T.

    2009-12-01

    Satellite imagery can be analyzed to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of such thematic information can vary from military applications such as detecting assets of interest to science applications such as characterizing land-use/land-cover change at local, regional and global scales. However, extracting thematic information using satellite imagery is a non-trivial task. It requires a user to preprocess the data by applying operations for radiometric and geometric corrections. The user also needs to be able to visualize the data and apply different image enhancement operations to digitally improve the images to identify subtle information that might otherwise be missed. Finally, the user needs to apply different information extraction algorithms to the imagery to obtain the thematic information. At present, there are limited tools that provide users with the capability to easily extract and exploit the information contained within satellite imagery. This presentation will introduce GLIDER, a free software tool addressing this void. GLIDER provides users with an easy-to-use tool to visualize, analyze and mine satellite imagery. GLIDER allows users to visualize and analyze satellite imagery in its native sensor view, an important capability because any transformation to either a geographic coordinate system or a projected coordinate system entails spatial and intensity interpolation and, hence, loss of information. GLIDER allows users to perform their analysis in the native sensor view without any loss of information. GLIDER provides users with a full suite of image processing algorithms that can be used to enhance the satellite imagery. It also provides pattern recognition and data mining algorithms for information extraction. GLIDER allows its users to project satellite data and the analysis/mining results onto a globe and overlay additional data layers. Traditional analysis

  17. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  18. A review of computer tools for analysing the integration of renewable energy into various energy systems

    DEFF Research Database (Denmark)

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad;

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis, which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy; instead, the 'ideal' energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors...

  19. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of the wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unmanned Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and
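
    To make the service interfaces concrete, the sketch below issues a key-value-pair WCS 1.0.0 GetCoverage request of the kind such a framework would use to pull a smoke-forecast layer. The endpoint URL and coverage name are placeholders, not the actual services used in the GEOSS pilot.

```python
import requests

WCS_ENDPOINT = "https://example.org/wcs"    # hypothetical service URL

# Standard OGC WCS 1.0.0 GetCoverage key-value parameters; the coverage
# identifier below is invented for this example.
params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "calpuff_smoke_pm25",        # hypothetical coverage id
    "bbox": "-125,30,-110,45",               # lon/lat area of interest
    "time": "2009-08-30T00:00:00Z",
    "crs": "EPSG:4326",
    "width": "256", "height": "256",
    "format": "GeoTIFF",
}

response = requests.get(WCS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
with open("smoke_forecast.tif", "wb") as f:
    f.write(response.content)               # raster ready for analysis tools
```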

  20. A Components Library System Model and the Support Tool

    Institute of Scientific and Technical Information of China (English)

    MIAO Huai-kou; LIU Hui; LIU Jing; LI Xiao-bo

    2004-01-01

    Component-based development needs a well-designed components library and a set of support tools. This paper presents the design and implementation of a components library system model and its support tool UMLCASE. A set of practical CASE tools is constructed. UMLCASE can use UML to design Use Case Diagrams, Class Diagrams, etc., and it integrates with the components library system.

  1. Dynamic drag force based on iterative density mapping: A new numerical tool for three-dimensional analysis of particle trajectories in a dielectrophoretic system.

    Science.gov (United States)

    Knoerzer, Markus; Szydzik, Crispin; Tovar-Lopez, Francisco Javier; Tang, Xinke; Mitchell, Arnan; Khoshmanesh, Khashayar

    2016-02-01

    Dielectrophoresis is a widely used means of manipulating suspended particles within microfluidic systems. In order to efficiently design such systems for a desired application, various numerical methods exist that enable particle trajectory plotting in two or three dimensions based on the interplay of hydrodynamic and dielectrophoretic forces. While various models are described in the literature, few are capable of modeling interactions between particles as well as their surrounding environment, as these interactions are complex, multifaceted, and computationally expensive to the point of being prohibitive when considering a large number of particles. In this paper, we present a numerical model designed to enable spatial analysis of the physical effects exerted upon particles within microfluidic systems employing dielectrophoresis. The model presents a means of approximating the effects of the presence of large numbers of particles through dynamically adjusting hydrodynamic drag force based on particle density, thereby introducing a measure of emulated particle-particle and particle-liquid interactions. This model is referred to as "dynamic drag force based on iterative density mapping." The resultant numerical model is used to simulate and predict particle trajectory and velocity profiles within a microfluidic system incorporating curved dielectrophoretic microelectrodes. The simulated data compare favorably with experimental data gathered using microparticle image velocimetry, and are contrasted against simulated data generated using the traditional "effective moment Stokes-drag method," showing more accurate particle velocity profiles for areas of high particle density.
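
    As background (these standard expressions are not quoted from the paper), the two forces whose interplay such models resolve are the time-averaged dielectrophoretic force on a spherical particle and the Stokes drag:

```latex
\[
  \mathbf{F}_{\mathrm{DEP}}
    = 2\pi r^{3}\,\varepsilon_{m}\,
      \operatorname{Re}\!\bigl[K(\omega)\bigr]\,
      \nabla\lvert \mathbf{E}_{\mathrm{rms}}\rvert^{2},
  \qquad
  K(\omega)=\frac{\varepsilon_{p}^{*}-\varepsilon_{m}^{*}}
                 {\varepsilon_{p}^{*}+2\varepsilon_{m}^{*}},
  \qquad
  \mathbf{F}_{\mathrm{drag}} = 6\pi\eta r\,(\mathbf{u}-\mathbf{v}),
\]
```

    where $r$ is the particle radius, $\varepsilon_{m}$ the medium permittivity, $K(\omega)$ the Clausius-Mossotti factor built from the complex permittivities $\varepsilon_{p}^{*}$ and $\varepsilon_{m}^{*}$ of particle and medium, $\eta$ the dynamic viscosity, and $\mathbf{u}-\mathbf{v}$ the fluid velocity relative to the particle. The paper's contribution is to rescale the drag term iteratively with local particle density rather than treating it as the fixed Stokes value.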

  2. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  3. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order to ensure the finiteness of the protocol state-spaces while still being able to verify interesting protocol properties. The translations for different kinds of communication media have been implemented and successfully tested, among others, on agreement protocols from WS-Business Activity.

  4. Techniques and tools for efficiently modeling multiprocessor systems

    Science.gov (United States)

    Carpenter, T.; Yalamanchili, S.

    1990-01-01

    System-level tools and methodologies associated with an integrated approach to the development of multiprocessor systems are examined. Tools for capturing initial program structure, automated program partitioning, automated resource allocation, and high-level modeling of the combined application and resource are discussed. The primary language focus of the current implementation is Ada, although the techniques should be appropriate for other programming paradigms.

  5. Verified System Development with the AutoFocus Tool Chain

    OpenAIRE

    Maria Spichkova; Florian Hölzl; David Trachtenherz

    2012-01-01

    This work presents a model-based development methodology for verified software systems as well as tool support for it: an applied AutoFocus tool chain and its basic principles, emphasizing the verification of the system under development as well as the check mechanisms we used to raise the level of confidence in the correctness of the implementation of the automatic generators.

  6. Hybrid-Electric Aircraft TOGW Development Tool with Empirically-Based Airframe and Physics-Based Hybrid Propulsion System Component Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Hybrid-Electric distributed propulsion (HEDP) is becoming widely accepted and new tools will be required for future development. This Phase I SBIR proposal creates a...

  7. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations and techniques that will enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  8. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then we compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (that contains spatial data, socio-economic and environmental data, and analytic data), a middle layer (that handles data processing, model management, and GIS operations), and an application layer (that provides climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  9. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  10. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development.

  11. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
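
    A typical surrogate workflow of the kind SAID supports fits a regression of log-transformed SSC on the acoustic parameter and then applies it in real time. The sketch below does this with ordinary least squares on synthetic numbers; the data and coefficients are invented, not USGS records.

```python
import numpy as np

# Synthetic paired observations standing in for field calibration data:
# acoustic backscatter (dB) as the surrogate, log10 of measured SSC.
rng = np.random.default_rng(3)
backscatter_db = rng.uniform(30, 90, 40)
log_ssc = 0.03 * backscatter_db + 0.5 + rng.normal(0, 0.1, 40)

# Ordinary least squares fit: log10(SSC) = slope * B + intercept.
A = np.column_stack([backscatter_db, np.ones_like(backscatter_db)])
(slope, intercept), *_ = np.linalg.lstsq(A, log_ssc, rcond=None)

def predict_ssc(db: float) -> float:
    """Real-time SSC estimate (mg/L) from a new backscatter reading."""
    return 10 ** (slope * db + intercept)

print(f"log10(SSC) = {slope:.3f} * B + {intercept:.3f}")
print("SSC at 60 dB ~", round(predict_ssc(60.0), 1), "mg/L")
```

    Operational models also carry bias-correction and uncertainty terms when back-transforming from log space; those are omitted here for brevity.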

  12. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existent link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. An example of a prototype for a digital conceptual design tool with integrated real time structural analysis is presented and compared with a more common Building Information Modelling (BIM) approach. It is concluded that a digital conceptual design tool with embedded real time structural analysis could...

  13. A Lexical Analysis Tool with Ambiguity Support

    CERN Document Server

    Quesada, Luis; Cortijo, Francisco J

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
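
    A lexical analysis graph compactly encodes every admissible token sequence for an input. The toy sketch below enumerates those sequences for an ambiguous string by brute force; the lexicon is invented for illustration, and this is not Lamb's API.

```python
# Enumerate every way the input can be split into known tokens: each
# result corresponds to one path through a lexical analysis graph.
LEXICON = {"in": "PREP", "inside": "PREP", "side": "NOUN", "out": "ADV"}

def tokenizations(text: str, pos: int = 0):
    """Yield all token sequences covering text[pos:]."""
    if pos == len(text):
        yield []
        return
    for end in range(pos + 1, len(text) + 1):
        word = text[pos:end]
        if word in LEXICON:
            for rest in tokenizations(text, end):
                yield [(word, LEXICON[word])] + rest

for seq in tokenizations("insideout"):
    print(seq)
# [('in', 'PREP'), ('side', 'NOUN'), ('out', 'ADV')]
# [('inside', 'PREP'), ('out', 'ADV')]
```

    A parser consuming such a graph then discards the token paths that yield no valid syntactic sentence, which is the context-sensitive lexical analysis the abstract describes.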

  14. Fully Parallel MHD Stability Analysis Tool

    Science.gov (United States)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Results of MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  15. Multi-tool design and analysis of an automotive HUD

    Science.gov (United States)

    Irving, Bruce; Hasenauer, David; Mulder, Steve

    2016-10-01

    Design and analysis of an optical system is often a multidisciplinary task, and can involve the use of specialized software packages for imaging, mechanics, and illumination. This paper will present a case study on the design and analysis of a basic heads-up display (HUD) for automotive use. The emphasis will be on the special requirements of a HUD visual system and on the tools and techniques needed to accomplish the design. The first section of this paper will present an overview of the imaging design using commercially available imaging design software. Topics addressed in this section include modeling the windshield, visualizing the imaging performance, using constraints and freeform surfaces to improve the system, and meeting specific visual performance specifications with design/analysis methods. The second section will address the use of a CAD program to design a basic mechanical structure to support and protect the optics. This section will also discuss some of the issues and limitations involved in translating data between a CAD program and a lens design or illumination program. Typical issues that arise include the precision of optical surface prescriptions, surface and material properties, and the management of large data files. In the final section, the combined optical and mechanical package will be considered, using an illumination design program for stray light analysis. The stray light analysis will be directed primarily toward finding, visualizing, and quantifying unexpected ray paths. Techniques for sorting optical ray paths by path length, power, and elements or materials encountered will be discussed, along with methods for estimating the impact of stray light on the optical system performance.

  16. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  17. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  18. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain whether a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 1-2-3 are used for the model in order to provide as universal an application as possible, such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  19. A tool for searching in information systems under uncertainty

    Science.gov (United States)

    Walek, Bogdan; Farana, Radim

    2016-06-01

    This article deals with the design of a tool for searching in information systems under uncertainty. During a search, users often work with vague information, which may lead to missing the desired result or to finding a large number of results that the user must evaluate. The main goal of the proposed tool is to process vague information and find relevant data. The article describes in detail the various steps of the proposed tool.
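
    The article's fuzzy-set machinery is not detailed in the abstract; as a minimal illustration of vagueness-tolerant searching, the sketch below ranks records by approximate string similarity so an imprecise or misspelled query still surfaces relevant data. The records and threshold are invented for the example.

```python
import difflib

# Toy record set standing in for an information system's entries.
RECORDS = ["power transformer maintenance", "transformer oil analysis",
           "generator vibration report", "turbine blade inspection"]

def fuzzy_search(query: str, records, threshold: float = 0.4):
    """Return (score, record) pairs above the similarity threshold,
    best match first, tolerating typos and imprecise wording."""
    scored = [(difflib.SequenceMatcher(None, query.lower(), r).ratio(), r)
              for r in records]
    return sorted((s, r) for s, r in scored if s >= threshold)[::-1]

for score, rec in fuzzy_search("transformr oil analisys", RECORDS):
    print(f"{score:.2f}  {rec}")
```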

  20. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  1. LabSystem Gen, a tool for structuring and analyzing genetic data in histocompatibility laboratories.

    Science.gov (United States)

    Sousa, Luiz Cláudio Demes da Mata; dos Santos Neto, Pedro de Alcântara; de Souza, Fernando da Fonseca; do Monte, Semiramis Jamil Hadad

    2012-04-01

    Analysis of HLA data involves queries on web portals, whose search parameters are data stored in laboratories' databases. In order to automate these queries, one approach is to structure laboratory data into a database and to develop bioinformatic tools to perform the data mapping. In this context, we developed the LabSystem Gen tool, which allows users to create a Laboratory Information System without programming. Additionally, we implemented a framework that provides bioinformatic tools with transparent access to public HLA (human leukocyte antigen) information resources. We demonstrated LabSystem Gen by implementing BMDdb, a LIMS that manages data of recipients and donors of organ transplants.

  2. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  3. Active ultrasound pattern injection system (AUSPIS) for interventional tool guidance.

    Directory of Open Access Journals (Sweden)

    Xiaoyu Guo

    Accurate tool tracking is a crucial task that directly affects the safety and effectiveness of many interventional medical procedures. Compared to CT and MRI, ultrasound-based tool tracking has many advantages, including low cost, safety, mobility and ease of use. However, surgical tools are poorly visualized in conventional ultrasound images, thus preventing effective tool tracking and guidance. Existing tracking methods have not yet provided a solution that effectively solves the tool visualization and mid-plane localization accuracy problem and fully meets the clinical requirements. In this paper, we present an active ultrasound tracking and guiding system for interventional tools. The main principle of this system is to establish a bi-directional ultrasound communication between the interventional tool and US imaging machine within the tissue. This method enables the interventional tool to generate an active ultrasound field over the original imaging ultrasound signals. By controlling the timing and amplitude of the active ultrasound field, a virtual pattern can be directly injected into the US machine B mode display. In this work, we introduce the time and frequency modulation, mid-plane detection, and arbitrary pattern injection methods. The implementation of these methods further improves the target visualization and guiding accuracy, and expands the system application beyond simple tool tracking. We performed ex vitro and in vivo experiments, showing significant improvements of tool visualization and accurate localization using different US imaging platforms. An ultrasound image mid-plane detection accuracy of ±0.3 mm and a detectable tissue depth over 8.5 cm was achieved in the experiment. The system performance is tested under different configurations and system parameters. We also report the first experiment of arbitrary pattern injection to the B mode image and its application in accurate tool tracking.

  4. Active ultrasound pattern injection system (AUSPIS) for interventional tool guidance.

    Science.gov (United States)

    Guo, Xiaoyu; Kang, Hyun-Jae; Etienne-Cummings, Ralph; Boctor, Emad M

    2014-01-01

    Accurate tool tracking is a crucial task that directly affects the safety and effectiveness of many interventional medical procedures. Compared to CT and MRI, ultrasound-based tool tracking has many advantages, including low cost, safety, mobility and ease of use. However, surgical tools are poorly visualized in conventional ultrasound images, thus preventing effective tool tracking and guidance. Existing tracking methods have not yet provided a solution that effectively solves the tool visualization and mid-plane localization accuracy problem and fully meets the clinical requirements. In this paper, we present an active ultrasound tracking and guiding system for interventional tools. The main principle of this system is to establish a bi-directional ultrasound communication between the interventional tool and US imaging machine within the tissue. This method enables the interventional tool to generate an active ultrasound field over the original imaging ultrasound signals. By controlling the timing and amplitude of the active ultrasound field, a virtual pattern can be directly injected into the US machine B mode display. In this work, we introduce the time and frequency modulation, mid-plane detection, and arbitrary pattern injection methods. The implementation of these methods further improves the target visualization and guiding accuracy, and expands the system application beyond simple tool tracking. We performed ex vitro and in vivo experiments, showing significant improvements of tool visualization and accurate localization using different US imaging platforms. An ultrasound image mid-plane detection accuracy of ±0.3 mm and a detectable tissue depth over 8.5 cm was achieved in the experiment. The system performance is tested under different configurations and system parameters. We also report the first experiment of arbitrary pattern injection to the B mode image and its application in accurate tool tracking.
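
    The display-timing relationship that makes pattern injection possible follows from how a B-mode scanner maps echo arrival time t to depth c*t/2. A minimal sketch of that geometry is given below; it ignores beam geometry and element response, and the function name is ours, not the paper's.

        SPEED_OF_SOUND = 1540.0  # m/s, typical soft-tissue value assumed by scanners

        def firing_delay(tool_depth_m, mark_depth_m, c=SPEED_OF_SOUND):
            """Delay (s) between the imaging pulse reaching the tool's element
            and firing the active pulse, so the injected mark is displayed at
            mark_depth_m. An active pulse fired tau seconds after the transmit
            pulse arrives is received at time 2*tool_depth/c + tau and is
            therefore drawn at depth tool_depth + c*tau/2.
            """
            if mark_depth_m < tool_depth_m:
                raise ValueError("mark cannot appear shallower than the tool")
            return 2.0 * (mark_depth_m - tool_depth_m) / c

        print(firing_delay(0.060, 0.060))  # 0.0 s: mark the tool tip itself
        print(firing_delay(0.060, 0.065))  # ~6.5e-6 s: virtual marker 5 mm deeper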

  5. Knowledge-based decision support system for tool management in flexible manufacturing system

    Institute of Scientific and Technical Information of China (English)

    周炳海; 奚立峰; 蔡建国

    2004-01-01

    Tool management is not a single, simple activity; it comprises a complex set of functions, especially in a flexible manufacturing system (FMS) environment. The issues associated with tool management include tool requirement planning, tool real-time scheduling, tool crib management, tool inventory control, tool fault diagnosis, tool tracking and tool monitoring. In order to make tools flow into and out of an FMS efficiently, this work aims to design a knowledge-based decision support system (KBDSS) for tool management in FMS. First, an overview of tool management functions is given. Then the structure of the KBDSS for tool management and the essential agents in its design are presented. Finally, the design and development of the individual agents of the KBDSS are discussed.

  6. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  7. Building Systems: Passing Fad or Basic Tool?

    Science.gov (United States)

    Rezab, Donald

    Building systems can be traced back to a 1516 A.D. project by Leonardo da Vinci and to a variety of prefabrication projects in every succeeding century. When integrated into large and repetitive spatial units through careful design, building systems can produce an architecture of the first order, as evidenced in the award winning design of…

  8. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching and parame...

  9. Geographical Information Systems: A Tool for Institutional Research.

    Science.gov (United States)

    Prather, James E.; Carlson, Christina E.

    This paper addresses the application of Geographical Information Systems (GIS), a computerized tool for associating key information by geographical location, to the institutional research function at institutions of higher education. The first section investigates the potential of GIS as an analytical and planning tool for institutional…

  10. Bayesian data analysis tools for atomic physics

    CERN Document Server

    Trassinelli, Martino

    2016-01-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modelling uniquely. For these two studies, we implement the program Nested fit to calculate the different probability distrib...
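
    As a toy illustration of evidence-based model comparison in this spirit, the sketch below scores a "satellite line present" model against a background-only model on synthetic count data. The flat priors and brute-force Monte Carlo integration over the prior are our simplifications; the Nested fit program mentioned above uses nested sampling instead.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy spectrum: Poisson counts in 20 channels with a line near channel 8.
        x = np.arange(20)
        data = rng.poisson(5 + 10 * np.exp(-0.5 * ((x - 8) / 1.5) ** 2))

        def log_likelihood(params, with_line):
            bkg, amp, pos = params
            model = np.full(x.shape, bkg, dtype=float)
            if with_line:
                model += amp * np.exp(-0.5 * ((x - pos) / 1.5) ** 2)
            model = np.clip(model, 1e-9, None)
            # Poisson log-likelihood; the log(data!) term cancels in the ratio.
            return float(np.sum(data * np.log(model) - model))

        def log_evidence(with_line, n=20000):
            # Evidence Z = integral of likelihood over the prior, estimated by
            # averaging the likelihood over samples drawn from flat priors.
            bkg = rng.uniform(0, 20, n)
            amp = rng.uniform(0, 30, n) if with_line else np.zeros(n)
            pos = rng.uniform(0, 19, n)
            ll = np.array([log_likelihood(p, with_line)
                           for p in zip(bkg, amp, pos)])
            m = ll.max()
            return m + np.log(np.mean(np.exp(ll - m)))

        z1, z0 = log_evidence(True), log_evidence(False)
        print(f"log Bayes factor (line vs no line): {z1 - z0:.1f}")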

  11. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  12. Structure Design and Analysis of the Spindle System of a Special Double-tool Vertical Lathe

    Institute of Scientific and Technical Information of China (English)

    应富强; 李良艺; 吴灵东; 汪意

    2011-01-01

    The arrangement of the spindle system of a special double-tool vertical lathe for the machining of flanges is analyzed. Based on the mechanical characteristics of the double-tool vertical lathe and the machining features of disc parts, a reasonable shaft-segment layout and bearing arrangement are provided. A mechanical analysis of typical loading on the spindle is carried out with the aid of the finite element method, leading to the conclusion that the lathe spindle design must first satisfy the design requirements. General ideas for the design of the spindle of a double-tool vertical lathe are presented.

  13. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient indepth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
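
    A minimal sketch of the discounted-cash-flow arithmetic at the core of such an analysis, with hypothetical station costs and revenues (the actual tool additionally covers items such as financing, depreciation, and taxes):

        def npv(rate, cash_flows):
            """Net present value of yearly cash flows, year 0 first."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        # Hypothetical station: $2.8M capital cost, then 15 years of net revenue.
        flows = [-2_800_000] + [320_000] * 15

        for discount_rate in (0.05, 0.08, 0.10):
            print(f"discount rate {discount_rate:.0%}: "
                  f"NPV = ${npv(discount_rate, flows):,.0f}")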

  14. PSEE: A Tool for Parallel Systems Learning

    OpenAIRE

    E. Luque; R. Suppi; Sorribes, J.; E. Cesar; J. Falguera; Serrano, M.

    2012-01-01

    Programs for parallel computers with distributed memory are difficult to write, understand, evaluate and debug. The design and performance evaluation of algorithms is much more complex than in the conventional sequential case. The technical know-how necessary for the implementation of parallel systems is already available, but a critical problem lies in the handling of complexity. In parallel distributed-memory systems the performance is highly influenced by factors such as the interconnection scheme, granula...

  15. Comparison of emerging diagnostic tools for large commercial HVAC systems

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, Hannah; Piette, Mary Ann

    2001-04-06

    Diagnostic software tools for large commercial buildings are being developed to help detect and diagnose energy and other performance problems with building operations. These software applications utilize energy management control system (EMCS) trend log data. Due to the recent development of diagnostic tools, there has been little detailed comparison among the tools and a limited awareness of tool capabilities by potential users. Today, these diagnostic tools focus mainly on air handlers, but the opportunity exists for broadening the scope of the tools to include all major parts of heating, cooling, and ventilation systems in more detail. This paper compares several tools in the following areas: (1) Scope, intent, and background; (2) Data acquisition, pre-processing, and management; (3) Problems detected; (4) Raw data visualization; (5) Manual and automated diagnostic methods and (6) Level of automation. This comparison is intended to provide practitioners and researchers with a picture of the current state of diagnostic tools. There is tremendous potential for these tools to help improve commercial building energy and non-energy performance.
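
    As an illustration of the rule-based end of this tool spectrum, the sketch below scans EMCS trend-log samples for two classic air-handler faults; the record fields, values, and thresholds are hypothetical:

        # Each record is one EMCS trend-log sample for an air handler (0-1 = fraction open).
        trend_log = [
            {"time": "08:00", "heating_valve": 0.0, "cooling_valve": 0.4, "oa_damper": 0.3},
            {"time": "08:15", "heating_valve": 0.6, "cooling_valve": 0.5, "oa_damper": 0.3},
            {"time": "08:30", "heating_valve": 0.7, "cooling_valve": 0.6, "oa_damper": 0.0},
        ]

        def detect_faults(samples, threshold=0.05):
            faults = []
            for s in samples:
                # Simultaneous heating and cooling wastes energy.
                if s["heating_valve"] > threshold and s["cooling_valve"] > threshold:
                    faults.append((s["time"], "simultaneous heating and cooling"))
                # A fully closed outdoor-air damper during occupancy starves ventilation.
                if s["oa_damper"] <= 0.0:
                    faults.append((s["time"], "outdoor-air damper fully closed"))
            return faults

        for stamp, message in detect_faults(trend_log):
            print(stamp, "-", message)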

  16. Theoretical analysis tools in building business competitiveness

    Directory of Open Access Journals (Sweden)

    Yeisson Diego Tamayo

    2015-12-01

    Full Text Available Due to the internationalization of markets arising from firms' free trade agreements and the standardization process in Colombia, companies increasingly seek to satisfy the needs of their customers, so business management systems carry considerable weight in the phases of business development. In Colombia, management systems have boomed in the financial system, so much so that there is a quality manual for the financial supervision of Colombia; at the microeconomic level, however, firms have not developed this topic, or at least there is no evidence of such development. It is therefore necessary to analyze business management models at the international level in order to identify which elements or strategies can be applied at each stage of business development, based on the measurement of indicator variables of management compliance in the Colombian context.

  17. A Geographic and Functional Network Flow Analysis Tool

    Science.gov (United States)

    2014-06-01

    [Abstract not available: the record's extracted text is table-of-contents and figure-list residue, referencing information systems tools, network flow models, and a case study of a fiber-optic communications backbone in a notional country called Dystopia.]

  18. Giyoo Hatano's Analysis of Psychological Tools

    Science.gov (United States)

    Cole, Michael

    2007-01-01

    This paper focuses on the relation between two areas of research which owe a great debt to the work of Giyoo Hatano: the ways in which the use of the abacus mediates arithmetic problem solving and the way in which the use of the kanji writing system mediates the interpretation of unfamiliar words. These examples are related to L. S. Vygotsky's…

  19. EXPERT SYSTEMS - DEVELOPMENT OF AGRICULTURAL INSURANCE TOOL

    Directory of Open Access Journals (Sweden)

    NAN Anca-Petruţa

    2013-07-01

    Full Text Available Because specialty agricultural assistance is not always available when farmers need it, we identified expert systems as a strong instrument with extended potential in agriculture. Expert systems have recently started to grow in scale across all socio-economic fields of activity; their role is to collect data on different aspects from human experts in order to assist the user through the steps needed to solve problems at the performance level of the expert, making the expert's acquired knowledge and experience available. We opted for a general presentation of expert systems, and of their necessity, because the solution for developing the agricultural system can come from artificial intelligence: implementing expert systems in the field of agricultural insurance, promoting existing insurance products, and helping farmers find options according to their needs and possibilities. The objective of this article consists of collecting data about different aspects of specific areas of interest in agricultural insurance, preparing the database, and giving a conceptual presentation of a pilot version which will become steadily richer depending on the answers received from agricultural producers, with the clearest possible exposition of the knowledge base. Our choice of theme is justified by the fact that, even though agricultural insurance plays a very important role in agricultural development, the results registered so far are modest, which is why solutions need to be found to develop the agricultural sector. The importance of this work consists in proposing an immediately viable solution corresponding to the current needs of agricultural producers, and in proposing an innovative solution, namely the implementation of expert systems in agricultural insurance as a way of promoting insurance products. Our research, even though it treats the subject at a conceptual level, aims to undertake an...

  20. DEVELOPMENT OF A WIRELINE CPT SYSTEM FOR MULTIPLE TOOL USAGE

    Energy Technology Data Exchange (ETDEWEB)

    Stephen P. Farrington; Martin L. Gildea; J. Christopher Bianchi

    1999-08-01

    The first phase of development of a wireline cone penetrometer system for multiple tool usage was completed under DOE award number DE-AR26-98FT40366. Cone penetrometer technology (CPT) has received widespread interest and is becoming more commonplace as a tool for environmental site characterization activities at several Department of Energy (DOE) facilities. Although CPT already offers many benefits for site characterization, the wireline system can improve CPT technology by offering greater utility and increased cost savings. Currently the use of multiple CPT tools during a site characterization (i.e. piezometric cone, chemical sensors, core sampler, grouting tool) must be accomplished by withdrawing the entire penetrometer rod string to change tools. This results in multiple penetrations being required to collect the data and samples that may be required during characterization of a site, and to subsequently seal the resulting holes with grout. The wireline CPT system allows multiple CPT tools to be interchanged during a single penetration, without withdrawing the CPT rod string from the ground. The goal of the project is to develop and demonstrate a system by which various tools can be placed at the tip of the rod string depending on the type of information or sample desired. Under the base contract, an interchangeable piezocone and grouting tool was designed, fabricated, and evaluated. The results of the evaluation indicate that success criteria for the base contract were achieved. In addition, the wireline piezocone tool was validated against ASTM standard cones, the depth capability of the system was found to compare favorably with that of conventional CPT, and the reliability and survivability of the system were demonstrated.

  1. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  2. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
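
    A minimal sketch of deriving ramping requirements from forecast-error statistics, using synthetic Gaussian errors in place of the actual CAISO load and renewable forecast data the tool consumes:

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical 5-minute forecast errors (MW) for net load = load - wind - solar.
        net_load_forecast_error = rng.normal(0, 120, 50_000)

        # Ramping requirement over one interval: the change in error between
        # consecutive samples; operators need capacity to cover its tails.
        ramps = np.diff(net_load_forecast_error)

        for p in (95.0, 99.0, 99.7):
            up, down = np.percentile(ramps, p), np.percentile(ramps, 100 - p)
            print(f"{p}% confidence: ramp-up {up:+.0f} MW, ramp-down {down:+.0f} MW")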

  3. Hybrid wind-PV-diesel system sizing tool development using empirical approach, life-cycle cost and performance analysis: A case study in Scotland

    OpenAIRE

    Gan, Leong Kit; Shek, Jonathan; Mueller, Markus

    2015-01-01

    The concept of off-grid hybrid wind energy system is financially attractive and more reliable than stand-alone power systems since it is based on more than one electricity generation source. One of the most expensive components in a stand-alone wind-power system is the energy storage system as very often it is oversized to increase system autonomy. In this work, we consider a hybrid system which consists of wind turbines, photovoltaic panels, diesel generator and battery storage. One of the m...

  4. Comparison and Analysis of Condition Monitoring Schemes for the Servo Feed System of CNC Machine Tools

    Institute of Scientific and Technical Information of China (English)

    韩军; 常瑞丽

    2014-01-01

    In order to obtain methods for accurately acquiring machine tool condition-monitoring signals, three schemes for acquiring such signals were studied: external sensor signals, internal sensor signals, and servo-drive monitoring-port signals. All three are reliable information sources, but they differ in signal quality and in the difficulty of acquisition. The three schemes were compared and analyzed; the results provide a reference for research on on-line monitoring technology for CNC machine tool servo feed systems and for improving machine tool reliability and machining quality.

  5. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  6. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
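
    At its core, the concordancer described reduces to a key-word-in-context (KWIC) listing; a minimal sketch (not TranslatorBank's implementation):

        import re

        def concordance(corpus, term, width=30):
            """List every occurrence of term with width characters of context."""
            lines = []
            for m in re.finditer(re.escape(term), corpus, flags=re.IGNORECASE):
                left = corpus[max(0, m.start() - width):m.start()]
                right = corpus[m.end():m.end() + width]
                lines.append(f"{left:>{width}} [{m.group()}] {right}")
            return lines

        text = ("The torque converter transmits torque from the engine. "
                "A worn torque converter may slip under load.")
        for line in concordance(text, "torque converter"):
            print(line)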

  7. Analyzing Real-Time Systems: Theory and Tools

    DEFF Research Database (Denmark)

    Hune, Thomas Seidelin

    The main topic of this dissertation is the development and use of methods for formal reasoning about the correctness of real-time systems, in particular methods and tools to handle new classes of problems. In real-time systems the correctness of the system does not only depend on the order in which actions take place, but also the timing of the actions. The formal reasoning presented here is based on (extensions of) the model of timed automata and tools supporting this model, mainly UPPAAL. Real-time systems are often part of safety critical systems e.g. control systems for planes, trains, or factories, though also everyday electronics as audio/video equipment and (mobile) phones are considered real-time systems. Often these systems are concurrent systems with a number of components interacting, and reasoning about such systems is notoriously difficult. However, since most of the systems...

  8. Integrated analysis environment for high impact systems

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, M. [Sandia National Labs., Albuquerque, NM (United States); Davis, J.; Scott, J.; Sztipanovits, J.; Karsai, G. [Vanderbilt Univ., Nashville, TN (United States). Measurement and Computing Systems Lab.

    1998-02-01

    Modeling and analysis of high consequence, high assurance systems requires special modeling considerations. System safety and reliability information must be captured in the models. Previously, high consequence systems were modeled using separate, disjoint models for safety, reliability, and security. The MultiGraph Architecture facilitates the implementation of a model integrated system for modeling and analysis of high assurance systems. Model integrated computing allows an integrated modeling technique to be applied to high consequence systems. Among the tools used for analyzing safety and reliability are a behavioral simulator and an automatic fault tree generation and analysis tool. Symbolic model checking techniques are used to efficiently investigate the system models. A method for converting finite state machine models to ordered binary decision diagrams allows the application of symbolic model checking routines to the integrated system models. This integrated approach to modeling and analysis of high consequence systems ensures consistency between the models and the different analysis tools.

  9. EXPERT SYSTEMS - DEVELOPMENT OF AGRICULTURAL INSURANCE TOOL

    OpenAIRE

    2013-01-01

    Because of the fact that specialty agricultural assistance is not always available when the farmers need it, we identified expert systems as a strong instrument with an extended potential in agriculture. This started to grow in scale recently, including all socially-economic activity fields, having the role of collecting data regarding different aspects from human experts with the purpose of assisting the user in the necessary steps for solving problems, at the performance level of the expert...

  10. Analysis of Facial Injuries Caused by Power Tools.

    Science.gov (United States)

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of the use of power tools and its economic impact, and the characteristics of the hand injuries caused by power saws have been described. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data in relation to facial injuries caused by power saws that were gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of the facial injuries caused by power saws. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurements and functional disabilities. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  11. Communication and control tools, systems, and new dimensions

    CERN Document Server

    MacDougall, Robert; Cummings, Kevin

    2015-01-01

    Communication and Control: Tools, Systems, and New Dimensions advocates a systems view of human communication in a time of intelligent, learning machines. This edited collection sheds new light on things as mundane yet still profoundly consequential (and seemingly "low-tech") today as push buttons, pagers and telemarketing systems. Contributors also investigate aspects of "remote control" related to education, organizational design, artificial intelligence, cyberwarfa

  12. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: rittner@iqm.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Lab. de Fisico-Quimica Organica; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Qumica

    2011-07-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods such as infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds belonging to a given organic function, such as ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature in this research field imposed some limitations, which will be detailed in the Introduction; it should be noted in advance that these limitations mostly concern the period in which the results were published. (author)

  13. Serial concept maps: tools for concept analysis.

    Science.gov (United States)

    All, Anita C; Huycke, LaRae I

    2007-05-01

    Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments.

  14. Metabolic engineering with systems biology tools to optimize production of prokaryotic secondary metabolites

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Charusanti, Pep; Lee, Sang Yup;

    2016-01-01

    ...for the optimal production of various prokaryotic secondary metabolites: native versus heterologous hosts (e.g., Escherichia coli) and rational versus random approaches. This comparative analysis is followed by discussions on systems biology tools deployed in optimizing the production of secondary metabolites. The potential contributions of additional systems biology tools are also discussed in the context of current challenges encountered during the optimization of secondary metabolite production.

  15. Insider Threat Assessment: Model, Analysis and Tool

    Science.gov (United States)

    Chinchani, Ramkumar; Ha, Duc; Iyer, Anusha; Ngo, Hung Q.; Upadhyaya, Shambhu

    Insider threat is typically attributed to legitimate users who maliciously leverage their system privileges, and familiarity and proximity to their computational environment to compromise valuable information or inflict damage. According to the annual CSI/FBI surveys conducted since 1996, internal attacks and insider abuse form a significant portion of reported incidents. The strongest indication yet that insider threat is very real is given by the recent study [2] jointly conducted by CERT and the US Secret Service; the first of its kind, which provides an in-depth insight into the problem in a real-world setting. However, there is no known body of work which addresses this problem effectively. There are several challenges, beginning with understanding the threat.

  16. Nanocoatings for High-Efficiency Industrial and Tooling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Blau, P; Qu, J.; Higdon, C. (Eaton Corporation)

    2011-02-01

    ...tests on process variants and developed tests to better simulate the applications of interest. ORNL also employed existing lubrication models to better understand hydraulic pump frictional behavior and test results. Phase III, “Functional Testing” focused on finalizing the strategy for commercialization of AlMgB14 coatings for both hydraulic and tooling systems. ORNL continued to provide tribology testing and analysis support for hydraulic pump applications. It included both laboratory-scale coupon testing and the analysis of friction and wear data from full component-level tests performed at Eaton Corp. Laboratory-scale tribology test methods are used to characterize the behavior of nanocomposite coatings prior to running them in full-sized hydraulic pumps. This task also includes developing tribosystems analyses, both to provide a better understanding of the performance of coated surfaces in alternate hydraulic fluids, and to help design useful laboratory protocols. Analysis also includes modeling the lubrication conditions and identifying the physical processes by which wear and friction of the contact interface changes over time. This final report summarizes ORNL’s portion of the nanocomposite coatings development effort and presents both generated data and the analyses that were used in the course of this effort.

  17. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  18. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high-speed milling of titanium alloys, the high rate of tool failure is the main reason for the high manufacturing cost. In this study, fractured tools used in a 5-axis milling process on titanium alloys were observed at the macro scale using a PG-1000 light microscope and at the micro scale using a scanning electron microscope (SEM). These observations indicate that most of the tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed. Future analysis aimed at determining the underlying causes of the fatigue phenomenon is then outlined.

  19. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  20. Configuration Design of Product Service System for CNC Machine Tools

    OpenAIRE

    Zhongqi Sheng; Tao Xu; Junyou Song

    2015-01-01

    Product service system is a combination of product and services to meet the customer requirements. Configuration design is the key process of product service system development. This paper studies the configuration design of product service system for CNC machine tools. The research explores the configuration design process of product service system development, analyzes the retrieval method of product service system schemes, determines the configuration sequence of product modules and servic...

  1. Single-cell analysis tools for drug discovery and development.

    Science.gov (United States)

    Heath, James R; Ribas, Antoni; Mischel, Paul S

    2016-03-01

    The genetic, functional or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development. Such heterogeneity hinders the design of accurate disease models and can confound the interpretation of biomarker levels and of patient responses to specific therapies. The complex nature of virtually all tissues has motivated the development of tools for single-cell genomic, transcriptomic and multiplex proteomic analyses. Here, we review these tools and assess their advantages and limitations. Emerging applications of single cell analysis tools in drug discovery and development, particularly in the field of oncology, are discussed.

  2. Survey of Human Systems Integration (HSI) Tools for USCG Acquisitions

    Science.gov (United States)

    2009-04-01

    [Abstract not available: the record's extracted text is reference-list and PDF-link residue (e.g., Liberty Mutual (2004), Manual Materials Handling Guidelines) plus fragments of tool descriptions, such as availability, cost, and contact information for Siemens PLM Software.]

  3. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
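
    A minimal sketch of calling one of these services (the NCBI BLAST REST interface) from Python is given below. The endpoint paths and parameter names follow the EBI Web Services documentation linked above, but treat them as assumptions and verify against that documentation; the sequence and e-mail address are placeholders.

        import time
        import requests

        BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

        params = {
            "email": "you@example.org",  # EBI asks for a contact address
            "program": "blastp",
            "database": "uniprotkb_swissprot",
            "stype": "protein",
            "sequence": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
        }

        # Submit the job, poll until it finishes, then fetch the text output.
        job_id = requests.post(f"{BASE}/run", data=params, timeout=30).text
        while requests.get(f"{BASE}/status/{job_id}", timeout=30).text \
                not in ("FINISHED", "ERROR", "FAILURE"):
            time.sleep(5)
        print(requests.get(f"{BASE}/result/{job_id}/out", timeout=30).text[:500])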

  4. GATA: a graphic alignment tool for comparative sequence analysis

    Directory of Open Access Journals (Sweden)

    Nix David A

    2005-01-01

    Full Text Available Abstract Background Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution, yet the enhancer retains its activity. Dot plot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments. Results To address some of these issues, we created a stand-alone, platform-independent, graphic alignment tool for comparative sequence analysis (GATA, http://gata.sourceforge.net/. GATA uses the NCBI-BLASTN program and extensive post-processing to identify all small sub-alignments above a low cut-off score. These are graphed as two shaded boxes, one for each sequence, connected by a line using the coordinate system of their parent sequence. Shading and colour are used to indicate score and orientation. A variety of options exist for querying, modifying and retrieving conserved sequence elements. Extensive gene annotation can be added to both sequences using a standardized General Feature Format (GFF file. Conclusions GATA uses the NCBI-BLASTN program in conjunction with post-processing to exhaustively align two DNA...
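
    For contrast with GATA's approach, the baseline dot plot the abstract critiques reduces to a window-match matrix with no gap handling and no alignment statistics; a minimal sketch:

        def dot_plot(seq_a, seq_b, window=3):
            """1 where the window-length substrings of the two sequences match."""
            rows = len(seq_a) - window + 1
            cols = len(seq_b) - window + 1
            return [[int(seq_a[i:i + window] == seq_b[j:j + window])
                     for j in range(cols)] for i in range(rows)]

        a, b = "GATTACAGATTACA", "GATTACATTTGATT"
        for row in dot_plot(a, b):
            print("".join("#" if hit else "." for hit in row))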

  5. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which are reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of

  6. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study of their academic achievement was carried out with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies; the option followed in the secondary studies (Art, Science and Technology, or Humanities and Social Sciences); the mark obtained in the university entrance examination; and in which of the two sittings per year of that examination the mark was obtained. Another group of 77 students was evaluated independently of the former group: those who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose, every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded according to their typology (public, private, and private subsidized) and fees. Each student was represented by their average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or...

  7. Technology Assessment Tool - An Application of Systems Engineering to USDOE Technology Proposals

    Energy Technology Data Exchange (ETDEWEB)

    Rynearson, Michael Ardel

    1999-06-01

    This paper discusses the system design for a Technology Assessment (TA) tool that can be used to quantitatively evaluate new and advanced technologies, products, or processes. Key features of the tool include organization of information in an indentured hierarchy; questions and categories derived from the decomposition of technology performance; segregation of life-cycle issues into six assessment categories; and scoring, relative impact, and sensitivity analysis capability. An advantage of the tool's use is its ability to provide decision analysis data, based on incomplete or complete data.

  8. Technology Assessment Tool - An Application of Systems Engineering to USDOE Technology Proposals

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Rynearson

    1999-06-01

    This paper discusses the system design of a Technology Assessment (TA) tool that can be used to quantitatively evaluate new and advanced technologies, products, or processes. Key features of the tool include organization of information in an indentured hierarchy; questions and categories derived from the decomposition of technology performance; segregation of life-cycle issues into six assessment categories; and scoring, relative impact, and sensitivity analysis capability. An advantage of the tool's use is its ability to provide decision analysis data, based on incomplete or complete data.
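
    A minimal sketch of the weighted scoring and sensitivity analysis such a tool performs; the abstract's six assessment categories are not named, so the categories, weights, and 0-10 scores below are hypothetical:

        # category: (weight, score from the questionnaire hierarchy)
        categories = {
            "performance": (0.30, 7.5),
            "cost":        (0.20, 6.0),
            "safety":      (0.20, 8.0),
            "maturity":    (0.10, 5.0),
            "maintenance": (0.10, 6.5),
            "waste":       (0.10, 7.0),
        }

        def total_score(cats):
            return sum(w * s for w, s in cats.values())

        base = total_score(categories)
        print(f"baseline score: {base:.2f}")

        # One-at-a-time sensitivity: shift each category score by +1 and see
        # how much the total moves (for a linear model, just the weight).
        for name, (w, s) in categories.items():
            bumped = dict(categories)
            bumped[name] = (w, s + 1)
            print(f"{name:12s} sensitivity: {total_score(bumped) - base:+.2f}")

    A structure like this also makes the "incomplete data" case in the abstract concrete: unanswered categories simply keep a default score, and the sensitivity column shows how much the missing answers could move the result.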

  9. A tool for stability and power sharing analysis of a generalized class of droop controllers for high-voltage direct-current transmission systems

    OpenAIRE

    2016-01-01

    The problem of primary control of high-voltage direct-current transmission systems is addressed in this paper, which contains four main contributions. First, to propose a new nonlinear, more realistic model for the system suitable for primary control design, which takes into account nonlinearities introduced by conventional inner controllers. Second, to determine necessary conditions (dependent on some free controller tuning parameters) for the existence of equilibria. Third, to formulate ...

  10. Eating tools in hand activate the brain systems for eating action: a transcranial magnetic stimulation study.

    Science.gov (United States)

    Yamaguchi, Kaori; Nakamura, Kimihiro; Oga, Tatsuhide; Nakajima, Yasoichi

    2014-07-01

    There is increasing neuroimaging evidence suggesting that visually presented tools automatically activate the human sensorimotor system coding learned motor actions relevant to the visual stimuli. Such crossmodal activation may reflect a general functional property of the human motor memory and thus can be operating in other, non-limb effector organs, such as the orofacial system involved in eating. In the present study, we predicted that somatosensory signals produced by eating tools in hand covertly activate the neuromuscular systems involved in eating action. In Experiments 1 and 2, we measured motor evoked response (MEP) of the masseter muscle in normal humans to examine the possible impact of tools in hand (chopsticks and scissors) on the neuromuscular systems during the observation of food stimuli. We found that eating tools (chopsticks) enhanced the masseter MEPs more greatly than other tools (scissors) during the visual recognition of food, although this covert change in motor excitability was not detectable at the behavioral level. In Experiment 3, we further observed that chopsticks overall increased MEPs more greatly than scissors and this tool-driven increase of MEPs was greater when participants viewed food stimuli than when they viewed non-food stimuli. A joint analysis of the three experiments confirmed a significant impact of eating tools on the masseter MEPs during food recognition. Taken together, these results suggest that eating tools in hand exert a category-specific impact on the neuromuscular system for eating.

  11. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth-information association, used to process simulated data during software validation, which is an important part of the development of the physics analysis tools.

  12. Designing a budgeting tool and sensitivity analysis for a start-up. Case: Witrafi Oy

    OpenAIRE

    Arafath, Muhammad

    2014-01-01

    This study presents a thesis on the topic of designing a budgeting tool and sensitivity analysis for the commissioning company. The commissioning company is a Finnish start-up currently focused on developing Intelligent Transport Systems using a network-based parking system. The aim of this thesis is to provide a ready-made budgeting tool that the commissioning company can use for its own purposes. This is a product-oriented thesis and it includes five project tasks. Pro...

  13. IT-tools for Mechatronic System Engineering and Design

    DEFF Research Database (Denmark)

    Conrad, Finn; Sørensen, Torben; Andersen, T. O.

    2003-01-01

    ...and manufacturing IT-concept feasible and commercially attractive. An IT-tool concept for modelling, simulation and design of mechatronic products and systems is proposed in this paper. It builds on results from a Danish mechatronic research programme focusing on intelligent motion control, as well as results from the Esprit project SWING on IT-tools for rapid prototyping of fluid power components and systems. A mechatronic test facility for a hydraulic robot and a CNC XY-machine table was implemented. The controller applies digital signal processors (DSPs). The DSP controller utilizes the dSPACE System suitable...

  14. Topics in expert system design methodologies and tools

    CERN Document Server

    Tasso, C

    1989-01-01

    Expert Systems are so far the most promising achievement of artificial intelligence research. Decision making, planning, design, control, supervision and diagnosis are areas where they are showing great potential. However, the establishment of expert system technology and its actual industrial impact are still limited by the lack of a sound, general and reliable design and construction methodology.This book has a dual purpose: to offer concrete guidelines and tools to the designers of expert systems, and to promote basic and applied research on methodologies and tools. It is a coordinated coll

  15. DFTCalc: reliability centered maintenance via fault tree analysis (tool paper)

    NARCIS (Netherlands)

    Guck, Dennis; Spel, Jip; Stoelinga, Mariëlle; Butler, Michael; Conchon, Sylvain; Zaïdi, Fatiha

    2015-01-01

    Reliability, availability, maintenance and safety (RAMS) analysis is essential in the evaluation of safety critical systems like nuclear power plants and the railway infrastructure. A widely used methodology within RAMS analysis are fault trees, representing failure propagations throughout a system.
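
    For static fault trees with independent basic events, the failure-propagation arithmetic reduces to a few gate formulas; a minimal sketch is given below. The tree and probabilities are hypothetical, and DFTCalc itself targets dynamic fault trees and maintenance strategies, which this does not capture.

        def and_gate(*probs):
            """Probability that all independent input events occur."""
            p = 1.0
            for q in probs:
                p *= q
            return p

        def or_gate(*probs):
            """Probability that at least one independent input event occurs."""
            p = 1.0
            for q in probs:
                p *= 1.0 - q
            return 1.0 - p

        # Hypothetical tree: top = pump fails AND (valve stuck OR sensor fails).
        pump_fails, valve_stuck, sensor_fails = 1e-3, 5e-4, 2e-3
        top = and_gate(pump_fails, or_gate(valve_stuck, sensor_fails))
        print(f"top event probability: {top:.2e}")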

  16. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.

  17. Computer System Reliability Allocation Method and Supporting Tool

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper presents a computer system reliability allocation method, based on statistics and Markov chain theory, which can be used to allocate reliability to subsystems, to hybrid systems and to software modules. A relevant supporting tool built by us is introduced.

  18. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid;

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithms).

  19. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    Science.gov (United States)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, of which a small portion produces earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.

  20. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Background: Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run with UNIX-like operating systems. Graphical output is often not provided, or is expected to be produced with other command line tools, e.g. gnuplot. Results: Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions: Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  1. A Case Study for Tooling the Design Trajectory of Embedded Control Systems

    NARCIS (Netherlands)

    Jovanovic, Dusko; Hilderink, Gerald H.; Broenink, Jan F.; Schweizer, M.

    2002-01-01

    We strive to allow a mechatronic system designer the power of designing mechatronic systems in manners of concurrent engineering in short time at a fraction of the present day costs. In our context, this means a methodology and tool support to stepwise refinement in analysis and design of the plant

  2. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of AUC, the density of event in probability bins and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
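
    The AUC reported by such tools has a concrete interpretation: the probability that a randomly chosen event cell scores higher than a randomly chosen non-event cell. The sketch below computes it via this Mann-Whitney formulation; the names are illustrative and unrelated to Dinamica EGO's interface.

    ```python
    def auc(scores_event, scores_nonevent):
        """P(score of a random event cell > score of a random non-event cell)."""
        wins = 0.0
        for se in scores_event:
            for sn in scores_nonevent:
                if se > sn:
                    wins += 1.0
                elif se == sn:
                    wins += 0.5  # ties count half
        return wins / (len(scores_event) * len(scores_nonevent))

    print(auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.3]))  # 0.888...
    ```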

  3. DESIGN AND CAD SYSTEM OF THE TOOL FOR DRILL FLUTE

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on the principles of differential geometry and kinematics, a mathematical model is developed to describe the grinding wheel axial cross-section with the radial cross-section of the flute in a given drill under the basic engagement condition between the generating flute and the generated grinding wheel (or disk milling tool). The mathematical model applies to flutes whose radial cross-section consists of three arcs. Furthermore, a CAD system is also developed to represent the axial cross-section of the grinding wheel (or disk milling tool). With the system, the grinding wheel (or disk milling tool) axial cross-section that corresponds to the three-arc flute cross-section can be conveniently simulated. Through grinding experiments on drill flutes, the method and the CAD system are proved to be feasible and reasonable.

  4. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence ("BI") is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is...

  5. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    Science.gov (United States)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. ZMAP code is open source, written in Matlab by The MathWorks, a commercial language widely used in the natural sciences. ZMAP was first published in 1994, and has continued to grow over the past 7 years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
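
    One representative ZMAP computation is the Gutenberg-Richter b-value. The sketch below uses Aki's (1965) maximum-likelihood estimator on a fabricated catalog; ZMAP's own implementation adds corrections (e.g. for magnitude binning) omitted here.

    ```python
    import math

    def b_value(magnitudes, completeness_mag):
        """Aki (1965): b = log10(e) / (mean(M) - Mc) for events with M >= Mc."""
        above = [m for m in magnitudes if m >= completeness_mag]
        return math.log10(math.e) / (sum(above) / len(above) - completeness_mag)

    catalog = [2.1, 2.3, 2.0, 2.8, 2.2, 3.1, 2.4, 2.0, 2.6, 2.2]
    print(f"b-value: {b_value(catalog, 2.0):.2f}")  # ~1.17
    ```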

  6. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  7. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  8. An Automated Tool for Supporting FMEAs of Digital Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.; Lehner, J.

    2008-09-07

    Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was proposed for this purpose: decomposing the system to the level of the generic digital components and propagating failure modes up to the system level, which generally is time-consuming and difficult to implement. To overcome the issues of implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is in essence a simulation platform developed by using or recreating the original source code of the different software modules, interfaced by input and output variables that represent physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on associated signals are determined first, and the variables that correspond to these signals are modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
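
    The failure-injection idea can be pictured with a toy control loop (hypothetical names, not the DFWCS tool's code): override a sensed signal with a failure mode, re-run the loop, then apply a loss-of-control criterion.

    ```python
    def simulate(setpoint, steps, fault=None):
        """Trivial proportional controller; `fault` overrides the sensor signal."""
        level, history = 0.0, []
        for _ in range(steps):
            sensed = fault(level) if fault else level
            level += 0.5 * (setpoint - sensed)  # actuator response
            history.append(level)
        return history

    def lost_control(history, setpoint, tol=0.5):
        """Failure criterion: the final level stays outside tolerance."""
        return abs(history[-1] - setpoint) > tol

    nominal = simulate(10.0, 50)
    stuck_low = simulate(10.0, 50, fault=lambda _: 0.0)  # sensor stuck at zero
    print(lost_control(nominal, 10.0), lost_control(stuck_low, 10.0))  # False True
    ```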

  9. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    Science.gov (United States)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30° sector width based on a previous AMU study that determined thunderstorm anvils move in a direction within plus or minus 15° of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
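
    The underlying geometry is straightforward to sketch (illustrative only, not the AMU tool's code): place each arc upwind of the site at the distance the average upper-level wind would cover, across the 30-degree sector described above.

    ```python
    import math

    def arc_points(lat, lon, wind_dir_deg, wind_speed_kt, hours, n=7):
        """Points (deg) on the arc `hours` upwind; flat-earth approximation."""
        dist_nm = wind_speed_kt * hours
        pts = []
        for i in range(n):
            brg = math.radians(wind_dir_deg - 15 + 30 * i / (n - 1))  # into-wind bearing
            dlat = dist_nm * math.cos(brg) / 60.0  # 1 deg latitude ~ 60 nm
            dlon = dist_nm * math.sin(brg) / (60.0 * math.cos(math.radians(lat)))
            pts.append((lat + dlat, lon + dlon))
        return pts

    print(arc_points(28.5, -80.6, wind_dir_deg=270.0, wind_speed_kt=30.0, hours=2)[:2])
    ```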

  10. Multilingual lexicon design tool and database management system for MT

    CERN Document Server

    Barisevičius, G

    2011-01-01

    The paper presents the design and development of an English-Lithuanian-English dictionary/lexicon tool and lexicon database management system for MT. The system is oriented to support two main requirements: to be open to the user, and to describe many more attributes of the parts of speech than a regular dictionary, as required for MT. The programming language Java and the database management system MySQL are used to implement the design tool and lexicon database, respectively. This solution allows the system to be easily deployed on the Internet. The system is able to run on various OSs, such as Windows, Linux, Mac and other OSs where a Java Virtual Machine is supported. Since a modern lexicon database management system is used, several users can access the same database without problems.

  11. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
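
    A toy rendition of the combination may help fix ideas: below, a discrete-event task loop draws its daily completion rate from a continuously updated productivity state of the kind a system dynamics model would supply. All numbers are invented and this is not the paper's model.

    ```python
    import random

    random.seed(1)
    productivity = 1.0        # continuous state (tasks/day), system dynamics side
    backlog, day, done = 40, 0, 0

    while backlog > 0:
        day += 1
        # System dynamics update: fatigue erodes productivity toward 0.7 tasks/day.
        productivity += 0.1 * (0.7 - productivity)
        # Discrete-event side: complete a random number of tasks today.
        completed = min(backlog, max(0, round(random.gauss(productivity, 0.5))))
        backlog -= completed
        done += completed

    print(f"Finished {done} tasks in {day} days")
    ```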

  12. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  13. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    Although software usability has long been emphasized, there is a lot of software with poor usability. In Usability Engineering, usability professionals prescribe a classical usability approach to improving software usability. It is essential to prototype and usability test user interfaces before... However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interfaces with interactions and real data. We developed VisTool – a user interface and visualization development system – to simplify user interface development. VisTool allows user interface development without real programming. With VisTool a designer assembles visual objects (e.g. textboxes, ellipses, etc.) to visualize database contents. In VisTool, visual properties (e.g. colour, position, etc.) can be formulas...

  14. Smart adaptronic hydrostatic guiding system for machine tool slides

    Science.gov (United States)

    Munzinger, C.; Weis, M.; Herder, S.

    2009-03-01

    Guiding systems are among the central components in the force flux of a machine tool. Their characteristics have a direct impact on machining accuracy. Hydrostatic guiding systems are preferred when specific requirements must be met with regard to accuracy, stiffness and damping. However, active intervention in the guiding system of such conventional systems, i.e. to absorb geometrical errors of the guide rails, has so far not been possible. Compared to modular, conventional systems, adaptronic systems offer considerable cost-saving potential thanks to their increased degree of functional integration [1].

  15. DYNAMICS ANALYSIS OF SPECIAL STRUCTURE OF MILLING-HEAD MACHINE TOOL

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The milling-head machine tool is a sophisticated, high-quality machine tool whose spindle system is made up of a special multi-element structure. Two special mechanical configurations degrade the cutting performance of the machine tool. One is the milling-head spindle, supported on two sets of complex bearings; the dynamic rigidity of the milling-head structure is investigated on a digital prototype with finite element analysis (FEA) and modal synthesis analysis (MSA) to identify the weak structures. The other is the ram structure from which the milling head hangs; it is analyzed to obtain its dynamic performance when cutting at different ram extension positions. The analysis results for the spindle and ram are used to improve the mechanical configuration and structure of the design. The machine tool built with the modified structure achieves better dynamic rigidity than before.

  16. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2002-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures.Distribution System Modeling and Analysis helps prevent those errors. It gives re

  17. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects and designers.

  18. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  19. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  20. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented as an extension to latent semantic indexing... Different ICA algorithms were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for the visualization of search engine results.

  1. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    Science.gov (United States)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
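
    The correlation point can be made concrete: for two redundant units with equal failure probability p whose failure indicators have correlation rho, P(both fail) = p^2 + rho*p*(1-p). A small sketch of that arithmetic (not SMART's method):

    ```python
    # Parallel system of two identical units fails only if both units fail.
    def p_both_fail(p: float, rho: float) -> float:
        return p * p + rho * p * (1.0 - p)

    for rho in (0.0, 0.3, 0.6):
        print(f"rho={rho:.1f}: P(system fails) = {p_both_fail(0.1, rho):.4f}")
    # Independence (rho=0) gives 0.0100; positive correlation rapidly erodes
    # the benefit of the backup.
    ```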

  2. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit – SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity applications.

  3. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit - SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity applications.
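
    A minimal Python rendition of the finite-state-machine half of this idea follows (hypothetical classes, not the SMI++ language): a parent sub-system derives its state from its children, and a rule reacts to error states with a recovery action.

    ```python
    class Device:
        def __init__(self, name, state="READY"):
            self.name, self.state = name, state

    class SubSystem:
        def __init__(self, name, children):
            self.name, self.children = name, children

        @property
        def state(self):
            """Parent state is a pure function of the child states."""
            states = {c.state for c in self.children}
            return "ERROR" if "ERROR" in states else "READY"

        def react(self):
            """Rule: on ERROR, try to recover the failing children."""
            if self.state == "ERROR":
                for c in self.children:
                    if c.state == "ERROR":
                        c.state = "READY"  # stand-in for a real recovery action

    dcs = SubSystem("DCS", [Device("HV"), Device("LV", state="ERROR")])
    print(dcs.state)  # ERROR
    dcs.react()
    print(dcs.state)  # READY
    ```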

  4. A Scheduling System Based on Rules of the Machine Tools in FMS

    Institute of Scientific and Technical Information of China (English)

    LI De-xin; ZHAO Hua-qun; JIA Jie; LU Yan-jun

    2003-01-01

    In this paper, a model of the scheduling of machine tools in a flexible manufacturing line is presented, based on intensive analysis and research of the mathematical methods of traditional scheduling. The various factors related to machine tools in the flexible manufacturing line are fully considered in this system. Based on this model, an intelligent decision system integrating rules and simulation technology is constructed using the OOP (Object-Oriented Programming) method, and a simulation experiment analysis is carried out. The results show that the model performs well in practice.

  5. Multidimensional analysis: a management tool for monitoring HIPAA compliance and departmental performance.

    Science.gov (United States)

    Coleman, Robert M; Ralston, Matthew D; Szafran, Alexander; Beaulieu, David M

    2004-09-01

    Most RIS and PACS systems include extensive auditing capabilities as part of their security model, but inspecting those audit logs to obtain useful information can be a daunting task. Manual analysis of audit trails, though cumbersome, is often resorted to because of the difficulty of constructing queries to extract complex information from the audit logs. The approach proposed by the authors uses standard off-the-shelf multidimensional analysis software tools to assist the PACS/RIS administrator and/or security officer in analyzing those audit logs to identify and scrutinize suspicious events. Large amounts of data can be quickly reviewed, and graphical analysis tools help explore system utilization. While additional efforts are required to fully satisfy the demands of the ever-increasing security and confidentiality pressures, multidimensional analysis tools are a practical step toward actually using the information that is already being captured in the systems' audit logs. In addition, once the work is performed to capture and manipulate the audit logs into a viable format for the multidimensional analysis tool, it is relatively easy to extend the system to incorporate other pertinent data, thereby enabling the ongoing analysis of other aspects of the department's workflow.
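
    The pivoting at the heart of such multidimensional analysis is easy to demonstrate. The sketch below uses pandas rather than the off-the-shelf OLAP software the authors used, and the audit records are fabricated.

    ```python
    import pandas as pd

    log = pd.DataFrame({
        "user":   ["tech1", "tech1", "clerk2", "clerk2", "clerk2", "tech1"],
        "action": ["view",  "view",  "view",   "export", "export", "view"],
        "after_hours": [False, False, True, True, True, False],
    })

    # Count events per user along the action/after-hours dimensions.
    cube = pd.crosstab(log["user"], [log["action"], log["after_hours"]])
    print(cube)  # clerk2's after-hours exports stand out for review
    ```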

  6. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft]

    Science.gov (United States)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  7. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
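
    In the simplest case of Gaussian likelihoods with known errors, combining two diagnostics reduces to inverse-variance weighting, which shows why the joint estimate is more precise than either input. The numbers below are illustrative; the actual tool computes a full Bayesian posterior, with MCMC sampling for flexibility.

    ```python
    def combine(measurements):
        """measurements: list of (value, one-sigma error) for the same quantity."""
        weights = [1.0 / err**2 for _, err in measurements]
        value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
        error = (1.0 / sum(weights)) ** 0.5
        return value, error

    te_sxr = (310.0, 40.0)   # eV, soft x-ray estimate (made-up numbers)
    te_ts  = (290.0, 25.0)   # eV, Thomson scattering estimate
    value, error = combine([te_sxr, te_ts])
    print(f"Combined Te = {value:.0f} +/- {error:.0f} eV")  # tighter than either input
    ```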

  8. A thermodynamic evaluation of chilled water central air conditioning systems using artificial intelligence tools

    Directory of Open Access Journals (Sweden)

    Juan Carlos Armas

    2011-05-01

    An analysis of a chilled water central air conditioning system is presented. The objective was to calculate the main cycle components' irreversibility, as well as to evaluate this indicator's sensitivity to operational variations. Artificial neural networks (ANN), genetic algorithms (GA) and Matlab tools were used to calculate refrigerant thermodynamic properties during each cycle stage. These tools interacted with equations describing the system's thermodynamic behaviour. The refrigerant temperature at the compressor discharge was determined by a hybrid model combining the neural model with a simple genetic algorithm used as an optimisation tool; the cycle components most sensitive to changes in working conditions were identified. It was concluded that the compressor, evaporator and expansion mechanism (in that order) represented significant exergy losses, reaching 85.62% of total system irreversibility. A very useful tool was thus developed for evaluating these systems.

  9. Toward the development of an image quality tool for active millimeter wave imaging systems

    Science.gov (United States)

    Barber, Jeffrey; Weatherall, James C.; Greca, Joseph; Smith, Barry T.

    2015-05-01

    Preliminary design considerations for an image quality tool to complement millimeter wave imaging systems are presented. The tool is planned for use in confirming operating parameters, confirming continuity across imaging component design changes, and analyzing new components and detection algorithms. Potential embodiments of an image quality tool may contain materials that mimic human skin in order to provide a realistic signal return for testing, which may also help reduce or eliminate the need for mock passengers during developmental testing. Two candidate materials, a dielectric liquid and an iron-loaded epoxy, have been identified, and reflection measurements have been performed using laboratory systems in the range 18 - 40 GHz. Results show good agreement with both laboratory and literature data on human skin, particularly in the range of operation of two commercially available millimeter wave imaging systems. Issues related to the practical use of liquids and magnetic materials for image quality tools are discussed.

  10. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    ...is to investigate the use of a combination of off-the-shelf techniques from the literature in the analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  11. Faculty Usage of Library Tools in a Learning Management System

    Science.gov (United States)

    Leeder, Chris; Lonn, Steven

    2014-01-01

    To better understand faculty attitudes and practices regarding usage of library-specific tools and roles in a university learning management system, log data for a period of three semesters was analyzed. Academic departments with highest rates of usage were identified, and faculty users and nonusers within those departments were surveyed regarding…

  12. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  13. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  14. An Intelligent Tool to support Requirements Analysis and Conceptual Design of Database Design

    Institute of Scientific and Technical Information of China (English)

    王能斌; 刘海青

    1991-01-01

    As an application of artificial intelligence and expert system technology to database design, this paper presents an intelligent design tool, NITDT, which comprises a requirements specification language NITSL, a knowledge representation language NITKL, and an inference engine with uncertainty reasoning capability. NITDT now covers the requirements analysis and conceptual design of database design. However, it is possible to integrate it with another database design tool, NITDBA, also developed at NIT, to become an integrated design tool supporting the whole process of database design.

  15. Urban solid waste collection system using mathematical modelling and tools of geographic information systems.

    Science.gov (United States)

    Arribas, Claudia Andrea; Blazquez, Carola Alejandra; Lamas, Alejandro

    2010-04-01

    A poorly designed urban solid waste collection system has an enormous impact on labour, operational and transport costs, and on society in general due to road contamination and negative effects on public health and the environment. This study proposes a methodology for designing an urban solid waste collection system. This methodology uses combinatorial optimisation, integer programming, and GIS tools to minimise collection time and operational and transport costs while enhancing the current solid waste collection system. This methodology establishes feasible collection routes, determines an adequate vehicle fleet size and presents a comparative cost and sensitivity analysis of the results. The implementation of this methodology in a case study of a zone in Santiago yields significant cost savings in the total collection system.
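
    As a deliberately small stand-in for the routing step (the paper uses combinatorial optimisation and integer programming, not this heuristic), the sketch below builds a nearest-neighbour route over made-up depot and stop coordinates.

    ```python
    import math

    def nearest_neighbour_route(depot, stops):
        """Greedy route: repeatedly visit the closest remaining stop."""
        route, current, remaining = [depot], depot, list(stops)
        while remaining:
            nxt = min(remaining, key=lambda p: math.dist(current, p))
            remaining.remove(nxt)
            route.append(nxt)
            current = nxt
        route.append(depot)  # return to the depot
        return route

    depot = (0.0, 0.0)
    stops = [(2.0, 1.0), (1.0, 4.0), (5.0, 2.0), (3.0, 3.0)]
    print(nearest_neighbour_route(depot, stops))
    ```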

  16. Development of the ECLSS Sizing Analysis Tool and ARS Mass Balance Model Using Microsoft Excel

    Science.gov (United States)

    McGlothlin, E. P.; Yeh, H. Y.; Lin, C. H.

    1999-01-01

    The development of a Microsoft Excel-compatible Environmental Control and Life Support System (ECLSS) sizing analysis "tool" for conceptual design of Mars human exploration missions makes it possible for a user to choose a certain technology in the corresponding subsystem. This tool estimates the mass, volume, and power requirements of every technology in a subsystem and the system as a whole. Furthermore, to verify that a design sized by the ECLSS Sizing Tool meets the mission requirements and integrates properly, mass balance models that solve for component throughputs of such ECLSS systems as the Water Recovery System (WRS) and Air Revitalization System (ARS) must be developed. The ARS Mass Balance Model will be discussed in this paper.
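
    The bookkeeping such a sizing tool performs can be pictured as follows; the technology choices and numbers are placeholders, not the paper's data.

    ```python
    technologies = {
        # subsystem: (chosen technology, mass_kg, volume_m3, power_kW)
        "CO2 removal":    ("4-bed molecular sieve", 200.0, 0.6, 0.9),
        "Water recovery": ("vapor compression distillation", 450.0, 1.2, 1.1),
        "O2 generation":  ("water electrolysis", 150.0, 0.4, 2.3),
    }

    # Sum mass, volume, and power over the chosen technologies.
    total = [sum(spec[i] for spec in technologies.values()) for i in (1, 2, 3)]
    for subsystem, (choice, m, v, p) in technologies.items():
        print(f"{subsystem:14s} {choice:32s} {m:6.1f} kg {v:4.1f} m3 {p:4.1f} kW")
    print(f"{'TOTAL':47s} {total[0]:6.1f} kg {total[1]:4.1f} m3 {total[2]:4.1f} kW")
    ```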

  17. Accuracy Analysis and Calibration of Gantry Hybrid Machine Tool

    Institute of Scientific and Technical Information of China (English)

    唐晓强; 李铁民; 尹文生; 汪劲松

    2003-01-01

    The kinematic accuracy is a key factor in the design of parallel or hybrid machine tools. This analysis improved the accuracy of a 4-DOF (degree of freedom) gantry hybrid machine tool based on a 3-DOF planar parallel manipulator by compensating for various positioning errors. The machine tool architecture was described with the inverse kinematic solution. The control parameter error model was used to analyze the accuracy of the 3-DOF planar parallel manipulator and to develop a kinematic calibration method. The experimental results prove that the calibration method reduces the cutter nose errors from ±0.50 mm to ±0.03 mm for a horizontal movement of 600 mm by compensating for errors in the slider home position, the guide way distance and the extensible strut home position. The calibration method will be useful for similar types of parallel kinematic machines.

  18. NC flame pipe cutting machine tool based on open architecture CNC system

    Institute of Scientific and Technical Information of China (English)

    Xiaogen NIE; Yanbing LIU

    2009-01-01

    Based on an analysis of the principle and flame movement of a pipe cutting machine tool, a retrofit NC flame pipe cutting machine tool (NFPCM) that can meet the demands of cutting various pipes is proposed. The paper deals with the design and implementation of an open architecture CNC system for the NFPCM; many aspects of the machine are similar to milling machines, but its machining processes and control strategies differ. The paper emphasizes the NC system structure and the method for directly creating the NC file according to the cutting type and parameters. Further, the paper develops the program and sets up the open, modular NC system.

  19. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools... of the framework. The issue of commercial simulators or software providing the necessary features for product-process synthesis-design, as opposed to their development by the academic PSE community, will also be discussed. An example of a successful collaboration between academia and industry for the development...

  20. AUTOMATIC TOOL-CHANGING WITHIN THE RECONFIGURABLE MANUFACTURING SYSTEMS PARADIGM

    Directory of Open Access Journals (Sweden)

    J.E.T. Collins

    2012-01-01

    ENGLISH ABSTRACT: Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer requirements present in today’s global market. The systems are designed to offer adaptability in machining functions and processes. This adaptive capability requires access to a selection of tools. The development of reconfigurable manufacturing systems has mainly been focused on the machine tools themselves. Methods of supplying tools to these machines need to be researched. This paper does so, presenting a tool-changing unit that offers a solution to this need. It then discusses the enabling technologies that would allow for automatic integration and diagnostic abilities of the unit.

    AFRIKAANSE OPSOMMING (translated): Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer needs present in today's global market. The systems were developed to offer adaptability with regard to machining functions and processes. These adaptive capabilities, however, require access to a variety of tools. The development of reconfigurable manufacturing systems has, however, focused mainly on the tools themselves. The manner in which these tools are made available to the machinery still needs to be researched. This article does exactly that, proposing a unit for the exchange of tools. Furthermore, the technologies that make automatic integration possible and provide diagnostic capabilities are discussed.

  1. Development and maintenance of product configuration systems: Requirements for a documentation tool

    DEFF Research Database (Denmark)

    Hvam, Lars; Christensen, Simon Pape; Jensen, Klaes Ladeby

    2005-01-01

    Product configuration systems are increasingly applied to automate the configuration of complex products. A configuration system is an expert system designed to combine specified modules according to constraints. The constraints are stored as product data and rules in a product model, and one... and structured by using object-oriented system development techniques based on an analysis of the existing product configuration processes of the companies. The analysis was based on the procedure for building configuration systems as developed at CPM, and revealed several common requirements within the different companies. Significance: There is an actual need for a documentation tool for product configuration systems. The objective of the study is to identify and capture the requirements of such a documentation tool, and to serve as a basis for future design and implementation.

  2. AstroStat - A VO Tool for Statistical Analysis

    CERN Document Server

    Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

    2015-01-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analysis is done using the public domain statistical software R, and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features - as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing powerful statistical analysis on ...

  3. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  4. TMVA - Tool-kit for Multivariate Data Analysis in ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Therhaag, Jan; Von Toerne, Eckhard [Univ. Bonn, Physikalisches Institut, Nussallee 12, 53115 Bonn (Germany); Hoecker, Andreas; Speckmayer, Peter [European Organization for Nuclear Research - CERN, CH-1211 Geneve 23 (Switzerland); Stelzer, Joerg [Deutsches Elektronen-Synchrotron - DESY, Platanenallee 6, D-15738 Zeuthen (Germany); Voss, Helge [Max-Planck-Institut fuer Kernphysik - MPI, Postfach 10 39 80, Saupfercheckweg 1, DE-69117 Heidelberg (Germany)

    2010-07-01

    Given the ever-increasing complexity of modern HEP data analysis, multivariate analysis techniques have proven an indispensable tool in extracting the most valuable information from the data. TMVA, the Tool-kit for Multivariate Data Analysis, provides a large variety of advanced multivariate analysis techniques for both signal/background classification and regression problems. In TMVA, all methods are embedded in a user-friendly framework capable of handling the pre-processing of the data as well as the evaluation of the results, thus allowing for a simple use of even the most sophisticated multivariate techniques. Convenient assessment and comparison of different analysis techniques enable the user to choose the most efficient approach for any particular data analysis task. TMVA is an integral part of the ROOT data analysis framework and is widely-used in the LHC experiments. In this talk I will review recent developments in TMVA, discuss typical use-cases in HEP and present the performance of our most important multivariate techniques on example data by comparing it to theoretical performance limits. (authors)

  5. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
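
    A stripped-down version of the detection idea is to flag profile points where the downstream channel slope steepens abruptly relative to the upstream slope. This is only an illustration, not the Knickpoint Finder algorithm, which follows Hack's stream-gradient method on a DEM.

    ```python
    def knickpoints(distance_m, elevation_m, ratio=2.0):
        """Indices where downstream slope exceeds `ratio` x the upstream slope."""
        flagged = []
        for i in range(1, len(distance_m) - 1):
            up = (elevation_m[i - 1] - elevation_m[i]) / (distance_m[i] - distance_m[i - 1])
            down = (elevation_m[i] - elevation_m[i + 1]) / (distance_m[i + 1] - distance_m[i])
            if up > 0 and down > ratio * up:
                flagged.append(i)
        return flagged

    dist = [0, 100, 200, 300, 400, 500]
    elev = [500, 495, 490, 470, 450, 445]  # abrupt steepening after 200 m
    print(knickpoints(dist, elev))         # [2]
    ```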

  6. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers using the Windows, Macintosh, Linux, or UNIX operating systems. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. To that end, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  7. Film analysis systems and applications

    Energy Technology Data Exchange (ETDEWEB)

    Yonekura, Y.; Brill, A.B.

    1981-01-01

    The different components that can be used in modern film analysis systems are reviewed. TV camera and charge-coupled device sensors coupled to computers provide low-cost systems for applications such as those described. The autoradiography (ARG) method provides an important tool for medical research and is especially useful for the development of new radiopharmaceutical compounds. Biodistribution information is needed for estimation of radiation dose and for interpretation of the significance of observed patterns. The need for such precise information is heightened when one seeks to elucidate physiological principles/factors in normal and experimental models of disease. The poor spatial resolution achieved with current PET-imaging systems limits the information on radioreceptor mapping, neurotransmitter, and neuroleptic drug distribution that can be achieved from patient studies. The artful use of ARG in carefully controlled animal studies will be required to provide the additional information needed to fully understand results obtained with this important new research tool. (ERB)

  8. Automatic Tool Path Generation for Robot Integrated Surface Sculpturing System

    Science.gov (United States)

    Zhu, Jiang; Suzuki, Ryo; Tanaka, Tomohisa; Saito, Yoshio

    In this paper, a surface sculpturing system based on an 8-axis robot is proposed, and the CAD/CAM software and tool path generation algorithm for this sculpturing system are presented. The 8-axis robot is composed of a 6-axis manipulator and a 2-axis worktable; it carves blocks of polystyrene foam with heated cutting tools. A multi-DOF (degree of freedom) robot offers faster fabrication than traditional RP (rapid prototyping) methods and more flexibility than CNC machining. With the flexibility derived from its 8-axis configuration, as well as efficient custom-developed software for rough cutting and finish cutting, this surface sculpturing system can carve sculptured surfaces accurately and efficiently.

  9. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
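
    As an illustration of the plug-in idea described above (hypothetical names; this is not the SBRT's actual plug-in API, whatever its implementation language), a registry-based plug-in pattern might look like:

```python
# Minimal sketch of a plug-in registry: new analysis techniques register
# themselves and become available without touching the core dispatch code.
from typing import Callable, Dict

ANALYSES: Dict[str, Callable] = {}   # registry of available analysis methods

def register(name: str):
    """Decorator that adds an analysis routine to the registry."""
    def wrap(fn: Callable) -> Callable:
        ANALYSES[name] = fn
        return fn
    return wrap

@register("flux_balance")
def flux_balance(network: str) -> str:
    return f"ran flux balance analysis on {network}"

# A new technique plugs in by decoration; callers look it up by name:
print(ANALYSES["flux_balance"]("toy stoichiometric network"))
```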

  10. Geographic Information System Tools for Conservation Planning: User's Manual

    Science.gov (United States)

    Fox, Timothy J.; Rohweder, Jason J.; Kenow, K.P.; Korschgen, C.E.; DeHaan, H.C.

    2003-01-01

    Public and private land managers desire better ways to incorporate landscape, species, and habitat relations into their conservation planning processes. We present three tools, developed for the Environmental Systems Research Institute's ArcView 3.x platform, applicable to many types of wildlife conservation management and planning efforts. These tools provide managers and planners with the ability to rapidly assess landscape attributes and link these attributes with species-habitat information. To use the tools, the user provides a detailed land cover spatial database and develops a matrix to identify species-habitat relations for the landscape of interest. The tools are applicable to any taxa or suite of taxa for which the required data are available. The user also has the ability to interactively make polygon-specific changes to the landscape and re-examine species-habitat relations. The development of these tools has given resource managers the means to evaluate the merits of proposed landscape management scenarios and to choose the scenario that best fits the goals of the managed area.
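
    As a minimal sketch of the species-habitat matrix idea (invented cover types, species, and areas; the actual tools are ArcView 3.x extensions, not Python), linking a land cover summary to a species-habitat matrix might look like:

```python
# Minimal sketch: estimate potential habitat per species by combining a
# user-supplied species-habitat matrix with a land cover area summary.
import pandas as pd

# Which cover types each species can use (1 = usable, 0 = not)
habitat_matrix = pd.DataFrame(
    {"wetland": [1, 0, 1], "forest": [0, 1, 1], "grassland": [1, 1, 0]},
    index=["rail", "warbler", "sparrow"],
)

# Landscape summary from the land cover database: hectares per cover type
landcover_ha = pd.Series({"wetland": 120.0, "forest": 430.0, "grassland": 75.0})

# Potential habitat per species = sum of area over usable cover types
potential = habitat_matrix.mul(landcover_ha, axis=1).sum(axis=1)
print(potential)   # hectares of potentially suitable habitat per species
```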

  11. Systemic Functional Grammar: A Powerful Tool for Critical Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    张杰

    2012-01-01

    As the most prominent part of discourse analysis, Critical Discourse Analysis (CDA) has its roots in the critical linguistics of the 1970s. Systemic Functional Grammar (SFG) has been considered an important approach and theoretical basis for CDA and is widely used in the critical analysis of various types of discourse. Only through further dialogue between these two distinct but closely linked approaches to language and its use can both develop further. Starting from the three most important components of SFG and drawing on examples, this paper discusses why SFG serves as a theoretical basis and methodological source for CDA and how it is applied in practice.

  12. Physics Analysis Tools for the CMS experiment at LHC

    CERN Document Server

    Fabozzi, Francesco; Hegner, Benedikt; Lista, Luca

    2008-01-01

    The CMS experiment is expected to start data taking during 2008, and large data samples, on the petabyte scale, will be produced each year. The CMS Physics Tools package provides the CMS physicist with a powerful and flexible software layer for analysis of these huge datasets that is well integrated into the CMS experiment software. A core part of this package is the Candidate Model, which provides a coherent interface to different types of data. Standard tasks such as combinatorial analyses, generic cuts, MC truth matching and constrained fitting are supported. Advanced template techniques enable the user to add missing features easily. We explain the underlying model and certain details of the implementation, and present some use cases showing how the tools are currently used in generator and full simulation studies in preparation for the analysis of real data.

  13. Architecture of Network Management Tools for Heterogeneous System

    CERN Document Server

    Hassan, Rosilah; Mohseni, Shima; Mohamad, Ola; Ismail, Zahian

    2010-01-01

    Managing heterogeneous network systems is a difficult task because each of these networks has its own particular management system. These networks are usually built on independent management protocols that are not compatible with each other. This results in the coexistence of many management systems with different managing functions and services across enterprises. The incompatibility of different management systems makes managing the whole system a complex and often complicated job. Ideally, centralized meta-level management should be implemented across distributed heterogeneous systems and their underlying supporting network systems, where information flow and guidance are provided via a single console or operating panel that integrates all the management functions in spite of their individual protocols and structures. This paper attempts to provide a novel network management tool architecture which supports heterogeneous management across many different architectural platforms. Furt...

  14. Statistical tools for prognostics and health management of complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Collins, David H [Los Alamos National Laboratory; Huzurbazar, Aparna V [Los Alamos National Laboratory; Anderson-Cook, Christine M [Los Alamos National Laboratory

    2010-01-01

    Prognostics and Health Management (PHM) is increasingly important for understanding and managing today's complex systems. These systems are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. We present background on PHM and a suite of applicable statistical tools and methods. Our primary focus is on predicting future states of the system (e.g., the probability of being operational at a future time, or the expected remaining system life) using heterogeneous data from a variety of sources. We discuss component reliability models incorporating physical understanding, condition measurements from sensors, and environmental covariates; system reliability models that allow prediction of system failure time distributions from component failure models; and the use of Bayesian techniques to incorporate expert judgments into component and system models.
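
    As a minimal sketch of one prediction task mentioned above, the snippet below computes the probability that a series system is operational at a future time from per-component Weibull failure-time models; all parameters are invented.

```python
# Minimal sketch: system reliability at a future time from component models.
# For a series system, the system survives only if every component survives.
import numpy as np

def weibull_reliability(t: float, shape: float, scale: float) -> float:
    """R(t) = exp(-(t/scale)^shape) for a Weibull failure-time model."""
    return np.exp(-((t / scale) ** shape))

# (shape, scale) per component, in years -- invented for illustration
components = [(1.5, 8.0), (2.2, 12.0), (1.1, 20.0)]

t_future = 5.0
r_system = np.prod([weibull_reliability(t_future, b, s) for b, s in components])
print(f"P(system operational at {t_future} yr) = {r_system:.3f}")
```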

  15. COMPARISON OF MALAYSIA MANUFACTURING COMPANIES BY FINANCIAL STATEMENT ANALYSIS TOOLS

    OpenAIRE

    MALEK, Afagh; Mohammadi, Maryam; NASSIRI, Fardokht

    2012-01-01

    One of the best ways to get the expected results from trading in the stock market is to acquire a good evaluation of companies' performance. To that end, this study compares the financial performance of Lb Aluminium Berhad and Seal Incorporated Berhad, two manufacturing companies listed in the main market of the Malaysian stock exchange. The data were gathered from the companies' annual reports over the last three years and analysed with financial statement analysis tools, which are ...

  16. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottlenecks of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  17. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better target educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie Research Project, a Community-Based Early Warning System (CBEWS) has been developed in the Mountain Community of Valtellina di Tirano, northern Italy. This community has been continuously exposed to various mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  18. A theorem prover-based analysis tool for object-oriented databases

    NARCIS (Netherlands)

    Spelt, D.; Even, S.J.

    1999-01-01

    We present a theorem-prover based analysis tool for object-oriented database systems with integrity constraints. Object-oriented database specifications are mapped to higher-order logic (HOL). This allows us to reason about the semantics of database operations using a mechanical theorem prover such

  19. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  20. Judo match analysis,a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

    In this second paper on match analysis, we analyze the competition steps in depth, showing the evolution of this tool at the National Federation level on the basis of our first classification. Furthermore, it is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, supporting the difficult task of the coach, particularly for national or Olympic coaching teams. In this paper a deeper study of judo competitions at high level is presented, from both the male and female points of view, explaining in light of biomechanics not only the evolution of throws over time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  1. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs arise. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  2. Development of meso-scale milling machine tool and its performance analysis

    Institute of Scientific and Technical Information of China (English)

    Hongtao LI; Xinmin LAI; Chengfeng LI; Zhongqin LIN; Jiancheng MIAO; Jun NI

    2008-01-01

    To overcome the shortcomings of current technologies for meso-scale manufacturing such as MEMS and ultra-precision machining, this paper focuses on investigations of the meso-scale milling process with a miniaturized machine tool. First, the related technologies for studying the process mechanism are investigated based on an analysis of the characteristics of the meso-scale milling process. An overview of the key issues is presented and research approaches are also proposed. Then, a meso-scale milling machine tool system is developed. The subsystems and their specifications are described in detail. Finally, some tests are conducted to evaluate the performance of the system. These tests consist of precision measurement of the positioning subsystem, a test for machining precision evaluation, and experiments on machining mechanical parts with complex features. Through test analysis, the meso-scale milling process with a miniaturized machine tool is shown to be feasible and applicable for meso-scale manufacturing.

  3. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams can predict chatter by providing graphical representations of the stable combinations of axial depth of cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating-shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the influence of bearing preload on system stability, with uncertainty taken into account.
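
    As a sketch of the Monte Carlo idea only (the paper uses a full Timoshenko-beam spindle model; the relation and coefficients below are invented), propagating an uncertain preload through an assumed stability-limit relation might look like:

```python
# Minimal sketch: sample an uncertain bearing preload and propagate it
# through a hypothetical monotone stability-limit relation.
import numpy as np

rng = np.random.default_rng(42)
preload = rng.normal(500.0, 50.0, size=10_000)     # N, uncertain preload

# Invented toy relation: a stiffer (higher-preload) spindle allows a
# deeper stable cut; coefficients are for illustration only.
stable_depth_mm = 1.2 + 0.002 * (preload - 500.0)

print(f"mean stable depth : {stable_depth_mm.mean():.3f} mm")
print(f"5th percentile    : {np.percentile(stable_depth_mm, 5):.3f} mm")
```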

  4. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  5. Controlling open quantum systems: tools, achievements, and limitations

    Science.gov (United States)

    Koch, Christiane P.

    2016-06-01

    The advent of quantum devices, which exploit the two essential elements of quantum physics, coherence and entanglement, has sparked renewed interest in the control of open quantum systems. Successful implementations face the challenge of preserving relevant nonclassical features at the level of device operation. A major obstacle is decoherence, which is caused by interaction with the environment. Optimal control theory is a tool that can be used to identify control strategies in the presence of decoherence. Here we review recent advances in optimal control methodology that allow typical tasks in device operation for open quantum systems to be tackled and discuss examples of relaxation-optimized dynamics. Optimal control theory is also a useful tool to exploit the environment for control. We discuss examples and point out possible future extensions.

  6. Design Analysis and Experiment of a High-Speed CNC Machine Tool Feed System Driven by a Linear Motor

    Institute of Scientific and Technical Information of China (English)

    万莉平; 张军; 郑天池; 鞠家全; 郭永海; 邱自学

    2015-01-01

    In order to overcome the problems of the traditional machine tool feed drive composed of a rotary motor and ball screw, such as large elastic deformation, slow response speed and susceptibility to wear, a high-speed CNC machine tool feed system directly driven by a permanent magnet linear synchronous motor (PMLSM) was designed. The influence of the mechanical structure on the dynamic characteristics was analyzed theoretically. On this basis, stiffness, strength and modal analyses of the system's main structure were carried out under the maximum-cutting-force working condition. Finally, the position accuracy and dynamic balance accuracy of the feed system were tested. The results show that the deformation of the system structure is less than 0.01 mm and the maximum stress is 5.7 MPa. In the low-frequency range, high vibration resistance of the feed system can be obtained by avoiding the 1st, 2nd, 5th and 6th order machining excitation frequencies. The system's position accuracy and balance accuracy both meet the design targets, effectively guaranteeing the dynamic accuracy and stability of the main structure of the feed system.

  7. Graphical tools for network meta-analysis in STATA.

    Science.gov (United States)

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.
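
    The paper's routines are STATA programs; purely as an illustrative Python analogue of one graphical tool it describes, the sketch below draws an evidence network with edge widths proportional to the number of trials per comparison (invented data).

```python
# Illustrative analogue only: plot the evidence base of a network
# meta-analysis as a graph of treatments linked by head-to-head trials.
import networkx as nx
import matplotlib.pyplot as plt

# (treatment pair) -> number of trials comparing them (invented)
comparisons = {("A", "B"): 5, ("A", "C"): 2, ("B", "C"): 7, ("C", "D"): 1}

G = nx.Graph()
for (t1, t2), n_studies in comparisons.items():
    G.add_edge(t1, t2, weight=n_studies)

pos = nx.circular_layout(G)
widths = [G[u][v]["weight"] for u, v in G.edges()]
nx.draw(G, pos, with_labels=True, width=widths, node_color="lightgray")
plt.show()
```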

  8. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Full Text Available Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  9. Use of Grid Tools to Support CMS Distributed Analysis

    CERN Document Server

    Fanfani, A; Anjum, A; Barrass, T; Bonacorsi, D; Bunn, J; Corvo, M; Darmenov, N; De Filippis, N; Donno, F; Donvito, G; Eulisse, G; Fanzago, F; Filine, A; Grandi, C; Hernández, J M; Innocente, V; Jan, A; Lacaprara, S; Legrand, I; Metson, S; Newman, H; Silvestris, L; Steenberg, C; Stockinger, H; Taylor, L; Thomas, M; Tuura, L; Van Lingen, F; Wildish, T

    2004-01-01

    In order to prepare the Physics Technical Design Report, due by the end of 2005, the CMS experiment needs to simulate, reconstruct and analyse about 100 million events, corresponding to more than 200 TB of data. The data will be distributed to several Computing Centres. In order to provide access to the whole data sample to all the world-wide dispersed physicists, CMS is developing a layer of software that uses the grid tools provided by the LCG project to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting analysis jobs. The GRID tools used are both those already available in the LCG-2 release and those being developed in the framework of the ARDA project. This work describes the current status and the future development...

  10. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  11. The ALICE analysis train system

    CERN Document Server

    Zimmermann, Markus

    2015-01-01

    In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.

  12. Audio signal analysis for tool wear monitoring in sheet metal stamping

    Science.gov (United States)

    Ubhayaratne, Indivarie; Pereira, Michael P.; Xiang, Yong; Rolfe, Bernard F.

    2017-02-01

    Stamping tool wear can significantly degrade product quality, and hence online tool condition monitoring is a timely need in many manufacturing industries. Even though a large amount of research has been conducted employing different sensor signals, there is still an unmet demand for a low-cost, easy-to-set-up condition monitoring system. Audio signal analysis is a simple method that has the potential to meet this demand but has not previously been used for stamping process monitoring. Hence, this paper studies the existence and significance of the correlation between emitted sound signals and the wear state of sheet metal stamping tools. The corrupting sources generated by the tooling of the stamping press and surrounding machinery have higher amplitudes than the sound emitted by the stamping operation itself. Therefore, a newly developed semi-blind signal extraction technique was employed as a pre-processing step to mitigate the contribution of these corrupting sources. The spectral analysis results of the raw and extracted signals demonstrate a significant qualitative relationship between wear progression and the emitted sound signature. This study lays the basis for employing low-cost audio signal analysis in the development of a real-time industrial tool condition monitoring system.
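
    As a sketch of the spectral-analysis step only (not the authors' semi-blind extraction technique), the snippet below compares magnitude spectra of synthetic stand-ins for audio recorded at two wear states.

```python
# Minimal sketch: compare the magnitude spectrum of press audio at two
# wear states; the signals and the "wear band" are synthetic stand-ins.
import numpy as np

fs = 44_100                                       # sample rate, Hz
t = np.arange(fs) / fs                            # one second of audio
new_tool = np.sin(2 * np.pi * 1200 * t)           # dominant tone, sharp tool
worn_tool = new_tool + 0.6 * np.sin(2 * np.pi * 3100 * t)  # extra wear band

for label, sig in (("new", new_tool), ("worn", worn_tool)):
    spectrum = np.abs(np.fft.rfft(sig)) / len(sig)
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    peak = freqs[np.argmax(spectrum)]
    band_energy = spectrum[(freqs > 2500) & (freqs < 3500)].sum()
    print(f"{label}: peak {peak:.0f} Hz, 2.5-3.5 kHz energy {band_energy:.3f}")
```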

  13. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The tool model includes a new direct method for calculating first-order sensitivity coefficients in chemical kinetics using sparse matrix technology; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages with which the program runs in conjunction, such as SLODE, modified MA28, and the Gear package, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
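
    As a minimal worked example of direct first-order sensitivity analysis on a toy mechanism (not the FORTRAN package itself): for dy/dt = -k*y, the sensitivity s = dy/dk obeys ds/dt = -y - k*s and can be integrated alongside the model equation.

```python
# Minimal sketch: integrate a one-reaction model equation together with
# its auxiliary sensitivity equation and check against the analytic result.
import numpy as np
from scipy.integrate import solve_ivp

k = 0.3  # assumed rate constant

def rhs(t, ys):
    y, s = ys                      # state and its sensitivity dy/dk
    return [-k * y, -y - k * s]    # model equation and sensitivity equation

sol = solve_ivp(rhs, [0.0, 10.0], [1.0, 0.0], dense_output=True)
t = 5.0
y, s = sol.sol(t)
print(f"y({t}) = {y:.4f}, dy/dk = {s:.4f} (analytic: {-t * np.exp(-k * t):.4f})")
```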

  14. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Analysis and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages code samples shared by the World Wind community, and the direction of the COAST 2.0 enhancements is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  15. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
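
    As a sketch of the statistical-model-checking idea only (not PVeStA's Maude-based engine), the snippet below estimates the probability that a property holds in a toy probabilistic system by repeated simulation, with a normal-approximation confidence interval.

```python
# Minimal sketch: Monte Carlo estimation of P(property holds) for a toy
# probabilistic system, reported with a 95% confidence interval.
import numpy as np

rng = np.random.default_rng(7)

def simulate_run() -> bool:
    """Toy system: a message is retried up to 3 times; each try succeeds
    with probability 0.6. Property: delivery eventually succeeds."""
    return any(rng.random() < 0.6 for _ in range(3))

n = 20_000
hits = sum(simulate_run() for _ in range(n))
p_hat = hits / n
half_width = 1.96 * np.sqrt(p_hat * (1 - p_hat) / n)   # normal approximation
print(f"P(success) ~ {p_hat:.4f} +/- {half_width:.4f} (exact: {1 - 0.4**3:.4f})")
```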

  16. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  17. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available This article analyses the existing problems of reliability and verification of widespread electric power system (EPS) simulation tools, all of which are based on numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can serve as the reference model for verifying any EPS simulation tool.

  18. A Data Management System for International Space Station Simulation Tools

    Science.gov (United States)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.

  19. Marketing residential grid-connected PV systems using a balanced scorecard as a marketing tool

    Energy Technology Data Exchange (ETDEWEB)

    Bach, N. [University of Giessen (Germany). Dept. of Organisation and Management; Calais, P. [Murdoch University, Perth (Australia). School of Environmental Science; Calais, M. [Curtin University, Perth (Australia). Centre for Renewable Systems Technology

    2001-03-01

    A strategic analysis of the electricity market in Western Australia yields a market potential for renewable energy. However, from a purely financial viewpoint, the installation of grid-connected PV (GCPV) systems is still not economically viable. In this paper a balanced scorecard (BSC) is developed to capture and visualize benefits other than the financial ones. The BSC can therefore be used as a marketing tool to communicate the benefits of a privately owned GCPV system to potential customers. (author)

  20. Intelligent Electric Power Systems with Active-Adaptive Electric Networks: Challenges for Simulation Tools

    Directory of Open Access Journals (Sweden)

    Ufa Ruslan A.

    2015-01-01

    Full Text Available The presented research is motivated by the need to develop new methods and tools for the adequate simulation of intelligent electric power systems with active-adaptive electric networks (IES), including Flexible Alternating Current Transmission System (FACTS) devices. The key requirements for such simulation are formulated. The presented analysis of IES simulation results confirms the need for a hybrid modelling approach.

  1. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic research tools that are utilised while conducting various studies. It covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
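
    As a worked example of the sample-size and power estimation mentioned in the abstract (the effect size and error rates below are assumed for illustration), statsmodels' power calculator for a two-sample t-test can be used:

```python
# Minimal worked example: solve for the per-group sample size that gives
# 80% power to detect a medium effect (Cohen's d = 0.5) at alpha = 0.05.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,       # Cohen's d (assumed)
                                   alpha=0.05,            # type I error rate
                                   power=0.80,            # 1 - type II error
                                   alternative="two-sided")
print(f"Required sample size per group: {n_per_group:.1f}")   # about 64
```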

  2. Basic statistical tools in research and data analysis

    Science.gov (United States)

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic research tools that are utilised while conducting various studies. It covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  3. A survey of tools for the analysis of quantitative PCR (qPCR data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  4. Anaphe—OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O. Couet; B. Ferrero-Merlino; et al.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an object-oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries, a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for the data analysis tool. The loose coupling provided by the consistent use of (AIDA-compliant) abstract interfaces for each component, in combination with the use of shared libraries for their implementation, provides easy integration of existing libraries into modern scripting languages and thus allows for rapid application development. This integration is simplified even further by using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the abstract interfaces almost one-to-one. This paper gives an overview of the architecture and design choices and presents the current status and future developments of the project.

  5. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  6. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values of parameters that maximize the liquefaction yield of the plant, subject to constraints on the other parameters. The analysis results give a clear idea of suitable parameter values before the actual plant is implemented in the field, as well as of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
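
    As a sketch of one hand calculation behind such an analysis (not an Aspen HYSYS model), the ideal liquid yield of a Linde-Hampson nitrogen liquefier follows from an energy balance, y = (h1 - h2)/(h1 - hf); the snippet below evaluates it with the CoolProp property library under assumed conditions.

```python
# Minimal sketch: ideal Linde-Hampson liquid yield for nitrogen from an
# energy balance around the heat exchanger, JT valve and separator.
from CoolProp.CoolProp import PropsSI

T_warm, P_low, P_high = 300.0, 1e5, 200e5   # K, Pa, Pa (assumed conditions)

h1 = PropsSI("H", "T", T_warm, "P", P_low, "Nitrogen")   # low-P return, warm end
h2 = PropsSI("H", "T", T_warm, "P", P_high, "Nitrogen")  # high-P feed, warm end
hf = PropsSI("H", "P", P_low, "Q", 0, "Nitrogen")        # saturated liquid

y = (h1 - h2) / (h1 - hf)                                # liquefied fraction
print(f"Ideal liquefied fraction y = {y:.3f}")
```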

  7. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  8. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

    Full Text Available The simulation of a rubber shock absorber for a machine tool was studied. A simple material model of the rubber was obtained through the finite element analysis software ABAQUS. The compression speed and the hardness of the rubber material were considered in order to obtain the deformation law of the rubber shock absorber. The locations prone to fatigue were identified from the simulation results. The results show that the fatigue-prone positions are distributed at the corners of the shock absorber, that the degree of deformation increases with increasing compression speed, and that the hardness of the rubber material is proportional to the deformation.

  9. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force manufacturers to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet these specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  10. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
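
    As a minimal sketch of the risk definition used above (probability of a successful attack times the value of the resulting loss), summed over invented attack scenarios before and after a hypothetical mitigation:

```python
# Minimal sketch: expected-loss risk over (invented) attack scenarios,
# and the risk reduction from a hypothetical mitigation.
scenarios = [
    # (name, P(successful attack), loss in dollars) -- all values invented
    ("remote HMI compromise", 0.020, 5_000_000),
    ("insider misuse",        0.005, 2_000_000),
    ("protocol spoofing",     0.010, 8_000_000),
]

def total_risk(scens):
    """Risk = sum over scenarios of P(success) * loss."""
    return sum(p * loss for _, p, loss in scens)

baseline = total_risk(scenarios)
# Suppose a mitigation halves the success probability of the first scenario:
mitigated = total_risk([
    (name, p * (0.5 if name == "remote HMI compromise" else 1.0), loss)
    for name, p, loss in scenarios
])
print(f"Estimated risk reduction: ${baseline - mitigated:,.0f}")
```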

  11. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Energy Technology Data Exchange (ETDEWEB)

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many oil and gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial share of production upsets. Flow assurance issues are complex and hard to quantify in a production forecast; however, without taking them into account, the RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, a method and a tool for integrating RAM and flow assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both discrete event and thermo-hydraulic simulation. The method is currently applied as a decision support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)

  12. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  13. The system overview tool of the Joint COntrols Project (JCOP) framework

    CERN Document Server

    González Berges, B M; Joshi, K; CERN. Geneva. IT Department

    2007-01-01

    For each control system of the Large Hadron Collider (LHC) experiments, there will be many processes spread over many computers. All together, they will form a distributed system with around 150 computers organized in a hierarchical fashion. A centralized tool has been developed for the supervision, error identification, and troubleshooting of such a large system. A quick response to abnormal situations will be crucial to maximize the physics usage. This tool gathers data from all the systems via several paths (e.g. process monitors, internal database), and after some processing, presents it in different views. Correlations between the different views are provided to help understand complex problems that involve more than one system. It is also possible to filter the information presented to the shift operator according to several criteria (e.g. node, process type, process state). Alarms are raised when undesired situations are found. The data gathered is stored in the historical archive for further analysis.

  14. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis.

    Science.gov (United States)

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can improve the interpretation of genomic variants. Existing tools such as HotNet2 and dmGWAS provide a variety of analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  15. Energy Signal Tool for Decision Support in Building Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Henze, G. P.; Pavlak, G. S.; Florita, A. R.; Dodier, R. H.; Hirsch, A. I.

    2014-12-01

    A prototype energy signal tool is demonstrated for operational whole-building and system-level energy use evaluation. The purpose of the tool is to give a summary of building energy use which allows a building operator to quickly distinguish normal and abnormal energy use. Toward that end, energy use status is displayed as a traffic light, which is a visual metaphor for energy use that is either substantially different from expected (red and yellow lights) or approximately the same as expected (green light). Which light to display for a given energy end use is determined by comparing expected to actual energy use. Because expected energy use is necessarily uncertain, we cannot choose the appropriate light with certainty. Instead, the energy signal tool chooses the light by minimizing the expected cost of displaying the wrong light. The expected energy use is represented by a probability distribution. Energy use is modeled by a low-order lumped parameter model. Uncertainty in energy use is quantified by a Monte Carlo exploration of the influence of model parameters on energy use. Distributions over model parameters are updated over time via Bayes' theorem. The simulation study was devised to assess whole-building energy signal accuracy in the presence of uncertainty and faults at the submetered level, which may lead to tradeoffs at the whole-building level that are not detectable without submetering.
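
    The decision rule is a small exercise in expected-cost minimization. The following sketch illustrates it with invented numbers (the distribution, state thresholds, and cost matrix are all assumptions, not values from the report): each posterior sample of the actual-to-expected energy ratio is mapped to a true state, and the light with the lowest expected misclassification cost is displayed.

        import numpy as np

        rng = np.random.default_rng(0)
        ratio = rng.normal(1.08, 0.10, 10_000)   # posterior samples of actual/expected use

        def state(r):
            if 0.95 <= r <= 1.05:
                return 0                         # green: about as expected
            if 0.85 <= r <= 1.15:
                return 1                         # yellow: moderately off
            return 2                             # red: substantially off

        p = np.bincount([state(r) for r in ratio], minlength=3) / ratio.size

        # cost[display, true_state]: penalty for showing each light (assumed)
        cost = np.array([[0, 2, 10],    # display green
                         [1, 0, 4],     # display yellow
                         [5, 1, 0]])    # display red
        expected = cost @ p
        print("display:", ["green", "yellow", "red"][int(np.argmin(expected))])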

  16. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database, combining the built-in models for the electrical components of a grid connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). The initialisation issues on the wind turbine models into the power system simulation are also presented. The main attention in the report is drawn to the modelling at the system level of the following wind turbine concepts: 1. Fixed speed active stall wind turbine concept 2. Variable...

  17. Generalized Aliasing as a Basis for Program Analysis Tools

    Science.gov (United States)

    2000-11-01

    …applications are described in the next chapter, in Section 9.2.2. For example, the Ladybug specification checker tool [44] has a user interface shell … any particular implementation of the interface. At run time, Ladybug uses reflection to load the engine class by name and create an object of that … supplied with Sun's JDK 1.1.7; Jess, the Java Expert System Shell version 4.4, from Sandia National Labs [35]; Ladybug, the Ladybug specification checker, by Craig…

  18. Disposal systems evaluations and tool development : Engineered Barrier System (EBS) evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Rutqvist, Jonny (LBNL); Liu, Hui-Hai (LBNL); Steefel, Carl I. (LBNL); Serrano de Caro, M. A. (LLNL); Caporuscio, Florie Andre (LANL); Birkholzer, Jens T. (LBNL); Blink, James A. (LLNL); Sutton, Mark A. (LLNL); Xu, Hongwu (LANL); Buscheck, Thomas A. (LLNL); Levy, Schon S. (LANL); Tsang, Chin-Fu (LBNL); Sonnenthal, Eric (LBNL); Halsey, William G. (LLNL); Jove-Colon, Carlos F.; Wolery, Thomas J. (LLNL)

    2011-01-01

    Key components of the nuclear fuel cycle are short-term storage and long-term disposal of nuclear waste. The latter encompasses the immobilization of used nuclear fuel (UNF) and radioactive waste streams generated by various phases of the nuclear fuel cycle, and the safe and permanent disposition of these waste forms in geological repository environments. The engineered barrier system (EBS) plays a very important role in the long-term isolation of nuclear waste in geological repository environments. EBS concepts and their interactions with the natural barrier are inherently important to the long-term performance assessment of the safety case, where nuclear waste disposition needs to be evaluated for time periods of up to one million years. Making the safety case needed in the decision-making process for the recommendation and the eventual acceptance of a disposal system concept requires a multi-faceted integration of knowledge and evidence-gathering to demonstrate the required confidence level in a deep geological disposal site and to evaluate long-term repository performance. The focus of this report is the following: (1) Evaluation of EBS in long-term disposal systems in deep geologic environments with emphasis on the multi-barrier concept; (2) Evaluation of key parameters in the characterization of EBS performance; (3) Identification of key knowledge gaps and uncertainties; and (4) Evaluation of tools and modeling approaches for EBS processes and performance. The above topics will be evaluated through the analysis of the following: (1) Overview of EBS concepts for various NW disposal systems; (2) Natural and man-made analogs, room chemistry, hydrochemistry of deep subsurface environments, and EBS material stability in near-field environments; (3) Reactive Transport and Coupled Thermal-Hydrological-Mechanical-Chemical (THMC) processes in EBS; and (4) Thermal analysis toolkit, metallic barrier degradation mode survey, and development of a Disposal Systems

  19. A Forced Jet System for the Cooling of Cutting Tools.

    Science.gov (United States)

    Cutting tools, *Coolant pumps, *Machine tools, *Metals, Machine shop practice, High pressure, Force (Mechanics), Centrifugal pumps, Mist, Jet streams, Lubricants, Machining, Friction, Surface finishing, Safety

  20. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the

  1. PyRAT (python radiography analysis tool): overview

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) use of Python libraries for radiographic image processing and analysis; (4) use of the Tikhonov regularization method for the linear inverse problem to recover partial information about object configurations; (5) use of a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem - initial data analysis + material library → a priori knowledge; and (6) use of the NOMAD (C++ version) software in the object.

  2. Stacks: an analysis tool set for population genomics.

    Science.gov (United States)

    Catchen, Julian; Hohenlohe, Paul A; Bassham, Susan; Amores, Angel; Cresko, William A

    2013-06-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks now also provides output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics.
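
    As a concrete picture of the smoothed sliding-window idea, here is a short, self-contained sketch (synthetic positions and values, and an arbitrarily chosen window size; this is not Stacks code) that kernel-smooths a per-SNP statistic along a chromosome.

        import numpy as np

        rng = np.random.default_rng(2)
        pos = np.sort(rng.uniform(0, 1e6, 500))     # SNP positions (bp)
        fst = rng.beta(0.5, 5.0, 500)               # per-SNP statistic, e.g. FST

        def smooth(center, window=150_000.0):
            # Gaussian kernel weights centered on the window midpoint
            w = np.exp(-0.5 * ((pos - center) / (window / 6.0)) ** 2)
            return float(np.sum(w * fst) / np.sum(w))

        centers = np.arange(0.0, 1e6, 50_000.0)
        smoothed = [smooth(c) for c in centers]
        print(["%.3f" % v for v in smoothed[:5]])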

  3. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool for determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis considered individually and, when possible, in statistical combination.

  4. Nanocoatings for High-Efficiency Industrial Hydraulic and Tooling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Clifton B. Higdon III

    2011-01-07

    Industrial manufacturing in the U.S. accounts for roughly one-third of the 98 quadrillion Btu total energy consumption. Motor system losses amount to 1.3 quadrillion Btu, which represents the largest proportional loss of any end-use category, while pumps alone represent over 574 trillion Btu (TBtu) of energy loss each year. The efficiency of machines with moving components is a function of the amount of energy lost to heat because of friction between contacting surfaces. The friction between these interfaces also contributes to downtime and the loss of productivity through component wear and subsequent repair. The production of new replacement parts requires additional energy. Among efforts to reduce energy losses, wear-resistant, low-friction coatings on rotating and sliding components offer a promising approach that is fully compatible with existing equipment and processes. In addition to lubrication, one of the most desirable solutions is to apply a protective coating or surface treatment to rotating or sliding components to reduce their friction coefficients, thereby leading to reduced wear. Historically, a number of materials such as diamond-like carbon (DLC), titanium nitride (TiN), titanium aluminum nitride (TiAlN), and tungsten carbide (WC) have been examined as tribological coatings. The primary objective of this project was the development of a variety of thin film nanocoatings, derived from the AlMgB14 system, with a focus on reducing wear and friction in both industrial hydraulics and cutting tool applications. Proof-of-concept studies leading up to this project had shown that the constituent phases, AlMgB14 and TiB2, were capable of producing low-friction coatings by pulsed laser deposition. These coatings combine high hardness with a low friction coefficient, and were shown to substantially reduce wear in laboratory tribology tests. Selection of the two applications was based largely on the concept of improved mechanical interface efficiencies for

  5. CRITICA: coding region identification tool invoking comparative analysis

    Science.gov (United States)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
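
    The dicodon-bias component is easy to illustrate. Below is a toy sketch of the underlying idea (not CRITICA's implementation): hexamer frequencies are tabulated from coding and noncoding training sequence, and a candidate frame is scored by summed log-likelihood ratios. The training strings are synthetic stand-ins for real data.

        import math
        from collections import Counter

        def hexamer_freqs(seq, step):
            counts = Counter(seq[i:i+6] for i in range(0, len(seq) - 5, step))
            total = sum(counts.values())
            return {h: c / total for h, c in counts.items()}

        # Synthetic training data: in-frame hexamers for "coding",
        # all-offsets hexamers for the "noncoding" background.
        coding_train = "ATGGCTAAAGGTGCTGAAATTGCTGAAGCTAAAGGTATGGCTGAA" * 20
        noncoding_train = "TTATATAGCGCGATATTATGCGCGTATATATGCGCATATTAGCA" * 20
        fc = hexamer_freqs(coding_train, step=3)
        fn = hexamer_freqs(noncoding_train, step=1)

        def dicodon_score(candidate):
            # Positive totals favor the coding hypothesis.
            return sum(math.log(fc.get(candidate[i:i+6], 1e-6) /
                                fn.get(candidate[i:i+6], 1e-6))
                       for i in range(0, len(candidate) - 5, 3))

        print(dicodon_score("ATGGCTAAAGGTGCTGAAATTGCT"))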

  6. Pulsatile microfluidics as an analytical tool for determining the dynamic characteristics of microfluidic systems

    DEFF Research Database (Denmark)

    Vedel, Søren; Olesen, Laurits Højgaard; Bruus, Henrik

    2010-01-01

    An understanding of all fluid dynamic time scales is needed to fully understand and hence exploit the capabilities of fluid flow in microfluidic systems. We propose the use of harmonically oscillating microfluidics as an analytical tool for the deduction of these time scales. Furthermore, we suggest the use of system-level equivalent circuit theory as an adequate theory of the behavior of the system. A novel pressure source capable of operation in the desired frequency range is presented for this generic analysis. As a proof of concept, we study the fairly complex system of water

  7. An intelligent condition monitoring system for on-line classification of machine tool wear

    Energy Technology Data Exchange (ETDEWEB)

    Fu Pan; Hope, A.D.; Javed, M. [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1997-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are extracted from time- and frequency-domain signals, and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification of tool wear states. (orig.) 10 refs.
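
    To give a flavor of fuzzy multisensor fusion (a generic sketch, not the authors' scheme; the feature values and membership breakpoints are invented), normalized features from each sensor can be mapped through triangular membership functions to wear grades and averaged:

        import numpy as np

        def tri(x, a, b, c):
            # Triangular membership function peaking at b.
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        GRADES = {"sharp": (0.0, 0.15, 0.45),
                  "worn": (0.30, 0.55, 0.80),
                  "failed": (0.60, 0.90, 1.01)}

        # Normalized feature levels from each sensor (assumed values).
        features = {"cutting_force": 0.62, "ae_rms": 0.55, "vibration": 0.70}

        memberships = {grade: np.mean([tri(v, *abc) for v in features.values()])
                       for grade, abc in GRADES.items()}
        print(max(memberships, key=memberships.get), memberships)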

  8. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  9. The development of a two-component force dynamometer and tool control system for dynamic machine tool research

    Science.gov (United States)

    Sutherland, I. A.

    1973-01-01

    The development of a tooling system is presented that produces a controlled sinusoidal oscillation, simulating a dynamic chip-removal condition. The system also measures the machining forces in two mutually perpendicular directions without any cross sensitivity.

  10. National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion

    Science.gov (United States)

    Follen, G.; Naiman, C.; Evans, A.

    1999-01-01

    Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

  11. Clinical Decision Support Systems: A Useful Tool in Clinical Practice

    Directory of Open Access Journals (Sweden)

    Kolostoumpis G.

    2012-01-01

    Full Text Available Support for clinical decision-making has increased in recent years, based on mathematical simulation tools, knowledge databases, methods for processing medical data, and artificial intelligence for coding the available knowledge and resolving complex problems arising in clinical practice. Aim: the aim of this review is to present the development of new methods and modern services in clinical practice and their emerging implementation. Data and methods: the methodology followed included a search of articles referring to the health sector and modern technologies in the electronic databases "PubMed" and "Medline". Results: Such systems are a useful tool for medical experts, using the characteristics and medical data employed by doctors. They constitute an innovation for the medical community and ensure the support of clinical decisions in an integrated way by providing a comprehensive solution through the integration of computational decision support systems into clinical practice. Conclusions: Decision Support Systems contribute to improving the quality of health services while simultaneously containing costs (i.e. avoiding medical errors).

  12. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  13. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2014-02-03

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggested that IPMP 2013 could be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology.
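
    For illustration, the sketch below fits one of the primary models named above, a three-parameter logistic growth curve without lag, to synthetic log10 counts with SciPy; it shows the kind of fit IPMP 2013 automates, not the program's own code.

        import numpy as np
        from scipy.optimize import curve_fit

        def log_logistic(t, y0, ymax, r):
            # log10 counts for logistic growth from 10**y0 toward 10**ymax at rate r
            n0, k = 10.0 ** y0, 10.0 ** ymax
            return np.log10(k / (1.0 + (k / n0 - 1.0) * np.exp(-r * t)))

        t = np.linspace(0, 24, 13)                       # hours
        rng = np.random.default_rng(4)
        y = log_logistic(t, 3.0, 8.0, 0.7) + rng.normal(0.0, 0.1, t.size)

        popt, _ = curve_fit(log_logistic, t, y, p0=(2.5, 7.5, 0.5))
        print("y0=%.2f  ymax=%.2f  r=%.2f" % tuple(popt))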

  14. Phronesis, a diagnosis and recovery tool for system administrators

    CERN Document Server

    Haen, C; Bonaccorsi, E; Neufeld, N

    2014-01-01

    The LHCb experiment relies on the Online system, which includes a very large and heterogeneous computing cluster. Ensuring the proper behavior of the different tasks running on the more than 2000 servers represents a huge workload for the small operator team and is a 24/7 task. At CHEP 2012, we presented a prototype of a framework that we designed in order to support the experts. The main objective is to provide them with steadily improving diagnosis and recovery solutions in case of misbehavior of a service, without having to modify the original applications. Our framework is based on adapted principles of the Autonomic Computing model, on Reinforcement Learning algorithms, as well as innovative concepts such as Shared Experience. While the submission at CHEP 2012 showed the validity of our prototype on simulations, we here present an implementation with improved algorithms and manipulation tools, and report on the experience gained with running it in the LHCb Online system.

  15. Information systems as a tool to improve legal metrology activities

    Science.gov (United States)

    Rodrigues Filho, B. A.; Soratto, A. N. R.; Gonçalves, R. F.

    2016-07-01

    This study explores the importance of information systems applied to legal metrology as a tool to improve the control of measuring instruments used in trade. The information system implemented in Brazil has also helped to understand and appraise the control of measurements, through the behavior of the errors and deviations of instruments used in trade, allowing resources to be allocated wisely and leading to more effective planning and control in the legal metrology field. A case study analyzing the fuel sector is carried out in order to show the conformity of fuel dispensers with maximum permissible errors. The statistics of measurement errors of 167,310 fuel dispensers of gasoline, ethanol and diesel used in the field were analyzed, demonstrating the conformity of the Brazilian fuel market to the legal requirements.

  16. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represent, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful relevant analysis to assist stakeholders in decision making. Therefore a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized allowing for the system to be modified in order to provide analysis for other systems with similar attributes. By utilizing this approach, the framework for simulating many different fuel cycle options is provided. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.

  17. The antibody mining toolbox: an open source tool for the rapid analysis of antibody repertoires.

    Science.gov (United States)

    D'Angelo, Sara; Glanville, Jacob; Ferrara, Fortunato; Naranjo, Leslie; Gleasner, Cheryl D; Shen, Xiaohong; Bradbury, Andrew R M; Kiss, Csaba

    2014-01-01

    In vitro selection has been an essential tool in the development of recombinant antibodies against various antigen targets. Deep sequencing has recently been gaining ground as an alternative and valuable method to analyze such antibody selections. The analysis provides a novel and extremely detailed view of selected antibody populations, and allows the identification of specific antibodies using only sequencing data, potentially eliminating the need for expensive and laborious low-throughput screening methods such as enzyme-linked immunosorbent assay. The high cost and the need for bioinformatics experts and powerful computer clusters, however, have limited the general use of deep sequencing in antibody selections. Here, we describe the AbMining ToolBox, an open source software package for the straightforward analysis of antibody libraries sequenced by the three main next generation sequencing platforms (454, Ion Torrent, MiSeq). The ToolBox is able to identify heavy chain CDR3s as effectively as more computationally intense software, and can be easily adapted to analyze other portions of antibody variable genes, as well as the selection outputs of libraries based on different scaffolds. The software runs on all common operating systems (Microsoft Windows, Mac OS X, Linux), on standard personal computers, and sequence analysis of 1-2 million reads can be accomplished in 10-15 min, a fraction of the time of competing software.
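
    As a flavor of what heavy-chain CDR3 identification involves (a deliberately crude, motif-based sketch; the ToolBox itself uses more robust methods, and the sequence below is synthetic), one can search a translated read for the span between the conserved framework cysteine and the J-segment WGxG motif:

        import re

        # CDR3 candidate: span from a cysteine up to (not including) WGxG.
        CDR3 = re.compile(r"C[A-Z]{3,30}?(?=WG.G)")

        protein = ("EVQLVESGGGLVQPGGSLRLSCAASGFTFSSYAMSWVRQAPGKGLEWVS"
                   "AISGSGGSTYYADSVKGRFTISRDNSKNTLYLQMNSLRAEDTAVYYCAK"
                   "DRLSITIRPRYYGLDVWGQGTTVTVSS")
        print(CDR3.findall(protein))   # -> ['CAKDRLSITIRPRYYGLDV']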

  18. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt was also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  19. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
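
    For plane one-dimensional motion, the conservation equations being integrated can be written compactly. A hedged sketch under one common sign convention (tensile stress taken positive; $h$ the Lagrangian coordinate, $\rho_0$ the initial density, $u$ the particle velocity, $\sigma$ the stress, $\varepsilon$ the strain):

        \rho_0 \left.\frac{\partial u}{\partial t}\right|_h = \left.\frac{\partial \sigma}{\partial h}\right|_t,
        \qquad
        \left.\frac{\partial \varepsilon}{\partial t}\right|_h = \left.\frac{\partial u}{\partial h}\right|_t

    Given gauge records of $\sigma(h_i, t)$ or $u(h_i, t)$ at several Lagrangian positions $h_i$, the spatial derivatives are estimated across gauges and the time integrations recover the remaining quantities.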

  20. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report

    Science.gov (United States)

    2013-01-01

    Med Evac Vehicle; MGS: Mobile Gun System; MILPRS: Military Personnel; MILCON: Military Construction; MODA: Multiple Objective Decision Analysis. … a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight into … used to measure the returns to scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME…

  1. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  2. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools and exploring their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms from either an IT or a marketing branch. The paper contributes to highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  3. A Sonification Tool For The Analysis of Large Databases of Expressive Gesture

    Directory of Open Access Journals (Sweden)

    R. Michael Winters

    2012-10-01

    Full Text Available Expert musical performance is rich with movements that facilitate performance accuracy and expressive communication. Studying these movements quantitatively using high-resolution motion capture systems has been fruitful, but analysis is arduous due to the size of the data sets and performance idiosyncrasies. Compared to visual-only methods, sonification provides an interesting alternative that can ease the process of data analysis and provide additional insights. To this end, a sonification tool was designed in Max/MSP that provides interactive access to synthesis mappings and data preprocessing functions that are specific to expressive movement. The tool is evaluated in terms of its ability to fulfil the goals of sonification in this domain and the goals of expressive movement analysis more generally. Additional benefits of sonification are discussed in light of the expressive and musical context.

  4. A new tool for risk analysis and assessment in petrochemical plants

    Directory of Open Access Journals (Sweden)

    El-Arkam Mechhoud

    2016-09-01

    Full Text Available The aim of our work was the implementation of a new automated tool dedicated to risk analysis and assessment in petrochemical plants, based on a combination of two analysis methods: HAZOP (HAZard and OPerability) and FMEA (Failure Mode and Effect Analysis). Assessment of accident scenarios is also considered. The principal advantage of the two analysis methods is to speed up hazard identification and risk assessment and to forecast the nature and impact of such accidents. Plant parameters are analyzed under a graphical interface to facilitate the exploitation of our developed approach. This automated analysis brings out the different deviations of the operating parameters of any system in the plant. Possible causes of these deviations, their consequences and preventive actions are identified. The result is risk minimization and dependability enhancement of the considered system.

  5. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis … attacks, and attacks launched by insiders. Finally, the perspectives for the application of the analysis techniques are discussed, thereby coming a small step closer to providing developers with easy-to-use tools for validating the security of networking applications.

  6. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source of antimicrobial drug leads. For decades, such compounds were found exclusively by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies has only recently allowed the development of novel screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and tools are crucial for genome mining. In this review, a comprehensive overview is given of programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data.

  7. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
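
    The outlier-detection algorithms referenced above typically rest on robust distances. A short Python sketch of that idea (the package itself is in R; this is an independent illustration using scikit-learn's Minimum Covariance Determinant estimator on simulated data):

        import numpy as np
        from scipy.stats import chi2
        from sklearn.covariance import MinCovDet

        rng = np.random.default_rng(5)
        X = rng.normal(0.0, 1.0, (200, 5))
        X[:10] += 6.0                       # plant 10 gross outliers

        mcd = MinCovDet(random_state=0).fit(X)
        d2 = mcd.mahalanobis(X)             # squared robust distances
        cutoff = chi2.ppf(0.975, df=X.shape[1])
        print("flagged:", np.flatnonzero(d2 > cutoff))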

  8. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the graphical user interface and the script! The examples are illustrated from a scripting perspective because it is simpler to write up, but the tests must be performed for both interfaces to GMAT.

  9. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...

  10. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies

    Directory of Open Access Journals (Sweden)

    A. Anil Sinaci

    2015-01-01

    Full Text Available Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access and analysis of patient data across different domains are a critical factor; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved at an upper level with the use of common data elements in a standardized fashion, so that clinical researchers can work with different EHR systems independently of the underlying information model. The Postmarketing Safety Study Tool lets clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through the IHE data element exchange profile. The Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10 years of longitudinal data on average. Clinical researchers at Roche validated the tool with real-life use cases.

  11. A tool for monitoring lecturers’ interactions with Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Magdalena Cantabella

    2016-12-01

    Full Text Available Learning Management Systems' (LMS) interaction mechanisms are mainly focused on the improvement of students' experiences and academic results. However, special attention should also be given to the interaction between these LMS and the other actors involved in the educational process. This paper specifically targets the interaction of degree coordinators with LMS when monitoring lecturers' performance, especially in an online mode. The methodology is guided by the following three objectives: (1) analysis of the limitations of monitoring lecturers in current LMS; (2) development of a software program to overcome such limitations; and (3) empirical evaluation of the proposed program. The results show that this type of tool helps coordinators to intuitively and efficiently analyze the status of the subjects taught in their degree programs.

  12. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram (the distribution of the estimates as a function of their precision), the funnel asymmetry test (FAT), and the meta average, of which PET is an estimate. The FAT-PET MRA is a meta regression analysis
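
    The FAT-PET regression can be written as estimate_i = PET + FAT·SE_i + error_i, estimated by precision-weighted least squares: the intercept (PET) is the asymmetry-corrected effect and the slope tests funnel asymmetry (FAT). A small simulated sketch (invented data, not the authors' code):

        import numpy as np

        rng = np.random.default_rng(6)
        n = 80
        se = rng.uniform(0.05, 0.5, n)                # reported standard errors
        est = 0.3 + 1.5 * se + rng.normal(0.0, se)    # estimates with asymmetry built in

        # WLS via the precision transform: est/se = PET*(1/se) + FAT + error
        X = np.column_stack([1.0 / se, np.ones(n)])
        (pet, fat), *_ = np.linalg.lstsq(X, est / se, rcond=None)
        print("PET (corrected effect) = %.3f, FAT (asymmetry) = %.3f" % (pet, fat))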

  13. LCA-IWM: a decision support tool for sustainability assessment of waste management systems.

    Science.gov (United States)

    den Boer, J; den Boer, E; Jager, J

    2007-01-01

    The paper outlines the most significant result of the project 'The use of life cycle assessment tools for the development of integrated waste management strategies for cities and regions with rapid growing economies', which was the development of two decision-support tools: a municipal waste prognostic tool and a waste management system assessment tool. The article focuses on the assessment tool, which supports adequate decision-making in the planning of urban waste management systems by allowing the creation and comparison of different scenarios, considering three basic subsystems: (i) temporary storage; (ii) collection and transport; and (iii) treatment, disposal and recycling. The design and analysis options, as well as the assumptions made for each subsystem, are briefly introduced, providing an overview of the applied methodologies and technologies. The sustainability assessment methodology used in the project to support the selection of the most adequate scenario is presented with a brief explanation of the procedures, criteria and indicators applied in the evaluation of each of the three sustainability pillars.

  14. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    Science.gov (United States)

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. The tools currently used to assess the risk of pressure ulcer development have rarely had their predictive accuracy evaluated, especially in older adults. The current study aimed at providing a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. The overall predictive validity of pressure ulcer risk, in terms of pooled sensitivity and specificity, indicated a similar range with a moderate accuracy level in all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, representing the primary cause of heterogeneity. Results indicate that commonly used screening tools for pressure ulcer risk have limitations regarding validity and accuracy for use with older adults due to heterogeneity among studies.
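
    The pooling step in such a meta-analysis can be illustrated in a few lines. This is a toy sketch of inverse-variance pooling of logit-transformed sensitivities (the 2x2 counts are invented, and real analyses handle heterogeneity with random-effects models):

        import numpy as np

        tp = np.array([30, 45, 12, 60])   # true positives per study (invented)
        fn = np.array([10, 9, 6, 20])     # false negatives per study (invented)

        sens = tp / (tp + fn)
        logit = np.log(sens / (1.0 - sens))
        w = 1.0 / (1.0 / tp + 1.0 / fn)   # inverse variance of each logit
        pooled_logit = (w * logit).sum() / w.sum()
        pooled_sens = 1.0 / (1.0 + np.exp(-pooled_logit))
        print("pooled sensitivity: %.3f" % pooled_sens)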

  15. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software that offers the best compromise between "scalability" and "ease-of-use." The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT's inclusive framework supports Message Passing Interface (MPI) parallelism as well as task farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others, in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
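
    The MPI-parallel Python pattern mentioned above can be sketched in a few lines with mpi4py (an illustrative task-farm skeleton run under mpirun, not UV-CDAT code):

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        timesteps = list(range(120))               # e.g. monthly fields
        mine = timesteps[rank::size]               # round-robin task farming

        def analyze(t):
            return t * t                           # stand-in for a real analysis step

        partial = [analyze(t) for t in mine]
        gathered = comm.gather(partial, root=0)    # collect results on rank 0
        if rank == 0:
            print("processed", sum(len(p) for p in gathered), "timesteps")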

  16. A Tool for Qualitative Causal Reasoning On Complex Systems

    Directory of Open Access Journals (Sweden)

    Tahar Guerram

    2010-11-01

    Full Text Available A cognitive map, also called a mental map, is a representation and reasoning model for causal knowledge. It is a directed, labeled, cyclic graph whose nodes represent causes or effects and whose arcs represent causal relations between these nodes, such as "increases", "decreases", "supports", and "disadvantages". A cognitive map represents the beliefs (knowledge) we hold about a given domain of discourse and is useful as a means of decision-making support. There are several types of cognitive maps, but the most used are fuzzy cognitive maps. The latter treat the cases of existence and nonexistence of relations between nodes but do not deal with the case where these relations are indeterminate. Neutrosophic cognitive maps, proposed by F. Smarandache, make it possible to take this indetermination into account and thus constitute an extension of fuzzy cognitive maps. This article proposes a modeling and reasoning tool for complex systems based on neutrosophic cognitive maps. In order to evaluate our work, we applied our tool to a medical case, the biological process of viral infection.
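
    For orientation, here is a minimal sketch of one inference loop of an ordinary fuzzy cognitive map (the three-node map, weights, and sigmoid squashing are invented; a neutrosophic map would additionally carry an indeterminacy component on edges, which this sketch does not model):

        import numpy as np

        labels = ["viral load", "immune response", "symptoms"]
        # W[i, j]: signed causal weight of node i on node j (invented)
        W = np.array([[0.0,  0.6, 0.8],
                      [-0.7, 0.0, 0.0],
                      [0.0,  0.0, 0.0]])

        def step(a):
            return 1.0 / (1.0 + np.exp(-(a @ W)))   # sigmoid squashing

        a = np.array([1.0, 0.0, 0.0])               # activate "viral load" only
        for _ in range(20):                          # iterate to a fixed point
            a = step(a)
        print(dict(zip(labels, np.round(a, 3))))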

  17. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    Science.gov (United States)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.
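
    The kind of composite-residual summary described above can be sketched in a few lines; the column layout of the input file below is a hypothetical stand-in for the actual OVERFLOW history files.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical layout: columns are iteration, grid index, L2 residual
data = np.loadtxt("resid.dat")
iters, grids, resid = data[:, 0], data[:, 1].astype(int), data[:, 2]

steps = np.unique(iters)
composite = [resid[iters == s].max() for s in steps]  # worst grid per step

plt.semilogy(steps, composite)
plt.xlabel("iteration")
plt.ylabel("max L2 residual over grids")
plt.title("composite convergence history")
plt.savefig("composite_residual.png")
```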

  18. Computational Aeroacoustic Analysis System Development

    Science.gov (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment, and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance of characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high-fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide the high temporal and spatial accuracy required for aeroacoustic calculations through the development of a high-order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set-up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high-order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values onto the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed
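
    The interpolation hand-off between the two solvers can be illustrated as follows; the grids and the synthetic pressure field are assumptions for the sketch, since CAAS's actual interpolator is not described in detail here.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Flow grid: fine 2-D mesh carrying a synthetic pressure field
x = np.linspace(0.0, 1.0, 201)
y = np.linspace(0.0, 1.0, 201)
X, Y = np.meshgrid(x, y, indexing="ij")
p_flow = np.sin(4 * np.pi * X) * np.cos(2 * np.pi * Y)

interp = RegularGridInterpolator((x, y), p_flow)

# Acoustic grid: coarser mesh where the linearized Euler solve would run
xa = np.linspace(0.0, 1.0, 51)
ya = np.linspace(0.0, 1.0, 51)
pts = np.array([[xi, yi] for xi in xa for yi in ya])
p_acoustic = interp(pts).reshape(len(xa), len(ya))
print(p_acoustic.shape)  # (51, 51) source term for the acoustic solver
```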

  19. CoolPack – Simulation tools for refrigeration systems

    DEFF Research Database (Denmark)

    Jakobsen, Arne; Rasmussen, Bjarne D.; Andersen, Simon Engedal

    1999-01-01

    CoolPack is a collection of programs used for energy analysis and optimisation of refrigeration systems. CoolPack is developed at the Department of Energy Engineering at the Technical University of Denmark. The Danish Energy Agency finances the project. CoolPack is freeware and can be downloaded...
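
    As a flavor of the energy-analysis arithmetic such tools automate, here is a back-of-envelope cooling COP estimate; the Carnot-with-efficiency model and the numbers are illustrative only, not CoolPack's refrigerant-property calculations.

```python
def cop_cooling(t_evap_c: float, t_cond_c: float, efficiency: float = 0.5) -> float:
    """Ideal (Carnot) refrigeration COP degraded by an assumed efficiency."""
    t_evap = t_evap_c + 273.15   # K
    t_cond = t_cond_c + 273.15   # K
    carnot = t_evap / (t_cond - t_evap)
    return efficiency * carnot

# Assumed evaporation/condensation temperatures for a typical case
print(f"COP ~ {cop_cooling(-10.0, 35.0):.2f}")
```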

  20. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    Science.gov (United States)

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.
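
    Step (d), checking data accuracy, is the most mechanical of these and easy to sketch: compare two independently entered extraction spreadsheets cell by cell and list disagreements for adjudication. File names and columns below are hypothetical.

```python
import pandas as pd

a = pd.read_csv("extraction_rater1.csv", index_col="study_id")
b = pd.read_csv("extraction_rater2.csv", index_col="study_id")

# Cell-by-cell disagreement map; treat NaN == NaN as agreement
mismatch = (a != b) & ~(a.isna() & b.isna())
for study, row in mismatch.iterrows():
    for col in row.index[row]:
        print(f"{study}/{col}: rater1={a.at[study, col]!r} rater2={b.at[study, col]!r}")
```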

  1. The Virtual Habitat - a tool for Life Support Systems optimization

    Science.gov (United States)

    Czupalla, Markus; Dirlich, Thomas; Harder, Jan; Pfeiffer, Matthias

    In the course of designing Life Support Systems (LSS), a great multitude of concepts for, and various combinations of, subsystems and components are developed. In order to find an optimal LSS solution, and thus the right combination of subsystems, the parameters defining the optimization itself have to be determined. The often-used Equivalent System Mass (ESM) based trade-study approach for life support systems is well suited for phase A conceptual design evaluations. The ESM approach allows an efficient evaluation of LSS on a component or subsystem level. The necessary next step in the process is the design, evaluation and optimization of the LSS on a system level. For system-level LSS design, a classic ESM-based trade study does not seem able to provide the information necessary to evaluate a concept correctly. Important decisive criteria such as system stability, controllability and effectiveness are not represented in the ESM approach. These parameters directly and decisively impact the scientific efficiency of the crew, and thereby the mission as a whole. Thus, for system-level optimization these criteria must be included alongside the ESM in a new integral optimization method. In order to apply such an integral criterion, dynamic modeling of most of the involved LSS subsystems, especially of the human crew, is necessary. Only then does the required information about the efficiency of the LSS over time, e.g. the system's stability, become available. In an effort to establish a dynamic simulation environment for habitats in extreme environmental conditions, the "Virtual Habitat" tool is being developed by the Human Spaceflight Group of the Technische Universität München (TUM). The paper presented here discusses the concept of the Virtual Habitat simulation. It discusses in what way the simulation tool enables a prediction of system characteristics and the information demanded by an integral optimization criterion. In general the
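
    For context, the ESM figure of merit that the integral criterion would extend is itself a simple weighted sum; a minimal sketch follows, using the usual form ESM = M + V·Veq + P·Peq + C·Ceq + CT·D·Teq, with placeholder equivalence factors rather than mission-specific values.

```python
def esm(mass_kg, volume_m3, power_kw, cooling_kw, crewtime_h_per_wk, duration_wk,
        v_eq=66.7, p_eq=87.0, c_eq=60.0, t_eq=1.0):
    """Equivalent System Mass; equivalence factors here are placeholders."""
    return (mass_kg
            + volume_m3 * v_eq          # kg per m^3 of pressurized volume
            + power_kw * p_eq           # kg per kW of power infrastructure
            + cooling_kw * c_eq         # kg per kW of heat rejection
            + crewtime_h_per_wk * duration_wk * t_eq)  # kg per crew-hour

# Assumed subsystem figures, purely for illustration
print(f"ESM = {esm(500, 2.0, 1.5, 1.5, 3.0, 120):.0f} kg-equivalent")
```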

  2. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are distinct subclasses of users by conducting a latent class analysis. The user behaviour of 9,528 online gamblers who voluntarily used an RG tool was analysed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had high initial usage and low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that the classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a greater extent and were found to have a greater risk of excessive gambling than the other classes.
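
    The clustering idea can be illustrated with a mixture model over the three observed usage variables; note that proper latent class analysis models categorical indicators, so the Gaussian mixture below is only a stand-in, and the data are simulated rather than the study's 9,528 users.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated columns: visits, self-tests, advice items used
light = rng.poisson([1, 0.2, 0.1], size=(500, 3))
heavy = rng.poisson([8, 3.0, 2.0], size=(100, 3))
X = np.vstack([light, heavy]).astype(float)

gm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gm.predict(X)
for k in range(2):
    print(f"class {k}: n={np.sum(labels == k)}, mean usage={gm.means_[k].round(2)}")
```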

  3. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    Science.gov (United States)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS; Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.
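
    The kernel of the anvil threat corridor estimate — advecting the anvil downwind with the average wind in the configurable pressure layer — can be sketched as follows; the wind values are invented for illustration.

```python
import numpy as np

# Hypothetical model winds (m/s) at the pressure levels of the layer
levels_mb = np.array([300, 250, 200, 150])   # configurable layer, for context
u = np.array([22.0, 26.0, 30.0, 28.0])       # eastward component
v = np.array([5.0, 7.0, 9.0, 8.0])           # northward component

u_bar, v_bar = u.mean(), v.mean()            # layer-average wind
speed = np.hypot(u_bar, v_bar)

for hours in (1, 2, 3):
    km = speed * 3600 * hours / 1000.0
    print(f"anvil extent after {hours} h: {km:.0f} km downwind")
```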

  4. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    Science.gov (United States)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  5. Visualization tool. 3DAVS and polarization-type VR system

    Energy Technology Data Exchange (ETDEWEB)

    Takeda, Yasuhiro [Fujitsu Tokushima Systems Engineering Limited, Tokushima (Japan); Ueshima, Yutaka [Japan Atomic Energy Research Inst., Kizu, Kyoto (Japan)

    2003-01-01

    In the visualization of simulation data across advanced research fields, what is most often used in reports and presentations of research results has remained at the stage of still pictures or 2-dimensional animations, despite the recent abundance of visualization software. With recent progress in computational environments, however, more complicated phenomena can be computed so readily that the results increasingly need to be both comprehensible and intelligible. This inevitably requires animation rather than still pictures, and 3-dimensional (virtual reality) displays rather than 2-dimensional ones. In this report, two visualization tools, 3DAVS and a polarization-type VR system, are described as methods of presenting data after visualization processing. (author)

  6. TAQL: A Problem Space Tool for Expert System Development.

    Science.gov (United States)

    1992-05-01

    tools developed for use in Round 3. Prior to building the tools, the average fix time for errors that would have been catchable was nearly identical to the average fix time for errors that would not have been catchable. After building the tools, uncatchable errors took three times longer to fix than catchable errors. I conclude that the model tools are highly effective for catching space and data model errors and that this translated into reduced

  7. Microbiology and Molecular Biology Tools for Biogas Process Analysis, Diagnosis and Control.

    Science.gov (United States)

    Lebuhn, Michael; Weiß, Stefan; Munk, Bernhard; Guebitz, Georg M

    2015-01-01

    Many biotechnological processes, such as biogas production or defined biotransformations, are carried out by microorganisms or tightly cooperating microbial communities. Process breakdown is the maximum credible accident for the operator. Any time savings that can be provided by suitable early-warning systems, allowing for specific countermeasures, are of great value. Process disturbance, frequently due to nutritional shortcomings, malfunction or operational deficits, is conventionally evidenced by process chemistry parameters. However, knowledge of systems microbiology and its function has increased substantially in the last two decades, and molecular biology tools, most of which are directed against nucleic acids, have been developed to analyze and diagnose the process. Some of these systems have been shown to indicate changes in process status considerably earlier than the conventionally applied process chemistry parameters. This is reasonable because they assess the triggering catalyst directly: activity changes of the microbes that perform the reaction. These molecular biology tools thus have the potential to add to and improve the established process diagnosis system. This chapter deals with the current state of the art of biogas process analysis in practice, and introduces molecular biology tools that have been shown to be of particular value in complementing the current systems of process monitoring and diagnosis, with emphasis on nucleic acid targeted molecular biology systems.

  8. IT Tools for Foresight: The Integrated Insight and Response System of Deutsche Telekom Innovation Laboratories

    DEFF Research Database (Denmark)

    Rohrbeck, René; Thom, Nico; Arnold, Heinrich

    2015-01-01

    The overall system consists of a tool for scanning for weak signals of change (PEACOQ Scouting Tool), a tool for collecting internal ideas (PEACOQ Gate 0.5), and a tool for triggering organizational responses (Foresight Landing Page). Particularly the link to innovation management and R&D strategy...

  9. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data", published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1], but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. The weights given to the spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
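
    For the four-spacecraft case that this work generalizes, the reciprocal vectors and the barycentric gradient estimator can be sketched directly; the positions and the linear test field below are synthetic.

```python
import numpy as np

r = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.1, 0.0],
              [0.2, 1.0, 0.1],
              [0.1, 0.2, 1.0]])          # spacecraft positions (tetrahedron)

def reciprocal_vectors(r):
    """k_a is normal to the face opposite spacecraft a, scaled so that
    k_a . (r_a - r_b) = 1 for any b on that face."""
    k = np.zeros_like(r)
    for a in range(4):
        b, c, d = [i for i in range(4) if i != a]
        n = np.cross(r[c] - r[b], r[d] - r[b])
        k[a] = n / np.dot(r[a] - r[b], n)
    return k

k = reciprocal_vectors(r)

g_true = np.array([2.0, -1.0, 0.5])      # gradient of a synthetic linear field
f = r @ g_true + 3.0                     # field sampled at the four spacecraft

g_est = (k * f[:, None]).sum(axis=0)     # barycentric gradient estimate
print(np.allclose(g_est, g_true))        # True for an exactly linear field
```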

  10. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost-risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users (scientists, mission designers or program managers); and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad-hoc constellations, and its cost model represents an aggregate model consisting of
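
    A toy version of the trade-space enumeration: iterate over Walker t/p/f combinations and rank them with placeholder cost and coverage scores. The scoring functions are illustrative assumptions, not TAT-C's models.

```python
from itertools import product

def cost(t):
    # Hypothetical: cost grows linearly with satellite count
    return 5.0 + 2.0 * t

def coverage(t, p):
    # Hypothetical: more satellites and planes improve coverage
    return 1.0 - 1.0 / (t * p ** 0.5)

designs = []
for t, p in product(range(2, 13), range(1, 5)):
    if t % p:
        continue                # Walker requires equal satellites per plane
    for f in range(p):          # relative phasing between planes
        designs.append((coverage(t, p) / cost(t), t, p, f))

best = max(designs)
print(f"best score={best[0]:.4f} for Walker t/p/f = {best[1]}/{best[2]}/{best[3]}")
```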

  11. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperatures and pressures in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, pipe coatings, free spans, etc. The buckling initiation force is sensitive to small changes in any initial geometric out-of-straightness, so modeling the as-laid state of the pipeline is an important part of the design process. Recently, dedicated finite element programs have been created, making modeling of the offshore environment more convenient than has been the case with general-purpose finite element software. The present paper compares thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general-purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)
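
    The driver of the phenomenon is easy to quantify: the fully restrained thermal compressive force N = E·A·α·ΔT. The sketch below evaluates it for an assumed steel pipe; the dimensions and temperature rise are illustrative, not from the paper's cases.

```python
import math

E = 207e9               # Pa, Young's modulus of steel
alpha = 1.17e-5         # 1/K, thermal expansion coefficient
D, t = 0.3239, 0.0127   # m, outer diameter and wall thickness (12-3/4" pipe)
dT = 80.0               # K, assumed operating temperature rise

A = math.pi / 4 * (D**2 - (D - 2 * t) ** 2)   # steel cross-section area
N = E * A * alpha * dT
print(f"restrained thermal force ~ {N / 1e6:.1f} MN (compressive)")
```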

  12. Bioanalyzer: An Efficient Tool for Sequence Retrieval, Analysis and Manipulation

    Directory of Open Access Journals (Sweden)

    Hassan Tariq

    2010-12-01

    Full Text Available Bioanalyzer provides a combination of tools that have not previously been assembled together. The software offers a list of tools that can be important for different researchers. The aim of developing this kind of software is to provide a unique set of tools on one platform, in a more efficient way than the software or web tools currently available. As a stand-alone application, it saves the time and effort of locating individual tools on the net. A flexible design makes it easy to expand in the future. We will make it available publicly soon.
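
    As a generic illustration of the sequence-manipulation operations such a tool bundles, here are two staples in plain Python; this makes no claim about Bioanalyzer's own implementation.

```python
COMP = str.maketrans("ACGTacgt", "TGCAtgca")

def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a DNA string."""
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s)

def reverse_complement(seq: str) -> str:
    """Complement each base, then reverse the strand."""
    return seq.translate(COMP)[::-1]

dna = "ATGGCCATTGTAATGGGCCGCTG"
print(f"GC = {gc_content(dna):.2%}, revcomp = {reverse_complement(dna)}")
```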

  13. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.;

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models...... the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material...
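
    The crack-growth half of such a life prediction is classically a Paris-law integration from an initiation crack size to a critical size; the sketch below uses assumed material constants and stress range, not the paper's measured die-steel data.

```python
import math

C, m = 1e-11, 3.0          # assumed Paris constants (SI: m/cycle, MPa*sqrt(m))
d_sigma = 400.0            # MPa, assumed cyclic stress range at the die surface
Y = 1.12                   # geometry factor for a shallow surface crack
a, a_crit = 0.2e-3, 2.0e-3 # m, initiation and critical crack depths

cycles, da = 0.0, 1e-6     # simple forward integration over crack depth
while a < a_crit:
    dK = Y * d_sigma * math.sqrt(math.pi * a)   # stress intensity range
    cycles += da / (C * dK ** m)                # dN = da / (C * dK^m)
    a += da

print(f"predicted crack-growth life ~ {cycles:.0f} cycles")
```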

  14. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
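
    Several of the processing steps listed above can be expressed compactly with ObsPy, used here only as a stand-in (SeismicCanvas is its own C++/Qt application); the miniSEED file name is hypothetical.

```python
from obspy import read

st = read("example.mseed")          # also handles SAC and other formats
st.detrend("linear")                # remove mean and trend
st.taper(max_percentage=0.05)       # cosine taper before filtering
st.filter("bandpass", freqmin=0.5, freqmax=5.0)
st.plot(outfile="record.png")       # quick record plot
```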

  15. Abstract Interfaces for Data Analysis —Component Architecture for Data Analysis Tools

    Institute of Scientific and Technical Information of China (English)

    G. Barrand; P. Binko; et al.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of each component individually. The interfaces have been defined in Java and C++, and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
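
    The decoupling idea can be sketched with abstract interfaces whose names echo the AIDA categories; the Python rendering below illustrates the design principle only and is not the AIDA API itself.

```python
from abc import ABC, abstractmethod
from typing import List

class IHistogram1D(ABC):
    @abstractmethod
    def fill(self, x: float, weight: float = 1.0) -> None: ...
    @abstractmethod
    def bin_heights(self) -> List[float]: ...

class IPlotter(ABC):
    @abstractmethod
    def plot(self, hist: IHistogram1D) -> None: ...

class SimpleHistogram(IHistogram1D):
    def __init__(self, nbins: int, lo: float, hi: float):
        self._bins = [0.0] * nbins
        self._lo, self._hi = lo, hi
    def fill(self, x: float, weight: float = 1.0) -> None:
        if self._lo <= x < self._hi:
            i = int((x - self._lo) / (self._hi - self._lo) * len(self._bins))
            self._bins[i] += weight
    def bin_heights(self) -> List[float]:
        return list(self._bins)

class TextPlotter(IPlotter):
    # Depends only on the abstract interface, so any IHistogram1D works.
    def plot(self, hist: IHistogram1D) -> None:
        for i, h in enumerate(hist.bin_heights()):
            print(f"bin {i:2d} | {'#' * int(h)}")

h = SimpleHistogram(10, 0.0, 1.0)
for v in (0.05, 0.15, 0.15, 0.72):
    h.fill(v)
TextPlotter().plot(h)
```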

  16. Analysis of topology changes in multibody systems

    OpenAIRE

    2009-01-01

    Mechanical systems with time-varying topology appear frequently in natural and human-made systems. The nature of topology transitions is a key characteristic in the functioning of such systems. In this paper, a concept to decouple kinematic and kinetic quantities at the time of topology transition is used. This approach is based on the use of impulsive bilateral constraints and is a useful tool for the analysis of energy redistribution and velocity change when these constraint...
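
    A minimal sketch of the impulsive-constraint computation: at the transition, solve the KKT system M v+ + J^T L = M v-, J v+ = 0 for the post-transition velocities and the constraint impulse. The two-mass example is an illustrative assumption.

```python
import numpy as np

M = np.diag([2.0, 1.0])            # masses of two bodies moving on a line
v_minus = np.array([3.0, 0.0])     # velocities just before coupling
J = np.array([[1.0, -1.0]])        # new bilateral constraint: equal velocities

# Assemble and solve the KKT system for the velocity jump
n, m = M.shape[0], J.shape[0]
A = np.block([[M, J.T], [J, np.zeros((m, m))]])
b = np.concatenate([M @ v_minus, np.zeros(m)])
sol = np.linalg.solve(A, b)
v_plus, impulse = sol[:n], sol[n:]

print("v+ =", v_plus)              # both bodies at the common velocity 2.0
print("impulse =", impulse)        # momentum transferred at the transition
```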

  17. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  18. Engineering intracellular active transport systems as in vivo biomolecular tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bachand, George David; Carroll-Portillo, Amanda

    2006-11-01

    Active transport systems provide essential functions in terms of cell physiology and metastasis. These systems, however, are also co-opted by invading viruses, enabling directed transport of the virus to and from the cell's nucleus (i.e., the site of virus replication). Based on this concept, fundamentally new approaches for interrogating and manipulating the inner workings of living cells may be achievable by co-opting Nature's active transport systems as an in vivo biomolecular tool. The overall goal of this project was to investigate the ability to engineer kinesin-based transport systems for in vivo applications, specifically the collection of effector proteins (e.g., transcriptional regulators) within single cells. In the first part of this project, a chimeric fusion protein consisting of kinesin and a single chain variable fragment (scFv) of an antibody was successfully produced through a recombinant expression system. The kinesin-scFv retained both catalytic and antigenic functionality, enabling selective capture and transport of target antigens. The incorporation of a rabbit IgG-specific scFv into the kinesin established a generalized system for functionalizing kinesin with a wide range of target-selective antibodies raised in rabbits. The second objective was to develop methods of isolating the intact microtubule network from live cells as a platform for evaluating kinesin-based transport within the cytoskeletal architecture of a cell. Successful isolation of intact microtubule networks from two distinct cell types was demonstrated using glutaraldehyde and methanol fixation methods. This work provides a platform for inferring the ability of kinesin-scFv to function in vivo, and may also serve as a three-dimensional scaffold for evaluating and exploiting kinesin-based transport for nanotechnological applications. Overall, the technology developed in this project represents a first-step in engineering active transport system for in vivo

  19. Geothermal Prospector: Supporting Geothermal Analysis Through Spatial Data Visualization and Querying Tools

    Energy Technology Data Exchange (ETDEWEB)

    Getman, Daniel; Anderson, Arlene; Augustine, Chad

    2015-09-02

    Determining opportunities for geothermal energy can involve a significant investment in data collection and analysis. Analysts within a variety of industry and research domains collect and use these data; however, determining the existence and availability of data needed for a specific analysis activity can be challenging and represents one of the initial barriers to geothermal development [2]. This paper describes the motivating factors involved in designing and building the Geothermal Prospector application, how it can be used to reduce risks and costs related to geothermal exploration, and where it fits within the larger collection of tools that is the National Geothermal Data System (NGDS) [5].
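
    The querying side of such a tool reduces, at its simplest, to combined spatial and attribute filters; the sketch below screens hypothetical well records by bounding box and temperature, with a made-up CSV layout rather than the NGDS schema.

```python
import pandas as pd

wells = pd.read_csv("wells.csv")   # columns: name, lon, lat, temp_c, depth_m

lon_min, lon_max = -120.0, -114.0  # assumed area of interest
lat_min, lat_max = 36.0, 42.0

hits = wells[
    wells.lon.between(lon_min, lon_max)
    & wells.lat.between(lat_min, lat_max)
    & (wells.temp_c >= 150.0)      # screening threshold for this sketch
]
print(hits.sort_values("temp_c", ascending=False).head())
```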

  20. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    Science.gov (United States)

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools that provide reliable support for decision-making processes. In this paper, indicators such as number of residents, population age, urban life expectancy, and total municipal solid waste were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition, considering the Iasi, Romania case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated, and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction.
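
    The S-curve trend model named above is a logistic fit; the sketch below fits one to a synthetic annual series with SciPy. The data points are invented for illustration, not the Iasi measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def s_curve(t, K, r, t0):
    """Logistic trend: carrying capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(2000, 2016)
waste = np.array([61, 64, 68, 73, 79, 85, 90, 96, 100, 104,
                  107, 110, 112, 113, 114, 115], dtype=float)  # kt/yr, synthetic

popt, _ = curve_fit(s_curve, years, waste, p0=[120.0, 0.3, 2007.0])
K, r, t0 = popt
print(f"K={K:.1f} kt/yr, r={r:.2f}, midpoint={t0:.1f}")
print(f"2020 forecast: {s_curve(2020, *popt):.1f} kt/yr")
```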