WorldWideScience

Sample records for analysis system tool

  1. Two energy system analysis - tools

    DEFF Research Database (Denmark)

    Lund, Henrik; Andersen, Anders N.; Antonoff, Jayson

    2004-01-01

The chapter illustrates some of the different types of problems that must be solved when analysing energy systems...

  2. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
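For a SISO case, the Bode and singular-value frequency-response evaluations listed above can be sketched in plain Python/NumPy (a generic illustration with an assumed second-order plant, not MATRIXx's actual interface):

```python
import numpy as np

# Hypothetical SISO plant (an assumed example, not from the paper):
#   G(s) = 10 / (s^2 + 2 s + 10)
num = [10.0]
den = [1.0, 2.0, 10.0]

w = np.logspace(-1, 2, 200)          # frequency grid, rad/s
s = 1j * w
G = np.polyval(num, s) / np.polyval(den, s)

mag_db = 20.0 * np.log10(np.abs(G))  # Bode magnitude, dB
phase_deg = np.degrees(np.angle(G))  # Bode phase, degrees

# For a SISO system the singular value of the frequency response is
# just the gain |G(jw)|; for a MIMO system one would instead take the
# SVD of the frequency-response matrix at each frequency.
sigma = np.abs(G)

print(round(float(sigma[0]), 2))     # DC gain is 10/10 = 1
```

The MIMO extension replaces the scalar gain with `numpy.linalg.svd` applied per frequency point.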

  3. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    The applications of ultrasonic vibrations for material removal processes exist predominantly in the area of vertical processing of hard and brittle materials. This is because the power generated by vertical vibrating oscillators generates the greatest direct penetration, in order to conduct material removal on workpieces by grains. However, for milling processes, vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to make use of ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to generate best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  4. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  5. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...

  6. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  7. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a database for storing system designs and results of analysis.

  8. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient model checking, model-checking tools, symbolic model checking, visual tools, and verification of critical systems.

  9. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  10. Shape Analysis for Complex Systems Using Information Geometry Tools.

    OpenAIRE

    Sanctis, Angela De

    2012-01-01

In this paper we use Information Geometry tools to statistically model patterns arising in complex systems and to describe their evolution in time. In particular, we focus on the analysis of images with medical applications and propose an index that can estimate the level of self-organization and predict future problems that may occur in these systems.

  11. MEASUREMENT UNCERTAINTY ANALYSIS OF DIFFERENT CNC MACHINE TOOLS MEASUREMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    Leszek Semotiuk

    2013-09-01

In this paper the results of measurement uncertainty tests conducted with a Heidenhain TS 649 probe on CNC machine tools are presented. In addition, random and systematic measurement errors were identified and analyzed. Analyses were performed on the basis of measurements taken on two different CNC machine tools with Heidenhain control systems. The evaluated errors are discussed and compensation procedures are proposed. The results are presented in tables and figures.
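The Type A (statistical) part of such an uncertainty evaluation can be sketched as follows; the probe readings below are illustrative numbers, not data from the paper:

```python
import math
import statistics

# Hypothetical repeated probe readings of one feature, in mm
# (illustrative values, not measurements from the paper).
readings = [25.002, 25.004, 24.999, 25.003, 25.001, 25.000, 25.002, 25.003]

n = len(readings)
mean = statistics.fmean(readings)
s = statistics.stdev(readings)   # sample standard deviation of the readings
u_a = s / math.sqrt(n)           # Type A standard uncertainty of the mean
U = 2.0 * u_a                    # expanded uncertainty, coverage factor k = 2

print(f"mean = {mean:.4f} mm, u_A = {u_a*1000:.2f} um, U(k=2) = {U*1000:.2f} um")
```

Systematic errors, as the paper notes, are handled separately, typically by calibration and compensation rather than by averaging.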

  12. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
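The n-factor combinatorial idea can be illustrated with a toy pairwise (n = 2) generator; the parameter names and values below are hypothetical, and this greedy sketch is not the tool's actual implementation:

```python
from itertools import combinations, product

# Toy parameter space (hypothetical names and values, not from the tool)
params = {
    "thrust_bias": [-1, 0, 1],
    "sensor_noise": ["low", "high"],
    "wind": ["calm", "gusty"],
}
names = list(params)

# Every parameter-pair value combination that 2-factor coverage must hit
required = set()
for a, b in combinations(names, 2):
    for va, vb in product(params[a], params[b]):
        required.add(((a, va), (b, vb)))

# Greedy selection: repeatedly pick the full-factorial case that covers
# the most still-uncovered pairs (exhaustive scoring is fine at toy size).
all_cases = [dict(zip(names, vs)) for vs in product(*params.values())]
cases, covered = [], set()
while covered != required:
    best = max(all_cases, key=lambda c: len(
        {((a, c[a]), (b, c[b])) for a, b in combinations(names, 2)} - covered))
    cases.append(best)
    covered |= {((a, best[a]), (b, best[b])) for a, b in combinations(names, 2)}

print(len(all_cases), len(cases))  # pairwise needs fewer cases than 3*2*2 = 12
```

At realistic parameter counts the saving over the full factorial grows dramatically, which is the motivation for combining Monte Carlo sampling with n-factor coverage.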

  13. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  14. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Science.gov (United States)

    Truong, Long V.

    1998-01-01

The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PCs), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle. The tool operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  15. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architectures and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems and with the analysis of results is assumed on the part of the intended user group. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  16. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  17. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  18. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  19. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.
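A single lumped thermal node gives the flavor of the quasi-steady-state and transient models such blocksets contain; this Python sketch (with assumed, illustrative numbers) stands in for a Simulink block:

```python
# Single lumped thermal node: m*c * dT/dt = Q_in - U*A * (T - T_amb)
# Forward-Euler integration; all numbers are illustrative assumptions.
m_c = 5000.0      # thermal mass, J/K
q_in = 2000.0     # heat load, W
ua = 50.0         # overall conductance to ambient, W/K
t_amb = 288.0     # ambient temperature, K

T = t_amb         # start at ambient temperature
dt = 1.0          # time step, s
for _ in range(36000):            # 10 h, far longer than the 100 s time constant
    T += dt * (q_in - ua * (T - t_amb)) / m_c

# Steady state balances Q_in = U*A*(T - T_amb), so T -> T_amb + Q_in/(U*A)
print(round(T, 1))   # 288 + 2000/50 = 328.0 K
```

Real aircraft models couple many such nodes to heat exchangers, pumps, and fuel tanks, but each block reduces to an energy balance of this form.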

  20. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  1. Cellular barcoding tool for clonal analysis in the hematopoietic system.

    Science.gov (United States)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J C; de Haan, Gerald; Bystrykh, Leonid V

    2010-04-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting or polymerase chain reaction-based methods. Although these methods are useful in principle, they generally provide a low-resolution, biased, and incomplete assessment of clonality. To overcome those limitations, we labeled retroviral vectors with random sequence tags or "barcodes." On integration, each vector introduces a unique, identifiable, and heritable mark into the host cell genome, allowing the clonal progeny of each cell to be tracked over time. By coupling the barcoding method to a sequencing-based detection system, we could identify major and minor clones in 2 distinct cell culture systems in vitro and in a long-term transplantation setting. In addition, we demonstrate how clonal analysis can be complemented with transgene expression and integration site analysis. This cellular barcoding tool permits a simple, sensitive assessment of clonality and holds great promise for future gene therapy protocols in humans, and any other applications when clonal tracking is important.
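The core barcoding idea (unique heritable tags read out by counting) can be sketched as follows; this toy simulation is illustrative, not the authors' analysis pipeline:

```python
import random
from collections import Counter

random.seed(7)
BASES = "ACGT"

def random_barcode(length=12):
    # Random sequence tag, as a barcoded vector would introduce on integration
    return "".join(random.choice(BASES) for _ in range(length))

# "Transduce" 50 founder cells, each acquiring a unique heritable barcode
founders = [random_barcode() for _ in range(50)]

# Simulate uneven clonal expansion: each founder's progeny count varies
population = []
for bc in founders:
    population.extend([bc] * random.randint(1, 200))

# Sequencing-based readout: barcode abundances identify major vs. minor clones
counts = Counter(population)
major = counts.most_common(3)
print(len(counts), [size for _, size in major])
```

With 12-base tags the barcode space (4^12, about 16.8 million) dwarfs the number of founders, so collisions are rare and each clone is tracked unambiguously.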

  2. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

The paper introduces SaTool, a tool for structural analysis; the use of the Matlab(R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  3. Tool for Sizing Analysis of the Advanced Life Support System

    Science.gov (United States)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are dynamic or steady-state in nature. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.

  4. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
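One standard measure such a distortion-analysis tool can report is total harmonic distortion (THD). A minimal NumPy sketch, not the toolkit's own code, with an assumed device that adds a 10% third harmonic to a test tone:

```python
import numpy as np

fs, n = 48000, 48000                 # 1 s of audio at 48 kHz (1 Hz bin spacing)
f0 = 1000                            # test-tone fundamental, Hz
t = np.arange(n) / fs

# Hypothetical distorting device: adds a 3rd harmonic at 10% amplitude
y = np.sin(2 * np.pi * f0 * t) + 0.1 * np.sin(2 * np.pi * 3 * f0 * t)

spec = np.abs(np.fft.rfft(y)) / (n / 2)   # single-sided amplitude spectrum
fund = spec[f0]                            # fundamental sits exactly on bin f0
harmonics = [spec[k * f0] for k in range(2, 6)]
thd = np.sqrt(sum(h * h for h in harmonics)) / fund

print(f"THD = {thd * 100:.1f}%")          # recovers the injected 10%
```

An integer number of cycles per record keeps the tones on exact FFT bins, so no window is needed; a real measurement on recorded audio would apply one.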

  5. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  6. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input stream for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation - the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design of product development process.

  7. SYSTID - A flexible tool for the analysis of communication systems.

    Science.gov (United States)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program, which is specifically structured for communication systems analysis. The SYSTID program is user oriented, so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English-language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the available outputs are presented, and proposed improvements are summarized.

  8. Design and Analysis Tools for Concurrent Blackboard Systems

    Science.gov (United States)

    McManus, John W.

    1991-01-01

    A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them. " The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.
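The blackboard model described above can be reduced to a toy sketch: independent knowledge sources fire opportunistically whenever their inputs appear on the shared blackboard. This is an illustrative minimum, not the paper's design tools:

```python
# Toy blackboard sketch (illustrative only): knowledge sources read the
# shared blackboard, contribute partial results when their inputs exist,
# and the control loop keeps activating sources until nothing changes.

def ks_tokenize(bb):
    if "text" in bb and "tokens" not in bb:
        bb["tokens"] = bb["text"].split()

def ks_count(bb):
    if "tokens" in bb and "count" not in bb:
        bb["count"] = len(bb["tokens"])

def ks_report(bb):
    if "count" in bb and "report" not in bb:
        bb["report"] = f"{bb['count']} tokens"

knowledge_sources = [ks_report, ks_count, ks_tokenize]  # order-independent

blackboard = {"text": "blackboard systems solve problems opportunistically"}
changed = True
while changed:                       # simple opportunistic control strategy
    before = dict(blackboard)
    for ks in knowledge_sources:
        ks(blackboard)               # each KS acts only when its inputs exist
    changed = blackboard != before

print(blackboard["report"])
```

Note the deliberately scrambled source order: because activation is driven by the data on the blackboard, the solution still emerges, which is exactly the property that makes concurrent execution attractive and consistency analysis hard.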

  9. Bayesian networks as a tool for epidemiological systems analysis

    OpenAIRE

    Lewis, F.I.

    2012-01-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter ...

  10. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Recently, intelligent technology systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During machining, the spindle unit is the most frequently moving and most important component, as in the automatic tool changer. The vibration detection system includes the development of hardware and software, such as the vibration meter, signal acquisition card, data processing platform, and machine control program. Because of differences between mechanical configurations and the desired characteristics, it is difficult to build such a vibration detection system directly from commercially available kits. For this reason, the system was selected for self-development research, along with a parametric study to identify parameters sufficient to represent the machine's characteristics and states. The functional parts of the system were developed in parallel. Finally, the conditions and parameters derived from the machine states and characteristics were entered into the developed system to verify its feasibility.
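
    The parametric study such a monitoring system needs often starts with a spectral feature such as the dominant vibration frequency; a hedged sketch on a synthetic accelerometer signal (the 180 Hz component, sample rate, and noise level are all invented illustration values):

```python
# Identify the dominant spindle vibration frequency with an FFT.
import numpy as np

fs = 2000.0                      # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)  # one second of samples
# Simulated accelerometer signal: a 180 Hz vibration plus measurement noise.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 180.0 * t) + 0.1 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
dominant = freqs[np.argmax(spectrum)]   # frequency bin with the largest magnitude
print(dominant)  # → 180.0
```

    A real system would trend features like this against thresholds learned from the machine's normal states.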

  11. Betweenness as a Tool of Vulnerability Analysis of Power System

    Science.gov (United States)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-06-01

    Complex network theory finds application in the analysis of power grids, as the two share some common characteristics. Using this theory, critical elements in a power network can be identified. Since the vulnerabilities of the network's elements determine the vulnerability of the network as a whole, in this paper the vulnerability of each element is studied using two complex-network measures: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines and the PTDF matrix are included. The standard IEEE 57-bus system has been studied using the above indices, and conclusions are discussed.
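
    The purely topological index is straightforward to compute: score each bus by the fraction of shortest paths that pass through it. The sketch below uses a hypothetical 5-bus network (not the IEEE 57-bus case) and brute-force path enumeration, which is fine only for toy graphs.

```python
# Betweenness centrality by brute-force shortest-path enumeration.
from itertools import combinations
from collections import deque

edges = [(1, 2), (2, 3), (2, 4), (4, 5)]   # hypothetical 5-bus topology
nodes = sorted({n for e in edges for n in e})
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def shortest_paths(s, t):
    # All shortest s-t paths by breadth-first path search (toy-sized graphs only).
    paths, best = [], None
    q = deque([[s]])
    while q:
        path = q.popleft()
        if best is not None and len(path) > best:
            break                       # queue lengths are nondecreasing
        if path[-1] == t:
            best = len(path)
            paths.append(path)
            continue
        for nxt in adj[path[-1]]:
            if nxt not in path:
                q.append(path + [nxt])
    return paths

score = {n: 0.0 for n in nodes}
for s, t in combinations(nodes, 2):
    paths = shortest_paths(s, t)
    for p in paths:
        for mid in p[1:-1]:             # credit interior nodes only
            score[mid] += 1.0 / len(paths)

print(max(score, key=score.get))  # → 2 (the hub bus)
```

    Extended betweenness would replace the hop-count paths with electrically weighted ones (PTDF entries, flow limits), as the abstract describes.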

  12. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
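
    The levelized-cost side of such a model boils down to annualizing capital with a capital recovery factor and adding O&M and fuel per MWh generated. A minimal sketch with entirely hypothetical inputs (these are not Power L-CAT or NETL values):

```python
# Back-of-envelope levelized cost of electricity ($/MWh).

def lcoe(capital, fixed_om, fuel, heat_rate, cap_factor, rate, years, mw):
    """Annualized capital + fixed O&M + fuel, divided by annual generation."""
    mwh_per_year = mw * 8760 * cap_factor
    # Capital recovery factor annualizes the overnight capital cost.
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    annual_capital = capital * crf
    annual_fuel = mwh_per_year * heat_rate * fuel  # heat_rate MMBtu/MWh, fuel $/MMBtu
    return (annual_capital + fixed_om + annual_fuel) / mwh_per_year

cost = lcoe(capital=550e6, fixed_om=12e6, fuel=4.0, heat_rate=6.8,
            cap_factor=0.87, rate=0.08, years=30, mw=500)
print(round(cost, 1))  # → 43.2 ($/MWh)
```

    Sensitivity analysis of the kind the abstract mentions is then just re-evaluating this function over ranges of the inputs.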

  13. Bayesian networks as a tool for epidemiological systems analysis

    Science.gov (United States)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but additionally to separate these, empirically, into those directly and those indirectly dependent on one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.
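
    The structure-discovery idea can be miniaturized (this toy is not the additive Bayesian network machinery the paper uses): score two candidate DAGs over a pair of variables with BIC under a linear-Gaussian model and keep the better one. Real tools search an enormous DAG space with exact, MCMC, or heuristic methods.

```python
# Compare BIC of "x and y independent" vs "x -> y" on synthetic data.
import numpy as np

rng = np.random.default_rng(42)
n = 500
x = rng.standard_normal(n)
y = 2.0 * x + 0.5 * rng.standard_normal(n)    # y genuinely depends on x

def gauss_loglik(resid):
    # Maximum-likelihood Gaussian log-likelihood of a residual vector.
    var = resid.var()
    return -0.5 * len(resid) * (np.log(2 * np.pi * var) + 1)

def bic(loglik, k):
    return loglik - 0.5 * k * np.log(n)        # higher is better

# DAG A: no arc (mean and variance for each of x and y: 4 parameters).
bic_indep = bic(gauss_loglik(x - x.mean()) + gauss_loglik(y - y.mean()), 4)
# DAG B: x -> y (regress y on x: slope, intercept, residual variance; 5 total).
beta = np.polyfit(x, y, 1)
bic_edge = bic(gauss_loglik(x - x.mean()) + gauss_loglik(y - np.polyval(beta, x)), 5)
print(bic_edge > bic_indep)  # → True: the data support the x -> y arc
```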

  14. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    Science.gov (United States)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  15. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools......, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one...... hand, it facilitates the selection of proper monitoring and analysis tools for a given application or process. On the other hand, it permits the identification of potential applications for a given monitoring technique or tool. An efficient inference system based on forward as well as reverse search...

  16. Cellular barcoding tool for clonal analysis in the hematopoietic system

    NARCIS (Netherlands)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J.; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J. C.; de Haan, Gerald; Bystrykh, Leonid V.

    2010-01-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blott

  17. Design and Analysis Tools for Deployable Solar Array Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  18. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  19. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
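
    The Poiseuille benchmark is easy to reproduce independently: discretize mu u'' = dp/dx with central differences and compare against the analytic parabolic profile. The grid size and parameters below are arbitrary illustration values, not taken from the paper.

```python
# Finite-difference plane Poiseuille flow vs the analytic solution.
import numpy as np

mu, dpdx, h = 1.0e-3, -1.0, 1.0        # viscosity, pressure gradient, channel height
n = 101
y = np.linspace(0.0, h, n)
dy = y[1] - y[0]

# Solve mu * u'' = dpdx with no-slip walls u(0) = u(h) = 0.
A = np.zeros((n, n))
b = np.full(n, dpdx / mu * dy**2)
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
A[0, 0] = A[-1, -1] = 1.0
b[0] = b[-1] = 0.0                     # Dirichlet boundary rows
u = np.linalg.solve(A, b)

u_exact = -dpdx / (2 * mu) * y * (h - y)   # analytic parabola
print(np.max(np.abs(u - u_exact)))         # central differences are exact for quadratics
```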

  20. ProbFAST: Probabilistic Functional Analysis System Tool

    Directory of Open Access Journals (Sweden)

    Oliveira Thiago YK

    2010-03-01

    Background: The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges center on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual differential expression methods, which were originally designed for paired analysis. Results: We propose a web platform named ProbFAST for MDE analysis, which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulation study revealed that our method performs better than other approaches, and when applied to public expression data, we demonstrated its flexibility in obtaining relevant genes biologically associated with normal and abnormal biological processes. Conclusions: ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of sets of genes that are turned on and off in relation to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  1. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.......A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models) have been extended...

  2. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    Science.gov (United States)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

    1982-01-01

    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  3. Transient analysis of power systems solution techniques, tools and applications

    CERN Document Server

    Martinez-Velasco, J

    2014-01-01

    A comprehensive introduction and up-to-date reference to SiC power semiconductor devices covering topics from material properties to applications. Based on a number of breakthroughs in SiC material science and fabrication technology in the 1980s and 1990s, the first SiC Schottky barrier diodes (SBDs) were released as commercial products in 2001. The SiC SBD market has grown significantly since that time, and SBDs are now used in a variety of power systems, particularly switch-mode power supplies and motor controls. SiC power MOSFETs entered commercial production in 2011, providing rugged, hig

  4. Tools to Support Human Factors and Systems Engineering Interactions During Early Analysis

    Science.gov (United States)

    Thronesbery, Carroll; Malin, Jane T.; Holden, Kritina; Smith, Danielle Paige

    2006-01-01

    We describe an approach and existing software tool support for effective interactions between human factors engineers and systems engineers in early analysis activities during system acquisition. We examine the tasks performed during this stage, emphasizing those tasks where system engineers and human engineers interact. The Concept of Operations (ConOps) document is an important product during this phase, and particular attention is paid to its influences on subsequent acquisition activities. Understanding this influence helps ConOps authors describe a complete system concept that guides subsequent acquisition activities. We identify commonly used system engineering and human engineering tools and examine how they can support the specific tasks associated with system definition. We identify possible gaps in the support of these tasks, the largest of which appears to be creating the ConOps document itself. Finally, we outline the goals of our future empirical investigations of tools to support system concept definition.

  5. Using EPSAT to analyze high power systems in the space environment. [Environment Power System Analysis Tool

    Science.gov (United States)

    Kuharski, Robert A.; Jongeward, Gary A.; Wilcox, Katherine G.; Rankin, Tom R.; Roche, James C.

    1991-01-01

    The authors review the Environment Power System Analysis Tool (EPSAT) design and demonstrate its capabilities by using it to address some questions that arose in designing the SPEAR III experiment. It is shown that the rocket body cannot be driven to large positive voltages under the constraints of this experiment. Hence, attempts to measure the effects of a highly positive rocket body in the plasma environment should not be made in this experiment. It is determined that a hollow cathode will need to draw only about 50 mA to ground the rocket body. It is shown that a relatively small amount of gas needs to be released to induce a bulk breakdown near the rocket body, and this gas release should not discharge the sphere. Therefore, the experiment provides an excellent opportunity to study the neutralization of a differential charge.

  6. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  7. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already in the public literature as well as from NIST and other highly respectable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.
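
    For the cylinder, the basic heat-leak calculation reduces to steady radial conduction through a shell. A sketch with illustrative numbers (the conductivity and dimensions are invented, not KSC or NIST test data):

```python
# Textbook radial-conduction heat leak through cylindrical insulation.
import math

def cylinder_heat_leak(k, length, r_inner, r_outer, t_cold, t_warm):
    """Steady radial conduction through a cylindrical shell, in watts."""
    return 2.0 * math.pi * k * length * (t_warm - t_cold) / math.log(r_outer / r_inner)

# 1 m of pipe, 25 mm of insulation with k = 0.02 W/m-K, LN2 at 77 K inside.
q = cylinder_heat_leak(k=0.02, length=1.0, r_inner=0.050, r_outer=0.075,
                       t_cold=77.0, t_warm=293.0)
print(round(q, 1))  # → 66.9 (watts)
```

    Real cryogenic insulation has a strongly temperature-dependent effective k, which is exactly the kind of test-derived data a tool like TISTool supplies.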

  8. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  9. On sustainability assessment of technical systems. Experience from systems analysis with the ORWARE and Ecoeffect tools

    Energy Technology Data Exchange (ETDEWEB)

    Assefa, Getachew [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Chemical Engineering

    2006-06-15

    Engineering research and development work is undergoing a reorientation from focusing on specific parts of different systems to a broader perspective of systems level, albeit at a slower pace. This reorientation should be further developed and enhanced with the aim of organizing and structuring our technical systems in meeting sustainability requirements in face of global ecological threats that have far-reaching social and economic implications, which can no longer be captured using conventional approach of research. Until a list of universally acceptable, clear, and measurable indicators of sustainable development is developed, the work with sustainability metrics should continue to evolve as a relative measure of ecological, economic, and social performance of human activities in general, and technical systems in particular. This work can be done by comparing the relative performance of alternative technologies of providing the same well-defined function or service; or by characterizing technologies that enjoy different levels of societal priorities using relevant performance indicators. In both cases, concepts and methods of industrial ecology play a vital role. This thesis is about the development and application of a systematic approach for the assessment of the performance of technical systems from the perspective of systems analysis, sustainability, sustainability assessment, and industrial ecology. The systematic approach developed and characterized in this thesis advocates for a simultaneous assessment of the ecological, economic, and social dimensions of performance of technologies in avoiding sub-optimization and problem shifting between dimensions. It gives a holistic picture by taking a life cycle perspective of all important aspects. The systematic assessment of technical systems provides an even-handed assessment resulting in a cumulative knowledge. A modular structure of the approach makes it flexible enough in terms of comparing a number of

  10. User manual of the CATSS system (version 1.0) communication analysis tool for space station

    Science.gov (United States)

    Tsang, C. S.; Su, Y. T.; Lindsey, W. C.

    1983-01-01

    The Communication Analysis Tool for the Space Station (CATSS) is a FORTRAN language software package capable of predicting the communications link performance of the Space Station (SS) communication and tracking (C & T) system. An interactive software package was developed to run on DEC/VAX computers. CATSS models and evaluates the various C & T links of the SS, including modulation schemes such as Binary Phase-Shift Keying (BPSK), BPSK with Direct Sequence Spread Spectrum (PN/BPSK), and M-ary Frequency-Shift Keying with Frequency Hopping (FH/MFSK). An optical space communication link is also included. CATSS is a C & T system engineering tool used to predict and analyze system performance for different link environments. Identification of system weaknesses is achieved through evaluation of performance with varying system parameters. System tradeoffs for different values of system parameters are made based on the performance prediction.
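
    For an ideal coherent BPSK link, the kind of performance prediction described here reduces to the textbook bit-error-rate formula Pb = Q(sqrt(2 Eb/N0)); a sketch for link-budget sanity checks (the Eb/N0 values are arbitrary):

```python
# BPSK bit-error rate vs Eb/N0 using the Gaussian Q-function.
import math

def q_function(x):
    # Q(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def bpsk_ber(ebn0_db):
    ebn0 = 10.0 ** (ebn0_db / 10.0)     # dB -> linear
    return q_function(math.sqrt(2.0 * ebn0))

for ebn0_db in (0, 4, 8):
    print(ebn0_db, "dB:", bpsk_ber(ebn0_db))
```

    Spread-spectrum and FH/MFSK links add processing gain and non-coherent detection terms on top of curves like this one.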

  11. SBML-SAT: a systems biology markup language (SBML based sensitivity analysis tool

    Directory of Open Access Journals (Sweden)

    Rundell Ann E

    2008-08-01

    Background: It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results: This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and weighted averages of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion: SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
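
    Of the methods listed, the simplest to sketch is local sensitivity analysis by normalized finite differences. The logistic-growth model below is an illustrative stand-in for an SBML reaction network; it is not part of SBML-SAT.

```python
# Local (normalized) parameter sensitivities by finite differences.
import math

def logistic(t, r, K, x0=1.0):
    # Closed-form logistic growth solution.
    return K * x0 * math.exp(r * t) / (K + x0 * (math.exp(r * t) - 1.0))

def local_sensitivity(output, params, h=1e-6):
    # S_p = (p / y) * dy/dp: dimensionless, so parameters are comparable.
    base = output(**params)
    sens = {}
    for name, value in params.items():
        bumped = dict(params, **{name: value * (1.0 + h)})
        sens[name] = (output(**bumped) - base) / (value * h) * (value / base)
    return sens

s = local_sensitivity(lambda r, K: logistic(t=2.0, r=r, K=K),
                      {"r": 1.0, "K": 10.0})
print(s)  # at this early-growth snapshot the growth rate r dominates
```

    Global methods (PRCC, Sobol indices) replace the single base point with sampling over the whole parameter space.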

  12. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
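
    The steady-state half of a cascading screen can be sketched as an N-1 DC power flow check. The 3-bus system, susceptances, injections, and flow limits below are invented, and DCAT itself drives PSS/E rather than a standalone solver like this one.

```python
# N-1 contingency screening with a toy DC power flow.
import numpy as np

lines = [(0, 1, 10.0), (0, 2, 10.0), (1, 2, 5.0)]   # (from bus, to bus, susceptance)
inj = np.array([0.0, 1.5, -1.5])                    # injections; bus 0 is the slack
limit = 1.2                                          # per-line thermal flow limit

def dc_flows(active):
    # Assemble the bus susceptance matrix and solve B * theta = P
    # with the slack angle fixed at zero (classic DC power flow).
    B = np.zeros((3, 3))
    for f, t, b in active:
        B[f, f] += b; B[t, t] += b
        B[f, t] -= b; B[t, f] -= b
    theta = np.zeros(3)
    theta[1:] = np.linalg.solve(B[1:, 1:], inj[1:])
    return {(f, t): b * (theta[f] - theta[t]) for f, t, b in active}

base_overloads = [l for l, f in dc_flows(lines).items() if abs(f) > limit]
results = {}
for k, outage in enumerate(lines):
    flows = dc_flows(lines[:k] + lines[k + 1:])     # drop one line, re-solve
    results[outage[:2]] = [l for l, f in flows.items() if abs(f) > limit]

print(base_overloads)   # → []  (base case is secure)
print(results)          # every single-line outage overloads a survivor here
```

    A cascading simulation would then trip the overloaded survivors and repeat, interleaved with the dynamic and protection models the abstract describes.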

  13. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    Science.gov (United States)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is a multidisciplinary activity by nature. SAPE improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.

  14. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    Science.gov (United States)

    Samareh, Jamshid A.

    2009-01-01

    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language, a platform-independent, open-source language, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.

  15. Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool

    Science.gov (United States)

    Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark

    2011-01-01

    A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) was developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to best follow guidance commands while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with required design specs. "Tower Clearance" and "Load Relief" designs have been achieved for liftoff and max dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in the frequency-domain Monte Carlo analysis using ASAT.
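
    The stability side of such a design task can be miniaturized: given a loop transfer function, find the gains for which all closed-loop poles stay in the left half-plane. A hedged sketch on a textbook third-order plant, K/(s(s+2)(s+5)), which is a stand-in and not a launch-vehicle model:

```python
# Closed-loop stability vs loop gain via the characteristic polynomial roots.
import numpy as np

def closed_loop_stable(K):
    # 1 + K / (s (s+2)(s+5)) = 0  ->  s^3 + 7 s^2 + 10 s + K = 0
    poles = np.roots([1.0, 7.0, 10.0, K])
    return bool(np.all(poles.real < 0.0))

# Routh-Hurwitz predicts stability for 0 < K < 70; check numerically.
print(closed_loop_stable(50.0))   # → True
print(closed_loop_stable(80.0))   # → False
```

    A tool like ASAT sweeps such checks over flex-body models, flight conditions, and dispersions instead of a single gain.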

  16. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  17. A new tool for the performance analysis of massively parallel computer systems

    CERN Document Server

    Stefanek, Anton; Bradley, Jeremy; 10.4204/EPTCS.28.11

    2010-01-01

    We present a new tool, GPA, that can generate key performance measures for very large systems. Based on solving systems of ordinary differential equations (ODEs), this method of performance analysis is far more scalable than stochastic simulation. The GPA tool is the first to produce higher moment analysis from differential equation approximation, which is essential, in many cases, to obtain an accurate performance prediction. We identify so-called switch points as the source of error in the ODE approximation. We investigate the switch point behaviour in several large models and observe that as the scale of the model is increased, in general the ODE performance prediction improves in accuracy. In the case of the variance measure, we are able to justify theoretically that in the limit of model scale, the ODE approximation can be expected to tend to the actual variance of the model.
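
    The fluid/ODE idea behind such tools can be illustrated on the smallest possible population model: N independent on/off components (switch-on rate a, switch-off rate b), whose expected "on" count follows a single mean-field ODE that is far cheaper to solve than simulating the underlying chain. All names and rates below are invented.

```python
# Forward-Euler integration of the mean-field ODE dx/dt = a (N - x) - b x.

def ode_on_count(N, a, b, t_end, dt=1e-3):
    x, t = 0.0, 0.0                      # all components start "off"
    while t < t_end:
        x += dt * (a * (N - x) - b * x)  # one Euler step of the mean-field ODE
        t += dt
    return x

N, a, b = 1000, 0.3, 0.1
x_inf = ode_on_count(N, a, b, t_end=50.0)
print(round(x_inf, 1))  # → 750.0  (analytic equilibrium a*N / (a+b))
```

    Higher-moment analyses of the kind the abstract highlights extend this with additional ODEs for variances and covariances.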

  18. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    Science.gov (United States)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  19. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying...

  20. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    Science.gov (United States)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been developing data analysis tools that let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how best to store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
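Stripped to its essentials, the query being benchmarked stores each swath file's footprint as a lat/lon bounding box and returns the files whose footprint intersects a query region. The sketch below is purely illustrative (file names and boxes are invented, and a linear scan stands in for the geospatial indexes that MySQL, MongoDB or PostgreSQL would provide):

```python
# Toy footprint search: each stored Level 2 swath file is reduced to a
# bounding box (lon_min, lat_min, lon_max, lat_max); a query returns the
# files whose box intersects the requested region. Real deployments use
# database geospatial indexes instead of this linear scan.

def boxes_intersect(a, b):
    """True if two (lon_min, lat_min, lon_max, lat_max) boxes overlap."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

footprints = {                      # invented example footprints
    "swath_001.h5": (-80.0, 20.0, -60.0, 35.0),
    "swath_002.h5": (-50.0, 10.0, -30.0, 25.0),
    "swath_003.h5": (-75.0, 15.0, -55.0, 30.0),
}

def search(query_box):
    return sorted(f for f, box in footprints.items()
                  if boxes_intersect(box, query_box))

print(search((-70.0, 18.0, -65.0, 22.0)))  # query around a storm region
```

Footprint sampling accuracy matters because a bounding box that is too coarse returns files the tools then read and discard, which is exactly the bottleneck described above.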

  1. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  2. Pickering tool management system

    International Nuclear Information System (INIS)

    Tools were being deployed in the station with no process in effect to ensure that they were maintained in good repair so as to effectively support the performance of maintenance activities. Current legislation requires that all employers have a process in place to ensure that tools are maintained in a safe condition, as specified in the Ontario Health and Safety Act. The Pickering Tool Management System has been chosen as the process at Pickering N.D. to manage tools. Tools are identified by number etching and bar codes. The system is a Windows application installed on several file servers.

  3. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    Science.gov (United States)

    Csank, Jeffrey; Zinnecker, Alicia

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a controller at a single flight condition (defined by altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.

  4. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
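As a hedged illustration of what such closed-loop transient analysis produces (this is not TTECTrA itself, and every gain and time constant below is invented), the sketch regulates a first-order "engine" with a PI thrust controller plus a command rate limiter standing in for a transient limiter, then reports the final value and peak of the response - dynamic metrics that steady-state data cannot provide:

```python
# Illustrative closed-loop transient sketch: PI control of a first-order
# engine lag, with a rate limiter on the command standing in for a
# surge-protection transient limiter. All parameters are invented.

def closed_loop_response(setpoint=1.0, tau=0.5, kp=2.0, ki=1.5,
                         rate_limit=2.0, dt=1e-3, t_end=12.0):
    thrust, integ, cmd_prev = 0.0, 0.0, 0.0
    history = []
    for _ in range(int(t_end / dt)):
        err = setpoint - thrust
        integ += err * dt
        cmd = kp * err + ki * integ
        # "transient limiter": bound how fast the command may change
        step = rate_limit * dt
        cmd = max(cmd_prev - step, min(cmd_prev + step, cmd))
        cmd_prev = cmd
        thrust += dt * (cmd - thrust) / tau  # first-order engine lag
        history.append(thrust)
    return history

h = closed_loop_response()
print(f"final={h[-1]:.3f} peak={max(h):.3f}")
```

Tightening `rate_limit` trades response speed (performance) for a gentler transient (operability), which is exactly the kind of trade-off the abstract describes.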

  5. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one o

  6. The integrated microbial genomes (IMG) system in 2007: datacontent and analysis tool extensions

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M.; Szeto, Ernest; Palaniappan, Krishna; Grechkin, Yuri; Chu, Ken; Chen, I-Min A.; Dubchak, Inna; Anderson, Iain; Lykidis, Athanasios; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2007-08-01

    The Integrated Microbial Genomes (IMG) system is a data management, analysis and annotation platform for all publicly available genomes. IMG contains both draft and complete JGI microbial genomes integrated with all other publicly available genomes from all three domains of life, together with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and annotating genomes, genes and functions, individually or in a comparative context. Since its first release in 2005, IMG's data content and analytical capabilities have been constantly expanded through quarterly releases. IMG is provided by the DOE-Joint Genome Institute (JGI) and is available from http://img.jgi.doe.gov.

  7. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-27

    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  8. Instantaneous Purified Orbit: A New Tool for Analysis of Nonstationary Vibration of Rotor System

    Directory of Open Access Journals (Sweden)

    Shi Dongfeng

    2001-01-01

    Full Text Available In some circumstances, vibration signals of large rotating machinery possess time-varying characteristics to some extent. Traditional diagnosis methods, such as the FFT spectrum and the orbit diagram, face a huge challenge in dealing with this problem. This work studies the four intrinsic drawbacks of conventional vibration signal processing methods and presents the instantaneous purified orbit (IPO), built on the improved Fourier spectrum (IFS), to analyze nonstationary vibration. By integrating the benefits of the short-period Fourier transform (SPFT) and the regular holospectrum, this method can intuitively reflect the vibration characteristics of a rotor system by means of parameter analysis of the corresponding frequency ellipses. Practical examples, such as transient vibration in run-up stages and the bistable condition of a rotor, show that IPO is a powerful tool for diagnosis and analysis of the vibration behavior of rotor systems.
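A basic ingredient behind holospectrum- and IPO-style analysis is estimating, for one frequency component, the amplitude and phase seen by each of two orthogonal displacement probes; the pair (Ax, px), (Ay, py) then defines the frequency ellipse whose shape parameters are tracked. A hedged sketch with synthetic probe signals (all amplitudes, phases and sample counts invented):

```python
# Extract amplitude and phase of one frequency bin from each of two
# orthogonal probe signals via a plain DFT; these four numbers define
# the frequency ellipse analyzed by holospectrum/IPO-type methods.

import cmath
import math

def component(signal, k, n):
    """Amplitude and phase of DFT bin k over n samples."""
    c = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
            for t in range(n)) * 2 / n
    return abs(c), cmath.phase(c)

n, k = 256, 8  # k whole cycles over the sampling window
x = [1.5 * math.cos(2 * math.pi * k * t / n + 0.3) for t in range(n)]
y = [0.7 * math.cos(2 * math.pi * k * t / n - 1.1) for t in range(n)]

(ax, px), (ay, py) = component(x, k, n), component(y, k, n)
print(round(ax, 3), round(px, 3), round(ay, 3), round(py, 3))
```

IPO's refinement is making this estimate robust over short windows so the ellipse parameters can be followed through nonstationary events such as run-up.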

  9. Tav4SB: integrating tools for analysis of kinetic models of biological systems

    Directory of Open Access Journals (Sweden)

    Rybiński Mikołaj

    2012-04-01

    Full Text Available Abstract Background Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. Results The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project we apply multi-parameter sensitivity analysis. To visualize the results of model analysis a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project’s Web page: http://bioputer.mimuw.edu.pl/tav4sb/. Conclusions The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as accessibility and usability of remote services.

  10. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    Directory of Open Access Journals (Sweden)

    Mari-Carmen Mochón

    2016-03-01

    Full Text Available After the financial crisis that began in 2008, international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information needed to identify the potential risks hidden behind the market. Big Data solutions currently applied to Social Network Analysis (SNA) can be successfully applied to systemic risk supervision. This case study proposes how the relations established between financial market participants could be analyzed in order to identify risks of propagation and market behavior, without the need for expensive and demanding technical architectures.
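The network view underlying such SNA-based supervision can be sketched in a few lines: treat reported trades as edges of a directed exposure graph between participants and ask which institutions an initial default could reach through counterparty links. The participants and exposures below are invented for illustration:

```python
# Toy contagion analysis over an invented counterparty network.
# Each reported exposure (creditor, debtor) becomes a directed edge:
# a debtor's default propagates losses toward its creditors.

from collections import defaultdict, deque

exposures = [                      # (creditor, debtor), illustrative
    ("BankA", "FundX"), ("BankB", "FundX"),
    ("FundX", "DealerY"), ("DealerY", "BankC"),
]

graph = defaultdict(list)
for creditor, debtor in exposures:
    graph[debtor].append(creditor)

def contagion_set(defaulter):
    """All participants reachable from an initial default (BFS)."""
    seen, queue = set(), deque([defaulter])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(contagion_set("DealerY"))  # who is exposed if DealerY defaults
```

On real trade repository volumes this reachability question is what Big Data graph platforms compute at scale, typically weighted by exposure size.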

  11. Systems Analysis - a new paradigm and decision support tools for the water framework directive

    Science.gov (United States)

    Bruen, M.

    2008-05-01

    In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness.
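The AHP step mentioned above reduces, at its core, to deriving criterion weights from a pairwise comparison matrix. A hedged sketch using the common geometric-mean approximation of the principal eigenvector (the 3x3 matrix is illustrative, not from the paper):

```python
# AHP weight derivation sketch: a[i][j] states how much more important
# criterion i is than j on the Saaty 1-9 scale; weights come from the
# row geometric means, normalized to sum to one.

import math

a = [[1.0, 3.0, 5.0],              # illustrative comparisons
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]

geo = [math.prod(row) ** (1 / len(row)) for row in a]
total = sum(geo)
weights = [g / total for g in geo]
print([round(w, 3) for w in weights])
```

A full AHP implementation would also compute the consistency ratio to check that the stakeholder's comparisons are not self-contradictory, which is what makes the method usable by non-specialists in a DSS.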

  12. Systems Analysis – a new paradigm and decision support tools for the water framework directive

    Directory of Open Access Journals (Sweden)

    M. Bruen

    2008-05-01

    Full Text Available In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness.

  13. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  14. Design and analysis of a reconfigurable discrete pin tooling system for molding of three-dimensional free-form objects

    OpenAIRE

    KOÇ, Bahattin; Koc, Bahattin; Thangaswamy, Sridhar

    2010-01-01

    This paper presents the design and analysis of a new reconfigurable tooling system for the fabrication of three-dimensional (3D) free-form objects. The proposed reconfigurable tooling system comprises a set of matrices of closely stacked discrete elements (i.e., pins) arranged to form a cavity in which a free-form object can be molded. By reconfiguring the pins, a single tool can be used in place of multiple tools to produce different parts in much less time and at much lower cost. Th...

  15. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities.

  16. LCA of waste management systems: Development of tools for modeling and uncertainty analysis

    DEFF Research Database (Denmark)

    Clavreul, Julie

    Since the late 1990s, life cycle assessment (LCA) has been increasingly applied to waste management to quantify direct, indirect and avoided impacts from various treatment options. The construction of inventories for waste management systems differs from classical product-LCAs in that (1) these systems usually handle a heterogeneous mix of different waste fractions, (2) optimal treatments differ for these various fractions due to their chemical and physical properties and (3) emissions from final disposal places may occur over a very long time, depending on technology choice, and thus they have... ...and databases and the application of uncertainty analysis methods. The major outcome of this thesis was the development of a new LCA model, called EASETECH, building on the experience with previous LCA-tools, in particular the EASEWASTE model. Before the actual implementation phase, a design phase involved... ...for economic analysis, an improved graphical display of results, the design of new process templates, the provision of an external editor of process templates and the development of new functionalities for the impact assessment phase.

  17. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    Science.gov (United States)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994 the Data Centre of the Spanish Oceanographic Institute has developed systems for archiving and quality control of oceanographic data. The work started in the frame of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean Data Centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now covers not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea-level observations. New powerful routines for analysis and graphic visualization were added. Data presented originally in ASCII format were recently organized in an open-source MySQL database. Nowadays the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS Format and Quality Control - QCDAMAR: Quality Control of Marine Data, the main set of tools for working with data presented as text files. It includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values, ...) and input/output filters. - QCMareas: a set of procedures for the quality control of tide gauge data according to the standard international Sea Level Observing System. These procedures include checking for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: A relational database (MySQL) designed to
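Two of the automated checks listed above are easy to sketch: a spike test on a vertical profile and a ship-speed plausibility check between consecutive stations. The thresholds and data below are illustrative, not the MEDATLAS/SeaDataNet values:

```python
# Hedged sketch of two oceanographic QC checks. Thresholds invented.

import math

def spike_flags(values, threshold=2.0):
    """Flag points that deviate strongly from their two neighbours."""
    flags = [False] * len(values)
    for i in range(1, len(values) - 1):
        spike = abs(values[i] - (values[i - 1] + values[i + 1]) / 2) \
                - abs(values[i + 1] - values[i - 1]) / 2
        flags[i] = spike > threshold
    return flags

def ship_speed_ok(lat1, lon1, lat2, lon2, hours, max_knots=15.0):
    """Crude flat-earth distance in nautical miles vs. elapsed time."""
    dlat = (lat2 - lat1) * 60.0
    dlon = (lon2 - lon1) * 60.0 * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot(dlat, dlon) / hours <= max_knots

temps = [15.0, 14.8, 14.7, 25.0, 14.5, 14.4]   # 25.0 is a clear spike
print(spike_flags(temps))
print(ship_speed_ok(36.0, -6.0, 36.5, -6.0, 3.0))  # 30 nm in 3 h
```

A station whose implied ship speed exceeds the limit usually indicates a mistyped position or date, which is why both fields are checked together.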

  18. Analysis tools for simulation of hybrid systems; Herramientas de analisis para simulacion de sistemas hibridos

    Energy Technology Data Exchange (ETDEWEB)

    Guillen S, Omar; Mejia N, Fortino [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2005-07-01

    In order to facilitate and simplify the development and analysis of a hybrid system with respect to its design, construction, operation and maintenance, it is best to simulate the system by means of software, which yields a significant reduction in investment costs. Given the mix of electrical generation technologies involved in a hybrid system, it is very important to have a tool integrated with specialized calculation packages (software) that allows carrying out simulations of the operational behaviour of these systems. In addition to the above, one must consider the operating characteristics, the facilities offered to the user, the clarity of the obtained results and the possibility of their validation against prototypes deployed in the field. Equally, it is necessary to identify the tasks involved in relation to the installation site of this electrification technology. At present, hybrid systems technology is still at a development stage internationally, and important limitations exist as far as the availability of methodologies and engineering tools for the optimum design of these systems. This paper is intended to contribute to the advance of the technology and to provide in-house tools to solve the described series of problems. This article describes the activities with the most impact on the design and development of hybrid systems, as well as the identification of variables, basic characteristics and forms of validation of tools in the integration of a methodology for the simulation of these systems, facilitating their design and development.
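The core of such a hybrid-system simulation is an hourly energy balance: renewable generation first serves the load, surplus charges a battery, and deficits draw on the battery and finally on a diesel backup. The sketch below uses invented figures purely to illustrate the bookkeeping such a tool automates:

```python
# Toy hourly energy balance for a PV + battery + diesel hybrid system.
# All capacities and profiles are invented for illustration.

def simulate_day(pv_kw, load_kw, battery_kwh=5.0, battery_max=10.0):
    diesel_kwh = 0.0
    for pv, load in zip(pv_kw, load_kw):   # one entry per hour
        net = pv - load
        if net >= 0:
            battery_kwh = min(battery_max, battery_kwh + net)
        else:
            draw = min(battery_kwh, -net)
            battery_kwh -= draw
            diesel_kwh += (-net) - draw    # unmet demand -> diesel
    return battery_kwh, diesel_kwh

pv   = [0, 0, 1, 3, 5, 5, 4, 2, 0, 0, 0, 0]
load = [1, 1, 1, 2, 2, 2, 2, 2, 3, 3, 2, 1]
soc, diesel = simulate_day(pv, load)
print(round(soc, 1), round(diesel, 1))
```

Running such a balance over a full year of site data is what lets the designer size the battery and diesel genset before any field deployment, which is the cost saving the abstract refers to.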

  19. Experimental Analysis of Browser based Novel Anti-Phishing System Tool at Educational Level

    Directory of Open Access Journals (Sweden)

    Rajendra Gupta

    2016-02-01

    Full Text Available In a phishing attack, the user submits confidential information to mimic websites and suffers financial loss, so the user should be warned immediately about the website being visited. According to the Third Quarter Phishing Activity Trends Report, 55,282 new phishing websites were detected in the month of July 2014. To address the phishing problem, a browser-based add-on system may be one of the best solutions for informing the user about the website type. In this paper, a novel browser-based add-on system is proposed and its performance is compared with existing anti-phishing tools. The proposed anti-phishing tool, ‘ePhish’, is compared with existing browser-based anti-phishing toolbars. All the anti-phishing tools were installed on computer systems at an autonomous college to check their performance. The obtained results show that if the task is divided among a group of systems, it can give better results. For different phishing features, the add-on system tool shows around 97 percent success under different case conditions. The current study would be very helpful in countering phishing attacks, and the proposed system is able to protect the user from them. Since the system tool is capable of handling and managing phishing website details, it would be helpful for identifying the category of websites.
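Anti-phishing add-ons of the kind described typically score a URL against a list of lexical features before warning the user. The sketch below shows a few such heuristics; the features and thresholds are illustrative (a real tool like the one evaluated combines many more signals: blacklists, page content, certificate data):

```python
# Illustrative URL feature checks of the kind anti-phishing toolbars
# apply. Feature names and thresholds are invented for this sketch.

import re

def suspicious_features(url):
    feats = []
    if re.match(r"https?://\d{1,3}(\.\d{1,3}){3}", url):
        feats.append("ip_address_host")   # raw IP instead of a domain
    if "@" in url:
        feats.append("at_symbol")         # browser ignores text before @
    if len(url) > 75:
        feats.append("long_url")
    if url.count("-") > 3:
        feats.append("many_hyphens")
    if not url.startswith("https://"):
        feats.append("no_https")
    return feats

print(suspicious_features("http://192.168.10.5/secure-login"))
print(suspicious_features("https://example.com/account"))
```

Distributing these per-URL checks across a group of systems, as the study does, parallelizes naturally because each URL is scored independently.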

  20. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard;

    2005-01-01

    ...workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even to systems with cyclic dependencies. Despite the virtues of the current tool, improvements and extensions still remain, which are the focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models; both depend on the possibility of accounting for propagation delay. Since the current version...

  1. Genetic Tools for the Analysis of Drosophila Stomatogastric Nervous System Development.

    Science.gov (United States)

    Hernández, Karla; Myers, Logan G; Bowser, Micah; Kidd, Thomas

    2015-01-01

    The Drosophila stomatogastric nervous system (SNS) is a compact collection of neurons that arises from the migration of neural precursors. Here we describe genetic tools allowing functional analysis of the SNS during the migratory phase of development. We constructed GAL4 lines driven by fragments of the Ret promoter, which yielded expression in a subset of migrating neural SNS precursors and also included a distinct set of midgut associated cells. Screening of additional GAL4 lines driven by fragments of the Gfrl/Munin, forkhead, twist and goosecoid (Gsc) promoters identified a Gsc fragment with expression from initial selection of SNS precursors until the end of embryogenesis. Inhibition of EGFR signaling using three identified lines disrupted the correct patterning of the frontal and recurrent nerves. To manipulate the environment traveled by SNS precursors, a FasII-GAL4 line with strong expression throughout the entire intestinal tract was identified. The transgenic lines described offer the ability to specifically manipulate the migration of SNS precursors and will allow the modeling and in-depth analysis of neuronal migration in ENS disorders such as Hirschsprung's disease. PMID:26053861

  2. Genetic Tools for the Analysis of Drosophila Stomatogastric Nervous System Development.

    Directory of Open Access Journals (Sweden)

    Karla Hernández

    Full Text Available The Drosophila stomatogastric nervous system (SNS) is a compact collection of neurons that arises from the migration of neural precursors. Here we describe genetic tools allowing functional analysis of the SNS during the migratory phase of development. We constructed GAL4 lines driven by fragments of the Ret promoter, which yielded expression in a subset of migrating neural SNS precursors and also included a distinct set of midgut associated cells. Screening of additional GAL4 lines driven by fragments of the Gfrl/Munin, forkhead, twist and goosecoid (Gsc) promoters identified a Gsc fragment with expression from initial selection of SNS precursors until the end of embryogenesis. Inhibition of EGFR signaling using three identified lines disrupted the correct patterning of the frontal and recurrent nerves. To manipulate the environment traveled by SNS precursors, a FasII-GAL4 line with strong expression throughout the entire intestinal tract was identified. The transgenic lines described offer the ability to specifically manipulate the migration of SNS precursors and will allow the modeling and in-depth analysis of neuronal migration in ENS disorders such as Hirschsprung's disease.

  3. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  4. MARSTHERM: A Web-based System Providing Thermophysical Analysis Tools for Mars Research

    Science.gov (United States)

    Putzig, N. E.; Barratt, E. M.; Mellon, M. T.; Michaels, T. I.

    2013-12-01

    We introduce MARSTHERM, a web-based system that will allow researchers access to a standard numerical thermal model of the Martian near-surface and atmosphere. In addition, the system will provide tools for the derivation, mapping, and analysis of apparent thermal inertia from temperature observations by the Mars Global Surveyor Thermal Emission Spectrometer (TES) and the Mars Odyssey Thermal Emission Imaging System (THEMIS). Adjustable parameters for the thermal model include thermal inertia, albedo, surface pressure, surface emissivity, atmospheric dust opacity, latitude, surface slope angle and azimuth, season (solar longitude), and time steps for calculations and output. The model computes diurnal surface and brightness temperatures for either a single day or a full Mars year. Output options include text files and plots of seasonal and diurnal surface, brightness, and atmospheric temperatures. The tools for the derivation and mapping of apparent thermal inertia from spacecraft data are project-based, wherein the user provides an area of interest (AOI) by specifying latitude and longitude ranges. The system will then extract results within the AOI from prior global mapping of elevation (from the Mars Orbiter Laser Altimeter, for calculating surface pressure), TES annual albedo, and TES seasonal and annual-mean 2AM and 2PM apparent thermal inertia (Putzig and Mellon, 2007, Icarus 191, 68-94). In addition, a history of TES dust opacity within the AOI is computed. For each project, users may then provide a list of THEMIS images to process for apparent thermal inertia, optionally overriding the TES-derived dust opacity with a fixed value. Output from the THEMIS derivation process includes thumbnail and context images, GeoTIFF raster data, and HDF5 files containing arrays of input and output data (radiance, brightness temperature, apparent thermal inertia, elevation, quality flag, latitude, and longitude) and ancillary information. 
    As a demonstration of capabilities…
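
    Thermal inertia, the central quantity the system derives and maps, is defined as I = √(kρc). A minimal sketch of that definition follows; the function name and the regolith material values are illustrative assumptions, not MARSTHERM inputs or database values:

```python
import math

def thermal_inertia(conductivity, density, heat_capacity):
    """Thermal inertia I = sqrt(k * rho * c), in J m^-2 K^-1 s^-1/2."""
    return math.sqrt(conductivity * density * heat_capacity)

# Illustrative values for a fine, dust-like regolith (assumed, not from
# the MARSTHERM database): k = 0.01 W/(m K), rho = 1500 kg/m^3,
# c = 600 J/(kg K)
I = thermal_inertia(0.01, 1500.0, 600.0)
```

    Low values of I correspond to loosely packed fines; higher values indicate rocks, bedrock, or duricrust.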

  5. A Conceptual Wing Flutter Analysis Tool for Systems Analysis and Parametric Design Study

    Science.gov (United States)

    Mukhopadhyay, Vivek

    2003-01-01

    An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate flutter instability boundaries of a typical wing, when detailed structural and aerodynamic data are not available. Effects of change in key flutter parameters can also be estimated in order to guide the conceptual design. This user-friendly software was developed using MathCad and Matlab codes. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on wing torsion stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch-inertia radius of gyration. These parametric plots were compiled in a Chance-Vought Corporation report from a database of past experiments and wind tunnel test results. An example was presented for conceptual flutter analysis of the outer wing of a Blended-Wing-Body aircraft.
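
    The Regier and Flutter numbers are nondimensional speed groupings. As a hedged illustration only, the classic flutter speed index V/(b·ω_α·√μ) from the aeroelasticity literature can be computed as below; the exact normalization in the Chance-Vought report may differ, and all names and values here are assumptions:

```python
import math

def flutter_speed_index(velocity, semichord, pitch_freq, mass_ratio):
    # Classic nondimensional flutter speed index V / (b * omega_alpha * sqrt(mu)).
    # The Regier and Flutter numbers of the report are normalized along
    # broadly similar lines, with additional geometric factors.
    return velocity / (semichord * pitch_freq * math.sqrt(mass_ratio))

# Illustrative wing: V = 200 m/s, semichord b = 1 m,
# pitch frequency omega_alpha = 50 rad/s, mass ratio mu = 16
vf = flutter_speed_index(velocity=200.0, semichord=1.0,
                         pitch_freq=50.0, mass_ratio=16.0)
```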

  6. Exploration tools in formal concept analysis

    OpenAIRE

    Stumme, Gerd

    1996-01-01

    The development of conceptual knowledge systems specifically requires knowledge acquisition tools within the framework of formal concept analysis. In this paper, the existing tools are presented, and further developments are discussed.

  7. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
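
    The frequency response measure at the heart of BAL-003-1 is, in essence, the change in power over the change in frequency, expressed in MW per 0.1 Hz. A simplified sketch follows; the standard averages windows of values before and after the event (the "A" and "B" values), whereas single points and all numbers here are illustrative assumptions:

```python
def frequency_response(p_pre_mw, p_post_mw, f_pre_hz, f_post_hz):
    """Frequency response in MW per 0.1 Hz, the convention used by the
    NERC BAL-003-1 Frequency Response Measure. The actual standard uses
    averaged pre- and post-event windows; single points are used here
    only for illustration."""
    delta_p = p_post_mw - p_pre_mw
    delta_f = f_post_hz - f_pre_hz
    return delta_p / (delta_f * 10.0)  # per 0.1 Hz

# Hypothetical under-frequency event: interchange changes by -150 MW
# while frequency drops from 60.000 Hz to 59.950 Hz.
fr = frequency_response(0.0, -150.0, 60.000, 59.950)
```

    A positive result indicates the balancing authority arrested the frequency decline, which is the behavior FRAT baselines across its event database.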

  8. Neutron multiplicity analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Scott L [Los Alamos National Laboratory

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass, a calculational tool, EXCOM, has been produced using VBA within Excel. This…
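
    The background subtraction and passive calibration curve mentioned above can be sketched in their simplest form as follows; the linear calibration and every constant below are hypothetical illustrations, not EXCOM's actual models (real calibration curves are typically nonlinear, and deadtime correction is applied first):

```python
def net_rate(gross_cps, background_cps):
    # Background subtraction, applied after deadtime correction.
    return gross_cps - background_cps

def pu240_effective_mass(net_doubles_cps, cal_slope_cps_per_g):
    # A passive calibration curve in its simplest (linear) form:
    # doubles rate proportional to 240Pu-effective mass. The slope is
    # a hypothetical detector-specific constant, not an EXCOM value.
    return net_doubles_cps / cal_slope_cps_per_g

# Illustrative measurement: 520 cps gross doubles, 20 cps background,
# detector calibration slope 25 cps per gram 240Pu-effective.
doubles = net_rate(gross_cps=520.0, background_cps=20.0)
mass_g = pu240_effective_mass(doubles, cal_slope_cps_per_g=25.0)
```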

  9. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel;

    2014-01-01

    As governments, citizens and organizations have moved online, there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks… and analyze web data in the process of investigating substantive questions.

  10. A Collaborative Analysis Tool for Thermal Protection Systems for Single Stage to Orbit Launch Vehicles

    Science.gov (United States)

    Alexander, Reginald; Stanley, Thomas Troy

    2001-01-01

    Presented is a design tool and process that connects several disciplines which are needed in the complex and integrated design of high-performance reusable single stage to orbit (SSTO) vehicles. Every system is linked to all other systems, as is the case with SSTO vehicles with air-breathing propulsion, which is currently being studied by the National Aeronautics and Space Administration (NASA). In particular, the thermal protection system (TPS) is linked directly to almost every major system. The propulsion system pushes the vehicle to velocities on the order of 15 times the speed of sound in the atmosphere before pulling up to go to orbit, which results in high temperatures on the external surfaces of the vehicle. Thermal protection systems to maintain the structural integrity of the vehicle must be able to mitigate the heat transfer to the structure and be lightweight. Herein lies the interdependency: as the vehicle's speed increases, the TPS requirements increase, and as TPS masses increase, the effect on the propulsion system and all other systems is compounded. To adequately calculate the TPS mass of this type of vehicle, several engineering disciplines and analytical tools must be used, preferably in an environment in which data are easily transferred and multiple iterations are easily facilitated.

  11. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  12. Environmental control and life support system analysis tools for the Space Station era

    Science.gov (United States)

    Blakely, R. L.; Rowell, L. F.

    1984-01-01

    This paper describes the concept of a developing emulation, simulation, sizing, and technology assessment program (ESSTAP) which can be used effectively for the various functional disciplines (structures, power, ECLSS, etc.) beginning with the initial system selection and conceptual design processes and continuing on through the mission operation and growth phases of the Space Station for the purpose of minimizing overall program costs. It will discuss the basic requirements for these tools, as currently envisioned for the Environmental Control and Life Support System (ECLSS), identifying their intended and potential uses and applications, and present examples and status of several representative tools. The development and applications of a Space Station Atmospheric Revitalization Subsystem (ARS) demonstration model to be used for concent verification will also be discussed.

  13. Social Network Analysis and Big Data tools applied to the Systemic Risk supervision

    OpenAIRE

    Mari-Carmen Mochón

    2016-01-01

    After the financial crisis initiated in 2008, international market supervisors of the G20 agreed to reinforce their systemic risk supervisory duties. For this purpose, several regulatory reporting obligations were imposed on the market participants. As a consequence, millions of trade details are now available to National Competent Authorities on a daily basis. Traditional monitoring tools may not be capable of analyzing such volumes of data and extracting the relevant information, in order to…

  14. Energy-Sustainable Framework and Performance Analysis of Power Scheme for Operating Systems: A Tool

    Directory of Open Access Journals (Sweden)

    G. Singh

    2012-12-01

    Full Text Available Recently, Information and Communications Technology (ICT) devices have become more user-friendly, which has raised the problem of power dissipation across the globe, and computer systems are among them. This emerging issue of power dissipation has imposed a very significant burden on system and software design. The concept of ‘green computing’ is gaining popularity and is being considered one of the most promising technologies by designers in the Information Technology (IT) industry, as it demonstrates an environmentally responsible way to reduce power consumption and maximize energy efficiency. In this paper, we propose an energy-sustainable framework of power schemes for operating systems to reduce the power consumed by computer systems, and we present a Green Power tool (GP tool). This tool is designed using JAVA technology and requires minimal configuration to make decisions about reducing power consumption. The proposed Swift mode algorithm allows users to input a working time of their choice; after that time ends, the algorithm starts detecting human activity on the computer system. We also compared the Swift mode algorithm with the existing power scheme in the operating system, and it provides up to 66% power saving. Finally, we profiled the proposed framework to analyze memory and Central Processing Unit (CPU) performance, which demonstrated that there is no memory leakage or CPU degradation problem and that the framework’s behavior remains constant under various memory and CPU overhead scenarios. The proposed framework requires 3–7 MB of memory space during its execution.
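
    The Swift-mode policy described above (run untouched for a user-chosen working time, then suspend once input activity stops) can be sketched as follows. The function and its injected callables are illustrative assumptions, not the GP tool's API; injecting the clock queries and the suspend action keeps the policy testable:

```python
import time

def swift_mode(work_seconds, idle_limit, get_last_input_time, sleep_system):
    """Sketch of a Swift-mode-style policy: wait out the user-chosen
    working time, then poll for inactivity and power down once no input
    has arrived for `idle_limit` seconds."""
    deadline = time.monotonic() + work_seconds
    while time.monotonic() < deadline:
        time.sleep(0.01)  # working period: leave the machine alone
    while True:
        idle = time.monotonic() - get_last_input_time()
        if idle >= idle_limit:
            sleep_system()  # e.g. request OS suspend
            return "suspended"
        time.sleep(0.01)  # poll again shortly

# Usage example with stubbed callables: last input was 10 s ago,
# so a 5 s idle limit triggers suspension immediately.
events = []
state = swift_mode(0.0, 5.0,
                   get_last_input_time=lambda: time.monotonic() - 10.0,
                   sleep_system=lambda: events.append("sleep"))
```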

  15. Motion analysis systems as optimization training tools in combat sports and martial arts

    Directory of Open Access Journals (Sweden)

    Ewa Polak

    2016-01-01

    Full Text Available Introduction: Over the past years, a few review papers about the possibilities of using motion analysis systems in sport have been published, but there are no articles that discuss this problem in the field of combat sports and martial arts. Aim: This study presents the diversity of contemporary motion analysis systems, both those that are used in scientific research and those that can be applied in the daily work of coaches and athletes in combat sports and martial arts. An additional aim is the indication of example applications in scientific research and the range of applications in optimizing the training process. It presents a brief description of each type of system currently used in sport, specific examples of systems, and the main advantages and disadvantages of using them. The presentation and discussion take place in the following sections: motion analysis utility for combat sports and martial arts, systems using digital video, and systems using markers, sensors or transmitters. Conclusions: Not all types of motion analysis systems used in sport are suitable for combat sports and martial arts. Scientific studies conducted so far have shown the usefulness of video-based, optical and electromechanical systems. The use of research results obtained with complex motion analysis systems, or with simple systems offering local application and immediate visualization, is important for the preparation of training and its optimization. It may lead to technical and tactical improvement in athletes as well as the prevention of injuries in combat sports and martial arts.

  16. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
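
    The simulation module's core loop, executing time-stamped events in order until the queue empties, is the standard discrete-event kernel. A minimal sketch of that kernel (not the patented tool's implementation; the tank-filling example is hypothetical) shows how continuous behavior is defined discretely via time delays and effects:

```python
import heapq

class DiscreteEventSimulator:
    """Minimal event-queue kernel: events are (time, action) pairs
    executed in time order until the queue is emptied."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker keeps equal-time events in order

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.clock, _, action = heapq.heappop(self._queue)
            action(self)

# A continuous behavior defined discretely: a process that re-invokes
# itself with a fixed time delay until a stopping condition holds.
level = []
def fill(sim):
    level.append(sim.clock)
    if sim.clock < 3.0:
        sim.schedule(1.0, fill)

sim = DiscreteEventSimulator()
sim.schedule(1.0, fill)
sim.run()
```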

  17. Modeling and control system design and analysis tools for flexible structures

    Science.gov (United States)

    Anissipour, Amir A.; Benson, Russell A.; Coleman, Edward E.

    1989-01-01

    Described here are Boeing software tools used for the development of control laws of flexible structures. The Boeing Company has developed a software tool called Modern Control Software Package (MPAC). MPAC provides the environment necessary for linear model development, analysis, and controller design for large models of flexible structures. There are two features of MPAC which are particularly appropriate for use with large models: (1) numerical accuracy and (2) label-driven nature. With the first feature MPAC uses double precision arithmetic for all numerical operations and relies on EISPAC and LINPACK for the numerical foundation. With the second feature, all MPAC model inputs, outputs, and states are referenced by user-defined labels. This feature allows model modification while maintaining the same state, input, and output names. In addition, there is no need for the user to keep track of a model variable's matrix row and colunm locations. There is a wide range of model manipulation, analysis, and design features within the numerically robust and flexible environment provided by MPAC. Models can be built or modified using either state space or transfer function representations. Existing models can be combined via parallel, series, and feedback connections; and loops of a closed-loop model may be broken for analysis.
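
    The series, parallel, and feedback combinations MPAC offers reduce to standard state-space block operations. A sketch of the series connection follows; the function is a textbook illustration under assumed (A, B, C, D) tuples, not MPAC's label-driven interface:

```python
import numpy as np

def series(sys1, sys2):
    """Series connection sys1 -> sys2 of state-space models (A, B, C, D):
    the combined state is [x1; x2], with sys1's output driving sys2."""
    A1, B1, C1, D1 = sys1
    A2, B2, C2, D2 = sys2
    n1, n2 = A1.shape[0], A2.shape[0]
    A = np.block([[A1, np.zeros((n1, n2))],
                  [B2 @ C1, A2]])
    B = np.vstack([B1, B2 @ D1])
    C = np.hstack([D2 @ C1, C2])
    D = D2 @ D1
    return A, B, C, D

# Usage example: two first-order SISO systems 1/(s+1) and 1/(s+2).
g1 = (np.array([[-1.0]]), np.array([[1.0]]), np.array([[1.0]]), np.array([[0.0]]))
g2 = (np.array([[-2.0]]), np.array([[1.0]]), np.array([[1.0]]), np.array([[0.0]]))
A, B, C, D = series(g1, g2)
```

    MPAC's label-driven design means users never track these row/column positions by hand; the sketch shows what happens underneath.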

  18. Bifurcation Tools for Flight Dynamics Analysis and Control System Design Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Modern bifurcation analysis methods have been proposed for investigating flight dynamics and control system design in highly nonlinear regimes and also for the...

  19. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    Science.gov (United States)

    Kinter, James L., III

    1993-01-01

    We propose to develop and enhance a workstation based grid analysis and display software system for Earth science dataset browsing, sampling and manipulation. The system will be coupled to a supercomputer in a distributed computing environment for near real-time interaction between scientists and computational results.

  20. Integration of Systems Network (SysNet) tools for regional land use scenario analysis in Asia

    NARCIS (Netherlands)

    Roetter, R.P.; Hoanh, C.T.; Laborte, A.G.; Keulen, van H.; Ittersum, van M.K.; Dreiser, C.; Diepen, van C.A.; Ridder, de N.; Laar, van H.H.

    2005-01-01

    This paper introduces the approach of the Systems research Network (SysNet) for land use planning in tropical Asia with a focus on its main scientific-technical output: the development of the land use planning and analysis system (LUPAS) and its component models. These include crop simulation models…

  1. The NEPLAN software package a universal tool for electric power systems analysis

    CERN Document Server

    Kahle, K

    2002-01-01

    The NEPLAN software package has been used by CERN's Electric Power Systems Group since 1997. The software is designed for the calculation of short-circuit currents, load flow, motor start, dynamic stability, harmonic analysis and harmonic filter design. This paper describes the main features of the software package and their application to CERN's electric power systems. The implemented models of CERN's power systems are described in detail. Particular focus is given to fault calculations, harmonic analysis and filter design. Based on this software package and the CERN power network model, several recommendations are given.
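
    Short-circuit calculations in such packages typically follow IEC 60909, whose headline result is the initial symmetrical three-phase short-circuit current I_k'' = c·U_n/(√3·Z_k). A minimal sketch with illustrative values (not CERN network data, and far simpler than NEPLAN's full machinery):

```python
import math

def initial_sym_short_circuit_current(c_factor, u_n_kv, z_k_ohm):
    """Initial symmetrical three-phase short-circuit current in kA,
    I_k'' = c * U_n / (sqrt(3) * Z_k), the IEC 60909 headline formula.
    c_factor is the voltage factor, U_n the nominal voltage in kV,
    Z_k the magnitude of the equivalent impedance at the fault in ohms."""
    return c_factor * u_n_kv / (math.sqrt(3) * z_k_ohm)

# Illustrative 20 kV network: voltage factor c = 1.1,
# equivalent impedance 2 ohm at the fault location.
ik = initial_sym_short_circuit_current(1.1, 20.0, 2.0)
```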

  2. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.;

    2016-01-01

    Economic analysis, allied to process systems engineering tools, can provide useful insights about process techno-economic feasibility. More interestingly, rather than being used to evaluate specific process conditions, this techno-economic analysis can be turned upside down to achieve target values for the main process metrics, providing feedback to the research and development team and setting goals for experimental efforts. The present study proposes a methodology for performing such a "retro" techno-economic analysis. It consists of choosing the most important variables of the process and finding…

  3. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.
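
    Rotating the point cloud to an arbitrary viewing angle amounts to a pair of elementary rotations. A sketch follows; JRAT's actual transformation is not documented in this abstract, so the function, the angle convention, and the sample point are all assumptions:

```python
import math

def rotate_view(points, azimuth_deg, elevation_deg):
    """Rotate 3D points (x, y, z) to a viewing angle: first about the
    z axis (azimuth), then about the resulting x axis (elevation)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    out = []
    for x, y, z in points:
        x1 = x * math.cos(az) - y * math.sin(az)
        y1 = x * math.sin(az) + y * math.cos(az)
        y2 = y1 * math.cos(el) - z * math.sin(el)
        z2 = y1 * math.sin(el) + z * math.cos(el)
        out.append((x1, y2, z2))
    return out

# Usage example: a point on the x axis viewed after a 90-degree azimuth
# rotation ends up on the y axis.
view = rotate_view([(1.0, 0.0, 0.0)], azimuth_deg=90.0, elevation_deg=0.0)
```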

  4. Collaborative Analysis Tool for Thermal Protection Systems for Single Stage to Orbit Launch Vehicles

    Science.gov (United States)

    Alexander, Reginald Andrew; Stanley, Thomas Troy

    1999-01-01

    Presented is a design tool and process that connects several disciplines which are needed in the complex and integrated design of high-performance reusable single stage to orbit (SSTO) vehicles. Every system is linked to every other system, and in the case of SSTO vehicles with air-breathing propulsion, which is currently being studied by the National Aeronautics and Space Administration (NASA), the thermal protection system (TPS) is linked directly to almost every major system. The propulsion system pushes the vehicle to velocities on the order of 15 times the speed of sound in the atmosphere before pulling up to go to orbit, which results in high temperatures on the external surfaces of the vehicle. Thermal protection systems to maintain the structural integrity of the vehicle must be able to mitigate the heat transfer to the structure and be lightweight. Herein lies the interdependency: as the vehicle's speed increases, the TPS requirements increase, and as TPS masses increase, the effect on the propulsion system and all other systems is compounded. To adequately determine insulation masses for a vehicle such as the one described above, the aeroheating loads must be calculated and the TPS thicknesses must be calculated for the entire vehicle. To accomplish this, an ascent or reentry trajectory is obtained using the computer code Program to Optimize Simulated Trajectories (POST). The trajectory is then used to calculate the convective heat rates at several locations on the vehicle using the Miniature Version of the JA70 Aerodynamic Heating Computer Program (MINIVER). Once the heat rates are defined for each body point on the vehicle, the insulation thicknesses required to maintain the vehicle within structural limits are calculated using Systems Improved Numerical Differencing Analyzer (SINDA) models. If the TPS masses are too heavy for the performance of the vehicle, the process may be repeated, altering the trajectory or some other input to reduce…
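
    The insulation sizing step can be illustrated with the steady one-dimensional conduction limit q = kΔT/L. The real process uses transient SINDA models driven by MINIVER heat rates, so the function and all numbers below are only a back-of-envelope sketch:

```python
def insulation_thickness(conductivity, t_hot, t_structure_limit, q_flux):
    """Steady 1-D conduction sizing: for q = k * dT / L, the thickness
    that keeps the structure at its temperature limit is
    L = k * (T_hot - T_limit) / q."""
    return conductivity * (t_hot - t_structure_limit) / q_flux

# Illustrative numbers (assumed): insulation k = 0.05 W/(m K),
# surface at 1200 K, structural limit 450 K, allowable flux 500 W/m^2.
L = insulation_thickness(0.05, 1200.0, 450.0, 500.0)
```

    Multiplying such thicknesses by insulation density over each body point gives the TPS mass that feeds back into the vehicle performance loop described above.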

  5. Application of Diagnostic Analysis Tools to the Ares I Thrust Vector Control System

    Science.gov (United States)

    Maul, William A.; Melcher, Kevin J.; Chicatelli, Amy K.; Johnson, Stephen B.

    2010-01-01

    The NASA Ares I Crew Launch Vehicle is being designed to support missions to the International Space Station (ISS), to the Moon, and beyond. The Ares I is undergoing design and development utilizing commercial-off-the-shelf tools and hardware when applicable, along with cutting edge launch technologies and state-of-the-art design and development. In support of the vehicle s design and development, the Ares Functional Fault Analysis group was tasked to develop an Ares Vehicle Diagnostic Model (AVDM) and to demonstrate the capability of that model to support failure-related analyses and design integration. One important component of the AVDM is the Upper Stage (US) Thrust Vector Control (TVC) diagnostic model-a representation of the failure space of the US TVC subsystem. This paper first presents an overview of the AVDM, its development approach, and the software used to implement the model and conduct diagnostic analysis. It then uses the US TVC diagnostic model to illustrate details of the development, implementation, analysis, and verification processes. Finally, the paper describes how the AVDM model can impact both design and ground operations, and how some of these impacts are being realized during discussions of US TVC diagnostic analyses with US TVC designers.

  6. MOSES - A modelling tool for the analysis of scenarios of the European electricity supply system

    Science.gov (United States)

    Weitemeyer, S.; Feck, T.; Agert, C.

    2012-10-01

    Recent studies have shown that a transition of the current power supply system in Europe to a system almost entirely based on fluctuating Renewable Energy Sources (RES) by mid-century is possible. However, most of these scenarios require a significant amount of back-up power capacities to ensure the security of electricity supply. This would imply high additional investments and operating costs. Hence, alternative options should be investigated first. Here we present a first outlook of our simulation model MOSES which will be able to analyse different target states of the European electricity system in 2050. In this model long-term meteorological data series are used to optimise the capacity mix of RES in Europe. One of the main elements of our tool is a simplified electricity network. In addition, alternative options for reduction of additional back-up power like the expansion of the transmission grid, the use of demand-side management and/or the installation of over-capacities will be implemented. The results will be used to provide scientifically proven recommendations to policy makers for a reliable energy supply system in Europe based on Renewable Energy Sources.

  7. Shot Planning and Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  8. Trigonometric regressive spectral analysis: an innovative tool for evaluating the autonomic nervous system.

    Science.gov (United States)

    Ziemssen, Tjalf; Reimann, Manja; Gasch, Julia; Rüdiger, Heinz

    2013-09-01

Biological rhythms, describing the temporal variation of biological processes, are a characteristic feature of complex systems. The analysis of biological rhythms can provide important insights into the pathophysiology of different diseases, especially in cardiovascular medicine. In the field of the autonomic nervous system, heart rate variability (HRV) and baroreflex sensitivity (BRS) describe important fluctuations of blood pressure and heart rate which are often analyzed by Fourier transformation. However, these parameters are stochastic with superimposed rhythmical structures, and R-R intervals, as independent variables of time, are not equidistant. That is why the trigonometric regressive spectral (TRS) analysis--reviewed in this paper--was introduced, considering both the statistical and rhythmical features of such time series. The data segments required for TRS analysis can be as short as 20 s, allowing for dynamic evaluation of heart rate and blood pressure interaction over longer periods. Beyond HRV, TRS also estimates BRS based on linear regression analyses of coherent heart rate and blood pressure oscillations. An additional advantage is that all oscillations are analyzed by the same (maximal) number of R-R intervals, thereby providing a high number of individual BRS values. This ensures a high confidence level of BRS determination which, along with short recording periods, may be of profound clinical relevance. The dynamic assessment of heart rate and blood pressure spectra by TRS allows a more precise evaluation of cardiovascular modulation under different settings, as has already been demonstrated in different clinical studies. PMID:23812502
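The core idea, fitting trigonometric regressions directly to non-equidistant R-R samples instead of applying a Fourier transform, can be illustrated with a least-squares sketch. This is only an illustration of the principle, not the published TRS algorithm; the frequency grid, segment length, and synthetic series are assumptions:

```python
# Illustrative sketch: trigonometric regression on unevenly sampled data,
# in the spirit of TRS. All numerical choices here are assumptions.
import numpy as np

def trig_regression(t, x, freq):
    """Least-squares fit of a*cos + b*sin + c at one frequency; returns amplitude."""
    w = 2.0 * np.pi * freq
    A = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    return float(np.hypot(coef[0], coef[1]))

def dominant_frequency(t, x, freqs):
    """Scan candidate frequencies; return the one with the largest fitted amplitude."""
    amps = [trig_regression(t, x, f) for f in freqs]
    return freqs[int(np.argmax(amps))], max(amps)

# Synthetic, unevenly spaced series with a 0.1 Hz oscillation (noise-free toy case)
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 60.0, 80))        # 60 s segment, uneven sampling
x = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t)
f_best, amp = dominant_frequency(t, x, np.arange(0.02, 0.4, 0.01))
```

Because the regression is evaluated at arbitrary sample times, no resampling or interpolation of the R-R series is needed, which is the practical advantage the abstract emphasizes.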

  9. The revised NEUROGES-ELAN system: An objective and reliable interdisciplinary analysis tool for nonverbal behavior and gesture.

    Science.gov (United States)

    Lausberg, Hedda; Sloetjes, Han

    2016-09-01

    As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods. PMID:26428913

  11. State-of-the-art Tools and Techniques for Quantitative Modeling and Analysis of Embedded Systems

    DEFF Research Database (Denmark)

    Bozga, Marius; David, Alexandre; Hartmanns, Arnd;

    2012-01-01

This paper surveys well-established and recent tools and techniques developed for the design of rigorous embedded systems. We will first survey UPPAAL and MODEST, two tools capable of dealing with both timed and stochastic aspects. Then, we will overview the BIP framework for modular design and c...

  12. On the use of financial analysis tools for the study of Dst time series in the frame of complex systems

    CERN Document Server

    Potirakis, Stelios M; Balasis, Georgios; Eftaxias, Konstantinos

    2016-01-01

Technical analysis is considered the oldest, currently omnipresent, method for financial markets analysis, which uses past prices aiming at the possible short-term forecast of future prices. In the frame of complex systems, methods used to quantitatively analyze specific dynamic phenomena are often used to analyze phenomena from other disciplines on the grounds that they are governed by similar dynamics. An interesting task is the forecast of a magnetic storm. The hourly Dst is used as a global index for the monitoring of Earth's magnetosphere, which could be either in a quiet (normal) or in a magnetic storm (pathological) state. This work is the first attempt to apply technical analysis tools to Dst time series, aiming at the identification of indications which could be used for the study of the temporal evolution of Earth's magnetosphere state. We focus on the analysis of Dst time series around the occurrence of magnetic storms, discussing the possible use of the resulting information in the frame of multidisciplina...
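As an illustration of what transferring a technical-analysis tool to Dst data might look like, here is a sketch of a moving-average crossover indicator applied to an invented Dst-like series. The window lengths and the series itself are assumptions for illustration, not the paper's method or data:

```python
# Illustrative sketch: moving-average crossover on a Dst-like series.
# Windows and data are invented; real hourly Dst would use longer windows.
def moving_average(x, w):
    """Simple trailing moving average; output aligns to indices w-1..len(x)-1."""
    return [sum(x[i - w + 1:i + 1]) / w for i in range(w - 1, len(x))]

def crossover_signals(x, short=3, long=6):
    """Indices where the short MA crosses below the long MA (storm-onset-like)."""
    s, l = moving_average(x, short), moving_average(x, long)
    off = long - short                     # align the two MA series
    sig = []
    for i in range(1, len(l)):
        if s[i + off] < l[i] and s[i + off - 1] >= l[i - 1]:
            sig.append(i + long - 1)       # index back in the original series
    return sig

# Quiet magnetosphere near 0 nT, then a sudden storm main phase
dst = [0, -2, -1, 0, -1, -2, -5, -40, -80, -120, -100, -80]
signals = crossover_signals(dst)
```

The crossover fires at the transition from the quiet to the storm regime, which is the kind of "indication" the abstract proposes to look for.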

  13. Performance assessment of the Tactical Network Analysis and Planning System Plus (TNAPS+) automated planning tool for C4I systems

    OpenAIRE

    Ziegenfuss, Paul C.

    1999-01-01

The Joint Staff established the Tactical Network Analysis and Planning System Plus (TNAPS+) as the interim joint communications planning and management system. The Marines Command and Control Systems Course and the Army's Joint Task Force System Planning Course both utilize TNAPS+ to conduct tactical C4I network planning in their course requirements. This thesis is a Naval Postgraduate School C4I curriculum practical application of TNAPS+ in an expeditionary Joint Task Force environment, focu...

  14. Ground Data System Analysis Tools to Track Flight System State Parameters for the Mars Science Laboratory (MSL) and Beyond

    Science.gov (United States)

    Allard, Dan; Deforrest, Lloyd

    2014-01-01

Flight software parameters enable space mission operators to exercise fine-tuned control over flight system configurations, enabling rapid and dynamic changes to ongoing science activities in a much more flexible manner than can be accomplished with (otherwise broadly used) configuration-file-based approaches. The Mars Science Laboratory (MSL), Curiosity, makes extensive use of parameters to support complex, daily activities via commanded changes to said parameters in memory. However, as the loss of Mars Global Surveyor (MGS) in 2006 demonstrated, flight system management by parameters brings with it risks, including the possibility of losing track of the flight system configuration and the threat of invalid command executions. To mitigate this risk, a growing number of missions, including MSL and the Soil Moisture Active Passive (SMAP) mission, have funded efforts to implement parameter state tracking software tools and services. This paper will discuss the engineering challenges and resulting software architecture of MSL's onboard parameter state tracking software and discuss the road forward to make parameter management tools suitable for use on multiple missions.

  15. NOAA's Inundation Analysis Tool

    Data.gov (United States)

National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  16. NCC: A Physics-Based Design and Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    2000-01-01

The National Combustion Code (NCC) is an integrated system of computer codes for physics-based design and analysis of combustion systems. It uses unstructured meshes and runs on parallel computing platforms. The NCC is composed of a set of distinct yet closely related modules. They are: (1) a gaseous flow module solving 3-D Navier-Stokes equations; (2) a turbulence module containing the non-linear k-epsilon models; (3) a chemistry module using either the conventional reduced kinetics approach of solving species equations or the Intrinsic Low Dimensional Manifold (ILDM) kinetics approach of table look-up in conjunction with solving the equations of the progress variables; (4) a turbulence-chemistry interaction module including the option of solving the joint probability density function (PDF) for species and enthalpy; and (5) a spray module for solving the liquid phase equations. In early 1995, an industry-government team was formed to develop the NCC. In July 1998, the baseline beta version was completed and presented in two NCC sessions at the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 1998. An overview of this baseline beta version was presented at the NASA HPCCP/CAS Workshop 98, August 1998. Since then, the effort has been focused on the streamlining, validation, and enhancement of the baseline beta version. The progress is presented in two NCC sessions at the AIAA 38th Aerospace Sciences Meeting & Exhibit, January 2000. At this NASA HPCCP/CAS Workshop 2000, an overview of the NCC papers presented at the AIAA 38th Aerospace Sciences Meeting & Exhibit is presented, with emphasis on the reduction of analysis time of simulating the (gaseous) reacting flows in full combustors. In addition, results of NCC simulation of a modern turbofan combustor will also be reported.

  17. The System Cost Model: A tool for life cycle cost and risk analysis

    International Nuclear Information System (INIS)

In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors began development of the System Cost Model (SCM) application. The SCM estimates life cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, and transuranic waste. The SCM uses parametric cost functions to estimate life cycle costs for various treatment, storage, and disposal modules which reflect planned and existing waste management facilities at DOE installations. In addition, SCM can model new TSD facilities based on capacity needs over the program life cycle. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed of). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction, operations and maintenance, and decommissioning these waste management facilities. The SCM also provides transportation costs for DOE wastes. Transportation costs are provided for truck and rail and include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. A complement to the SCM is the System Cost Model-Risk (SCM-R) model, which provides relative Environmental, Safety, and Health (ES&H) risk information. A relative ES&H risk basis has been developed and applied by LITCO at the INEL. The risk basis is now being automated in the SCM-R to facilitate rapid risk analysis of system alternatives. The added risk functionality will allow combined cost and risk evaluation of EM alternatives.
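A parametric life-cycle cost roll-up of the kind the abstract describes can be sketched as below. The phase coefficients and the power-law cost form are invented illustrative assumptions, not actual SCM cost functions or DOE data:

```python
# Illustrative sketch of a parametric life-cycle cost model.
# Coefficients and the fixed + unit * volume**exponent form are assumptions.
def module_cost(volume_m3, fixed, unit, exponent=0.8):
    """Parametric phase cost; exponent < 1 models economies of scale."""
    return fixed + unit * volume_m3 ** exponent

PHASES = {  # hypothetical (fixed $, $ per m^3**0.8) per life-cycle phase
    "pre_operations": (2.0e6, 500.0),
    "construction": (10.0e6, 3000.0),
    "operations": (5.0e6, 8000.0),
    "decommissioning": (1.5e6, 1200.0),
}

def life_cycle_cost(volume_m3):
    """Sum the parametric phase costs over the whole life cycle."""
    return sum(module_cost(volume_m3, f, u) for f, u in PHASES.values())

total = life_cycle_cost(10_000.0)   # 10,000 m^3 of waste through one module
```

In the real SCM the same roll-up would be repeated per TSD module and per installation, with transportation costs added on top.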

  18. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    Science.gov (United States)

    Chatterjee, Sharmista

    1993-01-01

Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of the existing system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulation tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
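The second-law bookkeeping described above can be illustrated with a minimal sketch: the entropy generated by heat transfer across a finite temperature difference, used to rank components by irreversibility. The component names and operating numbers are invented examples, not data from the thesis:

```python
# Illustrative sketch: second-law ranking of components by entropy generation.
# Components and operating points are invented placeholders.
def entropy_generation(q_watts, t_hot_k, t_cold_k):
    """Entropy generated (W/K) by heat flow q across a finite temperature gap."""
    return q_watts * (1.0 / t_cold_k - 1.0 / t_hot_k)

def rank_components(components):
    """Sort components so the largest irreversibility (worst offender) comes first."""
    return sorted(components, key=lambda c: -entropy_generation(*c[1]))

components = [
    ("CO2 removal bed", (400.0, 350.0, 295.0)),   # (Q [W], T_hot [K], T_cold [K])
    ("condensing HX", (1200.0, 310.0, 278.0)),
    ("water processor", (250.0, 330.0, 300.0)),
]
worst = rank_components(components)[0][0]
```

Such a ranking is exactly the kind of output that would point a designer at the component whose redesign most reduces total entropy generation.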

  19. AN ANALYSIS OF THE ELECTROMAGNETIC PROCESSES IN THE INDUCTOR SYSTEMTOOL OF THE STRAIGHTENING OF CAR BODIES

    Directory of Open Access Journals (Sweden)

    Yu. V. Batygin

    2015-04-01

Introduction. One of the promising directions of electromagnetic forming (EMF) is contactless magnetic-pulse straightening of automobile bodies. The efficiency and quality of the straightening depend on the design and operating principle of the straightening tool. Modern EMF practice uses a large number of tools, inductor systems (IS), in different configurations, with an uneven distribution of forces on the object being treated, which does not meet the needs of an effective straightening process. There is therefore an urgent need to create an IS with a highly uniform induced field and a high concentration of attracting forces in the working area of the tool. The most effective IS are the inductor systems with an attracting screen (ISAS). One of the most important considerations when choosing a particular ISAS design is the study of the electrodynamic processes, including determination of the excited loads; the nature and course of these processes, together with the design features, determine the effectiveness and efficiency of the ISAS. Therefore, an additional coil should be introduced into the ISAS to concentrate the attracting forces in the working area. Purpose. A numerical analysis of the induced fields and currents in experimental models of an ISAS with an additional coil was made. Methodology. In the idealization of «extremely low» frequencies of the acting fields, dependences were derived for the density of the induced currents and the distributed attracting force in the ISAS and the external additional coil, using a calculation model in a cylindrical coordinate system. Results. Inserting the additional coil placed over the accessory screen makes it possible to concentrate and increase the amplitude of the attracting forces in the central part of the working area of the inductor system. Practical value. 1. Numerical analysis of fields and currents in experimental models of Induction

  20. Understanding Earthquake Fault Systems Using QuakeSim Analysis and Data Assimilation Tools

    Science.gov (United States)

    Donnellan, Andrea; Parker, Jay; Glasscoe, Margaret; Granat, Robert; Rundle, John; McLeod, Dennis; Al-Ghanmi, Rami; Grant, Lisa

    2008-01-01

We are using the QuakeSim environment to model interacting fault systems. One goal of QuakeSim is to prepare for the large volumes of data that spaceborne missions such as DESDynI will produce. QuakeSim has the ability to ingest distributed heterogeneous data in the form of InSAR, GPS, seismicity, and fault data into various earthquake modeling applications, automating the analysis when possible. Virtual California simulates interacting faults in California. We can compare output from long time history Virtual California runs with the current state of strain and the strain history in California. In addition to spaceborne data we will begin assimilating data from UAVSAR airborne flights over the San Francisco Bay Area, the Transverse Ranges, and the Salton Trough. Results of the models are important for understanding future earthquake risk and for providing decision support following earthquakes. Improved models require this sensor web of different data sources, and a modeling environment for understanding the combined data.

  1. OCAM - A CELSS modeling tool: Description and results. [Object-oriented Controlled Ecological Life Support System Analysis and Modeling

    Science.gov (United States)

    Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray

    1992-01-01

    Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.

  2. Analysis of simulation tools for the study of advanced marine power systems

    OpenAIRE

    Brochard, Paul Eugene

    1992-01-01

    The United States Navy is at a crossroads in the design of ship's engineering plants. Advances in solid-state power electronics combined with a shift to gas turbine powered propulsion and electric plants has placed renewed emphasis on developing advanced power systems. These advanced power systems may combine the prime movers associated with propulsion and electric power generation into an integrated system. The development of advanced electric distribution systems and propulsion derived ship...

  3. Real-time systems design and analysis tools for the practitioner

    CERN Document Server

    Laplante, Phillip A

    2012-01-01

    An important resource, this book offers an introduction and overview of real-time systems: systems where timeliness is a crucial part of the correctness of the system. It contains a pragmatic overview of key topics (computer architecture and organization, operating systems, software engineering, programming languages, and compiler theory) from the perspective of the real-time systems designer and is organized into chapters that are essentially self-contained. In addition, each chapter contains both basic and more challenging exercises that will help the reader to confront actual problems.

  4. Bifurcation Tools for Flight Dynamics Analysis and Control System Design Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the project is the development of a computational package for bifurcation analysis and advanced flight control of aircraft. The development of...

  5. IQARIS : a tool for the intelligent querying, analysis, and retrieval from information systems

    International Nuclear Information System (INIS)

    Information glut is one of the primary characteristics of the electronic age. Managing such large volumes of information (e.g., keeping track of the types, where they are, their relationships, who controls them, etc.) can be done efficiently with an intelligent, user-oriented information management system. The purpose of this paper is to describe a concept for managing information resources based on an intelligent information technology system developed by the Argonne National Laboratory for managing digital libraries. The Argonne system, Intelligent Query (IQ), enables users to query digital libraries and view the holdings that match the query from different perspectives

  6. Advanced Numerical Tools for Design and Analysis of In-Space, Valve and Feed Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In-space valves for the main fuel and oxidizer feed systems are required to provide precise control, wide throttling range and handle rapid on-off control. These...

  7. Characterization of components of water supply systems from GPR images and tools of intelligent data analysis.

    OpenAIRE

    Ayala Cabrera, David

    2015-01-01

Over time, due to multiple operational and maintenance activities, the networks of water supply systems (WSSs) undergo interventions and modifications, or are even closed. In many cases, these activities are not properly registered. Knowledge of the paths and characteristics (status and age, etc.) of the WSS pipes is obviously necessary for efficient and dynamic management of such systems. This problem is greatly compounded when the detection and control of leaks are considered. Access to reliable...

  8. Analysis techniques for multivariate root loci. [a tool in linear control systems

    Science.gov (United States)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
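A minimal numerical illustration of the eigenvalue viewpoint behind a multivariable root locus: sweep a scalar gain k and compute the closed-loop eigenvalues of A - kBC. This sketch mirrors the eigenvalue perspective only, not the paper's generalized-eigenvalue algorithm; the double-integrator plant is an assumed example:

```python
# Illustrative sketch: root locus as closed-loop eigenvalues vs. gain.
# The plant and output matrix are assumed toy examples.
import numpy as np

def root_locus(A, B, C, gains):
    """Closed-loop eigenvalues of A - k*B*C for each scalar gain k."""
    return [np.linalg.eigvals(A - k * (B @ C)) for k in gains]

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # double integrator
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 1.0]])               # PD-like output y = x1 + x2
loci = root_locus(A, B, C, gains=[0.0, 1.0, 4.0])
```

For this plant the closed-loop characteristic polynomial is s^2 + ks + k, so the locus starts at the open-loop double pole at the origin and moves into the left half-plane as k grows.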

  9. Microfluidic housing system: a useful tool for the analysis of dye-sensitized solar cell components

    Science.gov (United States)

    Sacco, A.; Lamberti, A.; Pugliese, D.; Chiodoni, A.; Shahzad, N.; Bianco, S.; Quaglio, M.; Gazia, R.; Tresso, E.; Pirri, C. F.

    2012-11-01

In order to understand the behavior of the different dye-sensitized solar cell (DSC) components, an in-situ analysis would give fundamental help, but it cannot be performed without compromising the integrity of the cell. Our recently proposed microfluidic approach to the fabrication of DSCs is based on a reversible sealing of the two transparent electrodes, and it allows the easy assembling and disassembling of the cell, making an analysis of the components over time possible. The aim of this work is not to investigate the different degradation mechanisms of a standard DSC: we want to show that, by using a microfluidic architecture, it is possible to perform a non-destructive analysis and to monitor the photoanode and counter electrode properties during their lifetime. Morphological (field emission scanning electron microscopy), wetting (contact angle), optical (UV-visible spectroscopy) and electrical (current-voltage and electrochemical impedance spectroscopy measurements under standard AM1.5G illumination) characterizations have been performed over a period of three weeks. The results show how the variation of the wetting and morphological properties at the counter electrode, and of the dye absorbance at the photoanode, is strongly related to the decrease of the cell performance as evidenced by electrical characterization, thus demonstrating the effectiveness of our structure in this kind of study.

  10. Physics Analysis Tools Workshop 2007

    CERN Multimedia

Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  11. Common Analysis Tool Being Developed for Aeropropulsion: The National Cycle Program Within the Numerical Propulsion System Simulation Environment

    Science.gov (United States)

    Follen, Gregory J.; Naiman, Cynthia G.

    1999-01-01

    The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines-the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.

  12. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following ground: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; and Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  13. Geographic Information System and tools of spatial analysis in a pneumococcal vaccine trial

    Directory of Open Access Journals (Sweden)

    Tanskanen Antti

    2012-01-01

Background. The goal of this Geographic Information System (GIS) study was to obtain accurate information on the locations of study subjects, the road network and services for research purposes, so that the clinical outcomes of interest (e.g., vaccine efficacy, burden of disease, nasopharyngeal colonization and its reduction) could be linked to and analyzed against distance from health centers, hospitals, doctors and other important services. The information on locations can be used to investigate crowdedness, herd immunity and/or transmission patterns more accurately. Method. A randomized, placebo-controlled, double-blind trial of an 11-valent pneumococcal conjugate vaccine (11PCV) was conducted in Bohol Province in the central Philippines, from July 2000 to December 2004. We collected information on the geographic location of the households (N = 13,208) of study subjects. We also collected a total of 1,982 locations of health and other services in the six municipalities and comprehensive GIS data on the road network in the area. Results. We calculated the numbers of other study subjects (vaccine and placebo recipients, respectively) within the neighborhood of each study subject. We calculated distances to different services and identified the subjects sharing the same services (determined by distance). This article shows how to collect a complete GIS data set for a vaccine study of a human-to-human transmitted disease in a developing-country setting in an efficient and economical way. Conclusions. The collection of geographic locations in intervention trials should become a routine task. The results of public health research may depend strongly on spatial relationships among the study subjects and between the study subjects and the environment, both natural and infrastructural. Trial registration number ISRCTN: ISRCTN62323832.
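The neighborhood computation described in the results, counting other study subjects within a radius of each household, can be sketched with a great-circle distance and a radius query. The coordinates below are invented placeholders, not study data:

```python
# Illustrative sketch: radius query over household coordinates using the
# haversine great-circle distance. Coordinates are invented placeholders.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def neighbors_within(households, idx, radius_m):
    """Indices of other households within radius_m of household idx."""
    lat, lon = households[idx]
    return [j for j, (la, lo) in enumerate(households)
            if j != idx and haversine_m(lat, lon, la, lo) <= radius_m]

homes = [(9.850, 124.143), (9.851, 124.143), (9.900, 124.200)]
near = neighbors_within(homes, 0, 500.0)
```

With vaccine/placebo status attached to each household, the same query yields the per-subject neighbor counts the study uses for crowdedness and herd-immunity analyses; at 13,208 households a spatial index would replace the brute-force scan.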

  14. Tools for income mobility analysis

    OpenAIRE

    Philippe Kerm

    2002-01-01

A set of Stata routines to help with the analysis of 'income mobility' is presented and illustrated. Income mobility is taken here as the pattern of income change from one time period to another within an income distribution. Multiple approaches have been advocated to assess the magnitude of income mobility. The macros presented provide tools for estimating several measures of income mobility, e.g. the Shorrocks (JET 1978) or King (Econometrica 1983) indices or summary statistics for transition matri...
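One of the measures mentioned, the Shorrocks (JET 1978) index, is simple to compute from a transition matrix: M(P) = (n - trace(P)) / (n - 1), where n is the number of income classes. A sketch (the example matrix is invented, and the real routines are Stata macros, not Python):

```python
# Illustrative sketch: Shorrocks mobility index from a transition matrix.
def shorrocks_index(p):
    """M(P) = (n - trace(P)) / (n - 1); 0 = perfectly immobile."""
    n = len(p)
    trace = sum(p[i][i] for i in range(n))
    return (n - trace) / (n - 1)

# 3-class transition matrix: rows = origin income class, cols = destination
p = [
    [0.7, 0.2, 0.1],
    [0.2, 0.6, 0.2],
    [0.1, 0.2, 0.7],
]
m = shorrocks_index(p)
```

The identity matrix (everyone stays in their class) scores 0, while off-diagonal mass pushes the index up, which is why the diagonal of the transition matrix is the natural summary of immobility.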

  15. User's Guide and Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information

    Science.gov (United States)

    ABSTRACT: User's Guide & Metadata to Coastal Biodiversity Risk Analysis Tool (CBRAT): Framework for the Systemization of Life History and Biogeographic Information (EPA/601/B-15/001, 2015, 123 pages). Henry Lee II, U.S. EPA, Western Ecology Division; Katharine Marko, U.S. EPA,...

  16. A Simulation of Energy Storage System for Improving the Power System Stability with Grid-Connected PV using MCA Analysis and LabVIEW Tool

    Directory of Open Access Journals (Sweden)

    Jindrich Stuchly

    2015-01-01

    Full Text Available The large-scale penetration of distributed renewable power plants requires transfers of large amounts of energy. This, in turn, puts a high strain on the energy delivery infrastructure. In particular, photovoltaic power plants supply energy with high intermittency, possibly affecting the stability of the grid by changing the voltage at the plant connection point. In this contribution, we summarize the main negative effects of a selected, real-operated grid-connected photovoltaic plant. Thereafter, a review of suitable energy storage systems to mitigate the negative effects has been carried out, and the candidate systems are compared and evaluated using multi-criterion analysis. Based on this analysis, data collected at the plant and the grid are used to design the energy storage systems to support connection of the plant to the grid. The cooperation of these systems is then analysed and evaluated using simulation tools created in LabVIEW for this purpose. The simulation results demonstrate the capability of energy storage system solutions to significantly reduce the negative feedback effects of the photovoltaic power plant on the low voltage grid.

  17. From sensor networks to connected analysis tools

    Science.gov (United States)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  18. Analysis of Effect on Clamping Mechanism and Structure of Tool System for High-Speed Machine Tools

    Institute of Scientific and Technical Information of China (English)

    张志梅; 安虎平; 王锐锋; 芮志元

    2012-01-01

    Addressing the requirements that high-speed machine tools place on the tool system and the problems found in practice, the key techniques for selecting the cross-sectional shape and structure of the toolholder shank are discussed. A rational locating scheme is determined: simultaneous location on the end face and the taper, with a toolholder built as a short hollow cone of small taper. The results show that the HSK toolholder provides accurate location, reliable connection, high system rigidity, good vibration resistance and good manufacturability. Through analysis of the forces acting in the clamping mechanism and calculation of its amplification coefficient, the factors affecting the clamping force and clamping effect, and the laws by which they vary, are presented, which can serve as a reference for the design of tool systems.

  19. A Tool and Methodology for AC-Stability Analysis of Continuous-Time Closed-Loop Systems

    CERN Document Server

    Milev, Momchil

    2011-01-01

    Presented are a methodology and a DFII-based tool for AC-stability analysis of a wide variety of closed-loop continuous-time circuits (operational amplifiers and other linear circuits). The methodology used allows for easy identification and diagnostics of ac-stability problems, including not only main-loop effects but also local-instability loops in current mirrors, bias circuits and emitter or source followers, without breaking the loop. The results of the analysis are easy to interpret. Estimated phase margin is readily available. Instability nodes and loops along with their respective oscillation frequencies are immediately identified and mapped to the existing circuit nodes, thus offering significant advantages compared to traditional "black-box" methods of stability analysis (transient overshoot, Bode and phase margin plots, etc.). The tool for AC-stability analysis is written in SKILL and is fully integrated in the DFII environment. Its "push-button" graphical user interface (GUI) is easy to use and understand. The t...

  20. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  1. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-Rex. This talk is a combination of existing presentations; a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  2. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...... are specified using the temporal logic Probabilistic Computation Tree Logic (PCTL) and we employ stochastic model checking, by means of the model checker PRISM, to compute their exact values. We present a simplified example of a distributed stochastic system where we determine a reachability property...
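The reachability probabilities that a stochastic model checker computes for a PCTL property such as P=? [ F "done" ] reduce, on a discrete-time Markov chain, to a linear system over the transient states. A small illustrative sketch on a hypothetical chain (not the SBAT/PRISM models from the paper):

```python
import numpy as np

# Hypothetical DTMC. States: 0 = start, 1 = retry, 2 = done (target),
# 3 = failed (absorbing). P[s, t] is the one-step transition probability.
P = np.array([
    [0.0, 0.5, 0.4, 0.1],
    [0.3, 0.0, 0.5, 0.2],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])
target, transient = 2, [0, 1]

# For each transient state s: x_s = sum_t P[s, t] * x_t, with x_target = 1
# and x_failed = 0, i.e. solve (I - P_tt) x = P_t,target.
A = np.eye(len(transient)) - P[np.ix_(transient, transient)]
b = P[np.ix_(transient, [target])].ravel()
x = np.linalg.solve(A, b)
print(x)  # probability of eventually reaching "done" from each transient state
```

Here x[0] = 0.65/0.85 ≈ 0.765: from "start" the chain reaches "done" with probability about 76.5% before being absorbed in "failed".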

  3. Flow Injection/Sequential Injection Analysis Systems: Potential Use as Tools for Rapid Liver Diseases Biomarker Study

    Directory of Open Access Journals (Sweden)

    Supaporn Kradtap Hartwell

    2012-01-01

    Full Text Available Flow injection/sequential injection analysis (FIA/SIA) systems are suitable for carrying out automatic wet chemical/biochemical reactions with reduced volume and time consumption. Various parts of the system, such as pump, valve, and reactor, may be built or adapted from available materials. Therefore the systems can be at lower cost as compared to other instrumentation-based analysis systems. Their applications for determination of biomarkers for liver diseases have been demonstrated in various formats of operation, but only a few and limited types of biomarkers have been used as model analytes. This paper summarizes these applications for different types of reactions as a guide for using flow-based systems in more biomarker and/or multibiomarker studies.

  4. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
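Merging the per-cube coordinate systems in steps 1 through 4 is a rigid-transform estimation problem; one standard way to solve it (not necessarily the NASA tool's actual method) is the Kabsch algorithm, sketched here on hypothetical cube-corner coordinates:

```python
import numpy as np

def rigid_transform(A, B):
    """Kabsch algorithm: find rotation R and translation t mapping the rows
    of A (3-D points) onto the rows of B in the least-squares sense."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Hypothetical: the same calibration-cube corners seen in two local frames,
# differing by a 30-degree yaw and a translation.
cube_in_frame2 = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
theta = np.pi / 6
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1.0]])
cube_in_frame1 = cube_in_frame2 @ Rz.T + np.array([2.0, -1.0, 0.5])

R, t = rigid_transform(cube_in_frame2, cube_in_frame1)
# Any point measured in frame 2 can now be expressed in frame 1:
p_frame1 = R @ np.array([0.5, 0.5, 0.0]) + t
```

Chaining such transforms over all cube pairs yields the single global coordinate system described in step 4.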

  5. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of uncontrolled spacecraft may be broken into many pieces of debris at an altitude in the range of 75-85 km. The surviving fragments could pose great hazard and risk to ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and ground risk assessment have been studied and developed in the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper reviews the current progress on this topic of debris reentry briefly. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and parameters affecting survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. The past object-oriented tools include objects of only simple shapes. For more realistic simulation, here we present an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types, as well as 51 predefined motions and relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.

  6. Control system for borehole tools

    Energy Technology Data Exchange (ETDEWEB)

    Bordon, E.E.

    1987-03-10

    A control assembly is described for use with a tool including one or more subassemblies adapted for controlling and/or monitoring various events within a borehole and actuating instrumentation positioned on the earth's surface for actuating the tool. The assembly comprises: control means connected to the tool for selectively actuating one or more of the subassemblies within the tool, the control means being adapted for operation within the borehole, power supply means connected to the tool for supplying electrical power to the control means for operation thereof independent of the surface actuating instrumentation, communication means connected to the surface actuating instrumentation for communicating therewith, and connection means for selectively connecting the communication means to the control means while the tool and the control means connected thereto are within the borehole to establish communication between the control means and the surface actuating instrumentation. The connection means is adapted for operation within the borehole.

  7. Simulating the Farm Production System Using the MONARC Simulation Tool

    Institute of Scientific and Technical Information of China (English)

    Y. Wu; I.C. Legrand; et al.

    2001-01-01

    The simulation program developed by the "Models of Networked Analysis at Regional Centers" (MONARC) project is a powerful and flexible tool for simulating the behavior of large-scale distributed computing systems. In this study, we further validate this simulation tool in a large-scale distributed farm computing system. We also report the usage of this simulation tool to identify the bottlenecks and limitations of our farm system.

  8. System of Objectified Judgement Analysis (SOJA) as a tool in rational and transparent drug-decision making.

    Science.gov (United States)

    Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob

    2007-10-01

    Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug-selection was developed. This article describes how the information on which the SOJA process is based, was researched and processed.

  9. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    New insights into the sustainable use of natural resources in human systems can be gained through comparison with ecosystems via common indices. In both kinds of system, resources are processed by a number of users within a network, but we consider ecosystems as the only ones displaying sustainable patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices...: it is highly efficient at processing the water resource, but the rigid and almost linear structure makes it vulnerable in situations of stress such as heavy rain events. The analysis of future scenarios showed a trend towards increased sustainability, but differences between past and expected future...

  10. Functional analysis, a resilience improvement tool applied to a waste management system – application to the "household waste management chain"

    Directory of Open Access Journals (Sweden)

    H. Beraud

    2012-12-01

    Full Text Available A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving its resilience, i.e. enabling it to maintain or recover acceptable operating levels after flooding, is primordial. To achieve this, we must understand how the system works for bringing any potential dysfunctions to light and taking preventive measures. Functional analysis has been used for understanding the complexity of this type of system. The purpose of this article is to show the interest behind this type of method and the limits in its use for improving the resilience of waste management systems as well as other urban technical systems1, by means of theoretical modelling and its application on a study site.


    1In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  11. Functional analysis, a resilience improvement tool applied to a waste management system - application to the "household waste management chain"

    Science.gov (United States)

    Beraud, H.; Barroca, B.; Hubert, G.

    2012-12-01

    A waste management system plays a leading role in the capacity of an area to restart after flooding, as its impact on post-crisis management can be very considerable. Improving its resilience, i.e. enabling it to maintain or recover acceptable operating levels after flooding, is primordial. To achieve this, we must understand how the system works for bringing any potential dysfunctions to light and taking preventive measures. Functional analysis has been used for understanding the complexity of this type of system. The purpose of this article is to show the interest behind this type of method and the limits in its use for improving the resilience of waste management systems as well as other urban technical systems1, by means of theoretical modelling and its application on a study site. 1In a systemic vision of the city, urban technical systems combine all the user service systems that are essential for the city to operate (electricity, water supplies, transport, sewerage, etc.). These systems are generally organised in the form of networks (Coutard, 2010; CERTU, 2005).

  12. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
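The "detection sensitivity to the loss of certain sensors or tests" study can be illustrated with a toy test-to-fault dependency matrix; all names below are hypothetical and not taken from the Ares I model:

```python
# Hypothetical dependency matrix D: rows are tests, columns are faults,
# 1 means the test can detect the fault. Fault coverage is the fraction of
# faults detected by at least one still-available test.
D = {
    "t_pressure": {"leak": 1, "pump_fail": 1, "sensor_drift": 0},
    "t_flow":     {"leak": 1, "pump_fail": 0, "sensor_drift": 0},
    "t_voltage":  {"leak": 0, "pump_fail": 1, "sensor_drift": 1},
}
faults = ["leak", "pump_fail", "sensor_drift"]

def coverage(D, faults, removed=()):
    """Fraction of faults detectable after removing the given tests."""
    avail = [t for t in D if t not in removed]
    detected = sum(1 for f in faults if any(D[t][f] for t in avail))
    return detected / len(faults)

base = coverage(D, faults)                          # all tests available
drop = coverage(D, faults, removed=("t_voltage",))  # sensor_drift goes undetected
print(base, drop)
```

Sweeping `removed` over every test (or sensor group) ranks the tests by how much coverage their loss costs, which is the essence of the sensitivity study.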

  13. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes during metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with lots of rejected parts, scrap generation and time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters, to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive on the front, mounted at the end of a robot. A tool-to-part concept is used, useful for large or heavy workpieces. Results are applied on different curved parts typically used in the tooling industry, aeronautics or automotive. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
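The record does not reproduce the article's fitted removal model; the classic starting point for such models is Preston's law, in which removal depth scales with pressure, relative speed and dwell time. A minimal sketch with assumed parameter values (illustration only):

```python
# Preston's law: removed depth h = k_p * p * v * t, with Preston coefficient
# k_p (material/abrasive dependent, assumed here), contact pressure p,
# relative tool-surface speed v, and dwell time t.
def preston_removal_um(k_p, pressure_pa, speed_m_s, dwell_s):
    """Removed depth in micrometres, assuming uniform pressure and speed."""
    return k_p * pressure_pa * speed_m_s * dwell_s * 1e6  # m -> um

# Assumed values for illustration only:
h = preston_removal_um(k_p=1e-13, pressure_pa=5e4, speed_m_s=2.0, dwell_s=10.0)
print(f"{h:.3f} um removed")
```

In practice k_p is fitted per abrasive/material pair, which mirrors the paper's fitting of its model for different abrasives and raw materials.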

  14. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
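Finding high-risk paths in such a weighted attack graph is a shortest-path computation: with edges weighted by attacker effort, the least-cost path is the highest-risk one, and the patent's "epsilon optimal paths" generalize this to near-optimal path sets. A sketch with Dijkstra's algorithm on a hypothetical graph (node names and weights are invented):

```python
import heapq

# Hypothetical attack graph: nodes are attack states, edge weights are
# attacker effort; a lower total effort means a higher-risk path.
graph = {
    "start":       [("probe", 1.0), ("phish", 3.0)],
    "probe":       [("exploit_web", 4.0), ("phish", 1.5)],
    "phish":       [("user_creds", 2.0)],
    "exploit_web": [("root", 5.0)],
    "user_creds":  [("root", 2.5)],
    "root":        [],
}

def least_effort_path(graph, src, dst):
    """Dijkstra's shortest path; returns (total effort, node sequence)."""
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

cost, path = least_effort_path(graph, "start", "root")
```

Countermeasures would then be targeted at edges on this path and on any path within epsilon of its cost.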

  15. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  16. Implementation of cutting tool management system

    Directory of Open Access Journals (Sweden)

    G. Svinjarević

    2007-07-01

    Full Text Available Purpose: of this paper is to show the benefits of implementation of cutting tool management in a company which specializes in metal cutting processes, whose production conditions allow new possibilities for improvement of the tool management. Design/methodology/approach: applied in this paper was identification of the current state and exploitation conditions of cutting tools on lathes and milling machines, and of the organization of the departments and other services which are directly involved in the cutting tool management system. Findings: of the controlled tests and analyses in every phase of tool management, in departments and other services which are directly involved in the tool management system, will help to reduce stock and costs. It is possible to identify which operator makes errors and is responsible for inappropriate use of a cutting tool. Some disadvantages have been identified and a few suggestions for the improvement of the tool management system have been given. The results of the research are easy to apply in a company with a developed IT infrastructure and are mostly interesting for CNC workshops. Small companies and specialized low-volume productions have to make an additional effort to integrate in clusters. Practical implications: are reduction of cutting tools on stock, reduction of employees, quick access to the necessary cutting tools and data, and simplicity in tool ordering and supply. The most important is the possibility to monitor and to identify which cutting tools and employees are the weakest parts of the chain in the tool management system. Management activity should be foreseeable in all its segments, which includes both the appropriate choice and use of cutting tools, and monitoring of unwanted phenomena during the cutting process and usage of these data for further purchase of tools. Originality/value: in the paper the turnover methodology is applied for determination of management efficacy and formation of employees from different departments in

  17. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how the results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  18. Multi-mission telecom analysis tool

    Science.gov (United States)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  19. A Comparative Analysis of Life-Cycle Assessment Tools for End-of-Life Materials Management Systems

    Science.gov (United States)

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the Waste Reduction Model (WARM), municipal s...

  20. Modelling of safety fieldbus system via SW tool SHARPE

    OpenAIRE

    Maria Franekova; Jan Rofar

    2008-01-01

    This paper deals with the modelling of a safety-related Fieldbus communication system, which has to guarantee a Safety Integrity Level (SIL) according to the standard IEC 61508. Methods of safety analysis for closed safety Fieldbus transmission systems are summarized. The main part describes the modelling SW tool SHARPE. The realized models are based on Fault Tree Analysis (FTA) and Markov analysis.
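The Markov side of such an analysis can be sketched in a few lines. This is an illustrative toy model, not the paper's actual SHARPE model; the failure and repair rates below are assumed values.

```python
# Toy two-state (up/down) continuous-time Markov chain for a safety
# channel, of the kind SHARPE-style tools evaluate. Rates are invented.

def steady_state_availability(lam: float, mu: float) -> float:
    """Closed-form steady-state availability A = mu / (lam + mu)
    for a single repairable component with constant rates."""
    return mu / (lam + mu)

lam = 1e-5   # failure rate, per hour (assumed)
mu = 0.125   # repair rate, per hour, i.e. 8 h mean time to repair (assumed)

A = steady_state_availability(lam, mu)
print(f"availability    = {A:.6f}")
print(f"unavailability  = {1 - A:.2e}")
```

Real safety models add more states (detected vs. undetected failures, degraded modes), but the steady-state balance equations are solved the same way.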

  1. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    Science.gov (United States)

    Chin, Jeffrey C.; Csank, Jeffrey T.

    2016-01-01

    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on transient engine performance earlier in the design cycle.

  2. Systems Prototyping with Fourth Generation Tools.

    Science.gov (United States)

    Sholtys, Phyllis

    1983-01-01

    The development of information systems using an engineering approach that combines traditional programming techniques with fourth-generation software tools is described. Fourth-generation application tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)

  3. ISHM Decision Analysis Tool: Operations Concept

    Science.gov (United States)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or the crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents to the crew relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  4. General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Overview: GMAT is a feature-rich system containing high-fidelity space system models, optimization and targeting, built-in scripting and programming infrastructure,...

  5. Design of Fault Analysis and Diagnosis System in NC Machine Tool

    Institute of Scientific and Technical Information of China (English)

    欧敏

    2013-01-01

    Typical faults of CNC machine tools were carefully analysed and studied to identify the relationships between these faults and certain signal characteristics. On this basis, a fault diagnosis system was designed, and software and hardware design schemes for the diagnostic system are presented, enabling faults to be effectively predicted and diagnosed. In practical operation, the system supports effective maintenance of CNC machine tools and ensures their normal operation.

  6. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  7. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  8. Geographical information system (GIS) as a new tool to evaluate epidemiology based on spatial analysis and clinical outcomes in acromegaly.

    Science.gov (United States)

    Naves, Luciana Ansaneli; Porto, Lara Benigno; Rosa, João Willy Corrêa; Casulari, Luiz Augusto; Rosa, José Wilson Corrêa

    2015-02-01

    Geographical information systems (GIS) have emerged as a group of innovative software components useful for projects in epidemiology and planning in the Health Care System. This is an original study investigating environmental and geographical influences on the epidemiology of acromegaly in Brazil. We aimed to validate a method to link an acromegaly registry with a GIS mapping program, to describe the spatial distribution of patients, to identify disease clusters, and to evaluate whether access to health care could influence the outcome of the disease. Clinical data from 112 consecutive patients were collected, and home addresses were plotted in the GIS software for spatial analysis. The buffer spatial distribution of patients living in Brasilia showed that 38.1% lived from 0.33 to 8.66 km, 17.7% from 8.67 to 18.06 km, 22.2% from 18.07 to 25.67 km and 22% from 25.68 to 36.70 km away from the Reference Medical Center (RMC), and no unexpected clusters were identified. Migration of 26 patients from 11 other cities in different regions of the country was observed. Most patients (64%) with adenomas bigger than 25 mm lived more than 20 km away from the RMC, but no significant correlation was found between the distance from a patient's home to the RMC and tumor diameter (r = 0.45, p = 0.20) or delay in diagnosis (r = 0.43, p = 0.30). The geographical distribution of diagnosed cases did not affect the latency of diagnosis or tumor size, but the recognition of significant migration indicates that improvements in the medical assistance network are needed.
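The distance and correlation computations behind such a spatial analysis can be sketched as follows. The coordinates and helper names are illustrative assumptions, not the study's code:

```python
# Illustrative sketch: patient-to-center great-circle distances and a
# Pearson correlation between distance and an outcome variable, the kind
# of spatial statistic a GIS-based epidemiology study reports.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical reference center (approx. Brasilia) and two patient homes
center = (-15.79, -47.88)
patients = [(-15.83, -47.95), (-15.65, -47.70)]
dists = [haversine_km(lat, lon, *center) for lat, lon in patients]
print([round(d, 1) for d in dists])
```

A GIS package would add the buffer rings and cluster tests, but the underlying distance arithmetic is this simple.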

  9. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective, we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently, the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed-effects models, the standard statistical model type used in this area. By interfacing these two formats, the entire domain can be covered by complementary standards and, consequently, the corresponding tools.

  10. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.

  11. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    Directory of Open Access Journals (Sweden)

    Tilton Susan C

    2012-11-01

    Full Text Available Abstract Background: MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein-coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for the analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results: BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets, with example retrievals for zebrafish, including pathway annotation and mapping to human orthologs. Functional analysis of differentially regulated (p Conclusions: BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis-generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single

  12. Marine Machinery Systems - Tools and Architecture

    OpenAIRE

    Sandbakken, Egil Christoffer

    2010-01-01

    The thesis presents tools and architecture regarding the design of marine MSs in OSVs. It highlights important aspects of the design based on a research study, and proposes a design methodology consisting of tools and architecture. From the research studies in chapter 2 it becomes clear that the most common propulsion system today for platform supply vessels (PSVs) is the diesel-electric (DEL) propulsion system. Other concepts include dual-fuel engines, Voith Schneider Propellers (VSP), hyb...

  13. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. 
The current state of microwear analysis is discussed, as are future directions in the study of microwear on stone and bone tools.

  14. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions, for a given system. (author) 43 refs., 8 figs., 8 tabs.

  15. The second iteration of the Systems Prioritization Method: A systems prioritization and decision-aiding tool for the Waste Isolation Pilot Plant: Volume 3, Analysis for final programmatic recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Prindle, N.H.; Boak, D.M.; Weiner, R.F. [and others

    1996-05-01

    Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories for the US DOE Carlsbad Area Office (DOE/CAO). This tool provides an analytical basis for programmatic decision making for the Waste Isolation Pilot Plant (WIPP). SPM integrates decision-analysis techniques, performance- and risk-assessment tools, and advanced information technology. Potential outcomes of proposed activities and combinations of activities are used to calculate a probability of demonstrating compliance (PDC) with selected regulations. The results are presented in a decision matrix showing cost, duration, and maximum PDC for all activities in a given cost and duration category. This is the third and final volume in the series, presenting the analysis for final programmatic recommendations.

  16. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  17. Geographical information system (GIS) as a new tool to evaluate epidemiology based on spatial analysis and clinical outcomes in acromegaly

    OpenAIRE

    Naves, Luciana Ansaneli; Porto, Lara Benigno; Rosa, João Willy Corrêa; Casulari, Luiz Augusto; Rosa, José Wilson Corrêa

    2013-01-01

    Geographical information systems (GIS) have emerged as a group of innovative software components useful for projects in epidemiology and planning in Health Care System. This is an original study to investigate environmental and geographical influences on epidemiology of acromegaly in Brazil. We aimed to validate a method to link an acromegaly registry with a GIS mapping program, to describe the spatial distribution of patients, to identify disease clusters and to evaluate if the access to Hea...

  18. Systems Thinking Tools for Improving Evidence-Based Practice: A Cross-Case Analysis of Two High School Leadership Teams

    Science.gov (United States)

    Kensler, Lisa A. W.; Reames, Ellen; Murray, John; Patrick, Lynne

    2012-01-01

    Teachers and administrators have access to large volumes of data but research suggests that they lack the skills to use data effectively for continuous school improvement. This study involved a cross-case analysis of two high school leadership teams' early stages of evidence-based practice development; differing forms of external support were…

  19. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to the mathematical modeling of physical engineering systems. In this paper, connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  20. Harnessing VLSI System Design with EDA Tools

    CERN Document Server

    Kamat, Rajanish K; Gaikwad, Pawan K; Guhilot, Hansraj

    2012-01-01

    This book explores various dimensions of EDA technologies for achieving different goals in VLSI system design. Although the scope of EDA is very broad and comprises diversified hardware and software tools to accomplish different phases of VLSI system design, such as design, layout, simulation, testability, prototyping and implementation, this book focuses only on demystifying the code, a.k.a. firmware development and its implementation with FPGAs. Since there are a variety of languages for system design, this book covers various issues related to VHDL, Verilog and System C synergized with EDA tools, using a variety of case studies such as testability, verification and power consumption. * Covers aspects of VHDL, Verilog and Handel C in one text; * Enables designers to judge the appropriateness of each EDA tool for relevant applications; * Omits discussion of design platforms and focuses on design case studies; * Uses design case studies from diversified application domains such as network on chip, hospital on...

  1. Interval analysis on non-linear monotonic systems as an efficient tool to optimise fresh food packaging

    OpenAIRE

    Destercke, Sebastien; Guillard, Valérie

    2011-01-01

    When few data or information are available, the validity of studies performing uncertainty analysis or robust design optimisation (i.e., parameter optimisation under uncertainty) with a probabilistic approach is questionable. This is particularly true in some agronomical fields, where parameter and variable uncertainties are often quantified by a handful of measurements or by expert opinions. In this paper, we propose a simple alternative approach based on interval ana...
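The core trick such interval approaches exploit can be shown in a few lines. This is a generic sketch under assumed details (the toy flux model and the parameter bounds are invented, not taken from the paper): for a function monotonic in each input, exact output bounds over interval inputs are obtained by evaluating only the corners of the input box.

```python
# Interval propagation through a componentwise-monotonic function:
# the extrema over a box of inputs occur at the 2^n corner combinations
# of the input bounds, so no probability distributions are needed.
from itertools import product

def interval_range(f, bounds):
    """bounds: list of (lo, hi) pairs, one per argument.
    Returns (min, max) of f over the box; exact when f is monotonic
    in each argument separately."""
    corner_values = [f(*corner) for corner in product(*bounds)]
    return min(corner_values), max(corner_values)

# Toy permeation-like model: flux rises with permeability p and falls
# with film thickness e (illustrative, not the paper's actual model).
flux = lambda p, e: p / e
lo, hi = interval_range(flux, [(1.0, 2.0), (0.5, 1.0)])
print(lo, hi)   # guaranteed bounds on the flux under the uncertainty
```

With only a handful of measurements to set the bounds, this gives guaranteed enclosures where a probabilistic analysis would have to assume distributions it cannot justify.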

  2. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
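The least-squares machinery the book builds on can be shown in miniature. A minimal sketch of simple linear regression using the standard closed-form estimates (the data points are invented for illustration):

```python
# Simple linear regression by ordinary least squares:
# slope = Sxy / Sxx, intercept = y_bar - slope * x_bar.
def fit_line(xs, ys):
    """Return (intercept, slope) minimising the sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return my - slope * mx, slope

b0, b1 = fit_line([1, 2, 3, 4], [3, 5, 7, 9])   # data lie exactly on y = 1 + 2x
print(b0, b1)   # -> 1.0 2.0
```

Multiple regression generalises the same idea to matrix form, which is where most of the book's statistical machinery (standard errors, diagnostics) lives.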

  3. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  4. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  5. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    OpenAIRE

    Mihaela-Laura IVAN; Mircea Raducu TRIFU; Manole VELICANU; Cristian CIUREA

    2016-01-01

    The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population growth and many other aspects. Business Intelligence solutions applied for the current analysis demonstrate the benefits brought by new tools such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. In deta...

  6. A computer aided engineering tool for ECLS systems

    Science.gov (United States)

    Bangham, Michal E.; Reuter, James L.

    1987-01-01

    The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analyst interface is graphics-based and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.

  7. Cutting tool form compensation system and method

    Science.gov (United States)

    Barkman, W.E.; Babelay, E.F. Jr.; Klages, E.J.

    1993-10-19

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed. 9 figures.

  8. Cutting tool form compensation system and method

    Science.gov (United States)

    Barkman, William E.; Babelay, Jr., Edwin F.; Klages, Edward J.

    1993-01-01

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed.

  9. [SIGAPS, a tool for the analysis of scientific publications].

    Science.gov (United States)

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. It is based on the analysis of articles indexed in Medline and is calculated by taking into account the place of the author and the ranking of the journal according to the disciplinary field. It also offers tools for the bibliometric analysis of scientific production.

  10. MESS (Multi-purpose Exoplanet Simulation System): A Monte Carlo tool for the statistical analysis and prediction of exoplanets search results

    CERN Document Server

    Bonavita, M; Desidera, S; Gratton, R; Janson, M; Beuzit, J L; Kasper, M; Mordasini, C

    2011-01-01

    The high number of planet discoveries made in recent years provides a good sample for statistical analysis, leading to some clues on the distributions of planet parameters, such as masses and periods, at least in close proximity to the host star. We will likely need to wait for the extremely large telescopes (ELTs) to have an overall view of extrasolar planetary systems. In this context it would be useful to have a tool that can be used for the interpretation of present results, and also to predict the outcomes of future instruments. For this reason we built MESS: a Monte Carlo simulation code which uses either the results of the statistical analysis of the properties of discovered planets, or the results of planet formation theories, to build synthetic planet populations fully described in terms of frequency, orbital elements and physical properties. These can then be used either to test the consistency of their properties with the observed population of planets given different detectio...
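The Monte Carlo idea described can be sketched generically. All distributions, limits, and names below are invented placeholders for illustration, not MESS internals: draw a synthetic planet population from assumed mass and period distributions, then ask what fraction a given instrument could detect.

```python
# Generic Monte Carlo population sketch: sample a synthetic planet
# population, then apply a toy detection threshold to estimate the
# detectable fraction. Distributions and limits are illustrative only.
import random

random.seed(42)  # reproducible draws

def synthetic_population(n):
    """Return n (mass, period) pairs from toy distributions."""
    planets = []
    for _ in range(n):
        mass = (1.0 - random.random()) ** -0.5   # toy power law, mass >= 1 (M_Jup)
        period = 10 ** random.uniform(0, 4)      # log-uniform period, 1-10^4 days
        planets.append((mass, period))
    return planets

def detectable(planet, mass_limit=2.0, period_limit=3000.0):
    """Toy instrument threshold: massive enough, short enough period."""
    m, p = planet
    return m >= mass_limit and p <= period_limit

pop = synthetic_population(10_000)
frac = sum(detectable(pl) for pl in pop) / len(pop)
print(f"detectable fraction: {frac:.3f}")
```

Swapping the toy distributions for fitted occurrence rates or formation-theory outputs, and the threshold for a real instrument's detection limits, gives the kind of yield prediction the abstract describes.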

  11. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh;

    , called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  12. Two energy system analysis - cases

    DEFF Research Database (Denmark)

    Lund, Henrik; Antonoff, Jayson; Andersen, Anders N.

    2004-01-01

    The chapter presents two cases of energy system analysis, illustrating the types of tools and methodologies presently being used for these studies in Denmark and elsewhere.......The chapter presents two cases of energy system analysis, illustrating the types of tools and methodologies presently being used for these studies in Denmark and elsewhere....

  13. Design tools for complex dynamic security systems.

    Energy Technology Data Exchange (ETDEWEB)

    Byrne, Raymond Harry; Rigdon, James Brian; Rohrer, Brandon Robinson; Laguna, Glenn A.; Robinett, Rush D. III (.; ); Groom, Kenneth Neal; Wilson, David Gerald; Bickerstaff, Robert J.; Harrington, John J.

    2007-01-01

    The development of tools for complex dynamic security systems is not a straightforward engineering task but, rather, a scientific task where discovery of new scientific principles and mathematics is necessary. For years, scientists have observed complex behavior but have had difficulty understanding it. Prominent examples include insect colony organization, the stock market, molecular interactions, fractals, and emergent behavior. Engineering such systems will be an even greater challenge. This report explores four tools for engineered complex dynamic security systems: Partially Observable Markov Decision Processes, Percolation Theory, Graph Theory, and Exergy/Entropy Theory. Additionally, enabling hardware technologies for next-generation security systems are described: a 100-node wireless sensor network, an unmanned ground vehicle, and an unmanned aerial vehicle.

  14. Decision Making Support in Wastewater Management: Comparative analysis of techniques and tools used in centralized and decentralized system layouts UDK 628.2

    Directory of Open Access Journals (Sweden)

    Harmony Musiyarira

    2012-02-01

    Full Text Available Wastewater management has been seen primarily as a technical and economic issue, but it is now recognised that these are only some of the elements in an array of factors that affect the sustainability of wastewater systems. Literature studies point out that municipal authorities have a general and long-standing tradition of using indicators in monitoring performance, reviewing progress and reporting the state of the environment as part of regulatory compliance. However, they have neglected other critical uses of these indicators, such as their input into the planning and decision-making process. This research advocates the use of sustainability indicators in a context-based planning approach and the utilisation of Multi-Criteria Decision Aid (MCDA) in a two-step approach for comparative analysis and assessment of the sustainability of wastewater systems. The overall objective was to develop a methodology for wastewater system selection and to produce a practical planning tool to aid decision making in municipalities. Another objective was to provide recommendations for wastewater and sanitation management improvement in the case study area. The methodology consisted of a comprehensive literature review, case study analysis, a review of the Decision Support Systems (DSS) in use and the development of a DSS for Gauteng Province. The full spectrum of viable wastewater and sanitation options was incorporated into the DSS. The sustainability assessments carried out using multi-criteria decision analysis showed that varying degrees of sustainability are obtainable with each treatment technology and that decentralised technologies appear more sustainable. Based on the local context and indicators used in this research, the DSS results suggest that land treatment systems, stabilisation ponds and ecological treatment methods are more sustainable. One major finding from the literature is that no technology is
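    At its core, the ranking step of an MCDA assessment like the one described above typically reduces to a weighted aggregation of normalised criterion scores. A minimal Python sketch of that step; the criteria, weights, options, and scores below are invented for illustration and are not taken from the study:

```python
def weighted_sum(scores, weights):
    """Aggregate normalised criterion scores (0-1) into one sustainability index."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[c] * weights[c] for c in weights)

# Hypothetical criteria weights and treatment options
weights = {"cost": 0.3, "environment": 0.4, "social": 0.3}
options = {
    "stabilisation_pond": {"cost": 0.8, "environment": 0.7, "social": 0.6},
    "activated_sludge": {"cost": 0.5, "environment": 0.5, "social": 0.7},
}

# Rank options by aggregate score, best first
ranked = sorted(options, key=lambda o: weighted_sum(options[o], weights), reverse=True)
```

Real MCDA methods (outranking, ELECTRE, AHP and the like) go well beyond a weighted sum, but the weighted sum is the simplest member of the family and shows where the criterion weights enter.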

  15. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
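    The comparison step the abstract describes, measured performance checked against expected performance with significant differences highlighted, can be illustrated in a few lines. The measurement names, values, and tolerance below are hypothetical, not from the tool:

```python
def flag_deviations(measured, expected, tolerance):
    """Return the points where measured performance deviates from the
    expected value by more than the tolerance (absolute units)."""
    return {
        point: (value, expected[point])
        for point, value in measured.items()
        if abs(value - expected[point]) > tolerance
    }

# Hypothetical air-handling-unit test readings [deg C]
measured = {"t_supply": 16.8, "t_mixed": 22.1, "t_return": 24.0}
expected = {"t_supply": 13.0, "t_mixed": 22.0, "t_return": 24.2}

failures = flag_deviations(measured, expected, tolerance=1.0)
```

Here only the supply-air temperature would be flagged, pointing the commissioning agent at a likely cooling-coil or valve problem.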

  16. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    Science.gov (United States)

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  17. A tool for subjective analysis of TTOs

    OpenAIRE

    Resende, David Nunes; Gibson, David V.; Jarrett, James

    2011-01-01

    The objective of this article is to present a proposal (working paper) for a quantitative analysis tool to help technology transfer offices (TTOs) improve their structures, processes and procedures. Our research started from the study of internal practices and structures that facilitate the interaction between R&D institutions, their TTOs and regional surroundings. We wanted to identify “bottlenecks” in those processes, procedures, and structures. We mapped the bottlenecks in a set of “...

  18. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  19. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  20. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of significant risk groups associated with logistics processes implementation; composition of integrated strategies of risk management; and composition of tools for risk analysis in logistics processes.

  1. A Collaborative Analysis Tool for Integrated Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Science.gov (United States)

    Stanley, Thomas Troy; Alexander, Reginald; Landrum, Brian

    2000-01-01

    the process may be repeated altering the trajectory or some other input to reduce the TPS mass. E-PSURBCC is an "engine performance" model and requires the specification of inlet air static temperature and pressure as well as Mach number (which it pulls from the HYFIM and POST trajectory files), and calculates the corresponding stagnation properties. The engine air flow path geometry includes inlet, a constant area section where the rocket is positioned, a subsonic diffuser, a constant area afterburner, and either a converging nozzle or a converging-diverging nozzle. The current capabilities of E-PSURBCC ejector and ramjet mode treatment indicated that various complex flow phenomena including multiple choking and internal shocks can occur for combinations of geometry/flow conditions. For a given input deck defining geometry/flow conditions, the program first goes through a series of checks to establish whether the input parameters are sound in terms of a solution path. If the vehicle/engine performance fails mission goals, the engineer is able to collaboratively alter the vehicle moldline to change aerodynamics, or trajectory, or some other input to achieve orbit. The problem described is an example of the need for collaborative design and analysis. RECIPE is a cross-platform application capable of hosting a number of engineers and designers across the Internet for distributed and collaborative engineering environments. Such integrated system design environments allow for collaborative team design analysis for performing individual or reduced team studies. To facilitate the larger number of potential runs that may need to be made, RECIPE connects the computer codes that calculate the trajectory data, aerodynamic data based on vehicle geometry, heat rate data, TPS masses, and vehicle and engine performance, so that the output from each tool is easily transferred to the model input files that need it.

  2. DNA – A General Energy System Simulation Tool

    DEFF Research Database (Denmark)

    Elmegaard, Brian; Houbak, Niels

    2005-01-01

    The paper reviews the development of the energy system simulation tool DNA (Dynamic Network Analysis). DNA has been developed since 1989 to be able to handle models of any kind of energy system based on the control volume approach, usually systems of lumped parameter components. DNA has proven to be a useful tool in the analysis and optimization of several types of thermal systems: steam turbines, gas turbines, fuel cells, gasification, refrigeration and heat pumps for both conventional fossil fuels and different types of biomass. DNA is applicable for models of both steady state and dynamic operation. The program decides at runtime to apply the DAE solver if the system contains differential equations. This makes it easy to extend an existing steady state model to simulate dynamic operation of the plant. The use of the program is illustrated by examples of gas turbine models. The paper also...
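    The control-volume idea behind simulators of this kind can be sketched as follows: each component contributes residual equations (mass and energy balances, specifications), and a Newton iteration drives the residuals to zero. This is not DNA itself; the burner model and every number below are invented for illustration:

```python
def residuals(x):
    """Hypothetical burner control volume: fuel LHV 50 MJ/kg, air flow 1 kg/s,
    cp 1.1 kJ/(kg K), inlet at 300 K. All values are made up."""
    m_fuel, t_out = x
    r1 = m_fuel * 50_000.0 - 1.0 * 1.1 * (t_out - 300.0)  # energy balance [kW]
    r2 = t_out - 1500.0                                   # outlet temperature spec
    return [r1, r2]

def newton2(f, x, iters=50, h=1e-6):
    """Newton's method for a 2-equation system with a finite-difference Jacobian."""
    for _ in range(iters):
        fx = f(x)
        # Finite-difference Jacobian J[i][k] = d f_i / d x_k
        J = [[(f([x[j] + (h if j == k else 0.0) for j in range(2)])[i] - fx[i]) / h
              for k in range(2)] for i in range(2)]
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        # Solve J dx = -fx by Cramer's rule
        dx0 = (-fx[0] * J[1][1] + fx[1] * J[0][1]) / det
        dx1 = (-fx[1] * J[0][0] + fx[0] * J[1][0]) / det
        x = [x[0] + dx0, x[1] + dx1]
    return x

m_fuel, t_out = newton2(residuals, [0.01, 400.0])
```

A dynamic extension, as the abstract notes for DNA, would add time derivatives to some residuals and switch to a DAE solver; the steady-state equations stay unchanged.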

  3. Waste flow analysis and life cycle assessment of integrated waste management systems as planning tools: Application to optimise the system of the City of Bologna.

    Science.gov (United States)

    Tunesi, Simonetta; Baroni, Sergio; Boarini, Sandro

    2016-09-01

    The results of this case study are used to argue that waste management planning should follow a detailed process, adequately confronting the complexity of the waste management problems and the specificity of each urban area and of regional/national situations. To support the development or completion of integrated waste management systems, this article proposes a planning method based on: (1) the detailed analysis of waste flows and (2) the application of a life cycle assessment to compare alternative scenarios and optimise solutions. The evolution of the City of Bologna waste management system is used to show how this approach can be applied to assess which elements improve environmental performance. The assessment of the contribution of each waste management phase in the Bologna integrated waste management system has proven that the changes applied from 2013 to 2017 result in a significant improvement of the environmental performance, mainly as a consequence of the optimised integration between materials and energy recovery: Global Warming Potential at 100 years (GWP100) diminishes from 21,949 to -11,169 t CO2-eq y(-1) and abiotic resources depletion from -403 to -520 t antimony-eq. y(-1). This study analyses in great detail the collection phase. Outcomes provide specific operational recommendations to policy makers, showing the: (a) relevance of the choice of the materials forming the bags for 'door to door' collection (for non-recycled low-density polyethylene bags 22 kg CO2-eq (tonne of waste)(-1)); (b) relatively low environmental impacts associated with underground tanks (3.9 kg CO2-eq (tonne of waste)(-1)); (c) relatively low impact of big street containers with respect to plastic bags (2.6 kg CO2-eq. (tonne of waste)(-1)). PMID:27170193
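    The per-tonne collection figures quoted in the abstract scale linearly to annual totals, which is how collection-phase options are compared in an assessment of this kind. A small sketch using the abstract's per-tonne values; the annual tonnage is invented:

```python
# kg CO2-eq per tonne of waste collected, as reported in the abstract
collection_impacts = {
    "ldpe_bags": 22.0,          # non-recycled low-density polyethylene bags
    "underground_tanks": 3.9,
    "street_containers": 2.6,
}

def annual_gwp_tonnes(option, tonnes_collected):
    """Annual GWP contribution of a collection option, in t CO2-eq."""
    return collection_impacts[option] * tonnes_collected / 1000.0

# e.g. 50,000 t/y collected door-to-door in LDPE bags (hypothetical tonnage)
gwp_bags = annual_gwp_tonnes("ldpe_bags", 50_000)
gwp_containers = annual_gwp_tonnes("street_containers", 50_000)
```

At that tonnage the bag choice alone accounts for roughly a kilotonne of CO2-eq per year, which is why the authors single out bag material as an operational lever.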

  5. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  7. Spatial Analysis in Educational Administration: Exploring the Role of G.I.S. (Geographical Information Systems) as an Evaluative Tool in the Public School Board Setting.

    Science.gov (United States)

    Brown, Robert S.; Baird, William; Rosolen, Lisa

    In January 1998, seven school boards amalgamated to form the Toronto District School Board, a board responsible for 600 schools. To deal with the complexities of the new entity, researchers have been using geographical information systems (GIS). GIS are computer-based tools for mapping. They store information as a collection of thematic layers or…

  8. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error, and do not permit computation of complex spatial metrics. We have developed a light-weight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
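    Two of the metrics named above, vascular density and the branching index, are simple ratios once the segmentation counts are in hand. A sketch with made-up measurements (the hard part AngioTool automates is the segmentation itself, not this arithmetic):

```python
def vascular_density(vessel_area, total_area):
    """Fraction of the imaged area covered by vessels."""
    return vessel_area / total_area

def branching_index(branch_points, total_area):
    """Branch points per unit area (the 'branching index' in AngioTool's sense)."""
    return branch_points / total_area

# Hypothetical counts from a segmented retina image (areas in mm^2)
density = vascular_density(vessel_area=2.4, total_area=8.0)
index = branching_index(branch_points=96, total_area=8.0)   # per mm^2
```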

  9. Organisational Self - Evaluation as a Possible Tool of Organisational Analysis

    OpenAIRE

    Mariann Veresné Somosi

    2004-01-01

    The key to the enduring success of companies and institutes is the ability to recognise new challenges in good time and to react to them quickly and flexibly. In cases of complex and complicated organisational forming, however, management often does not have the appropriate tools and methodological knowledge to map critical fields. During this presentation, I examine one of the possible systems of goals and fields of organisational analysis with the help of the organisational analysis proc...

  10. DFTCalc: a tool for efficient fault tree analysis (extended version)

    OpenAIRE

    Arnold, Florian; Belinfante, Axel; Berg, de, MT Mark; Guck, Dennis; Stoelinga, Mariëlle

    2013-01-01

    Effective risk management is a key to ensure that our nuclear power plants, medical equipment, and power grids are dependable; and is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ana...
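    The core computation in fault tree analysis, the failure probability at the top of a tree of AND/OR gates over independent basic events, fits in a few lines. The tree and probabilities below are invented, and DFTCalc itself handles far richer dynamic gates (spares, functional dependencies) that a static evaluation like this cannot express:

```python
def evaluate(node, p_basic):
    """Recursively evaluate the failure probability of a static fault tree node.

    A node is either a basic-event name, or a tuple (gate, children) with
    gate in {'AND', 'OR'}; basic events are assumed independent."""
    if isinstance(node, str):
        return p_basic[node]
    gate, children = node
    probs = [evaluate(child, p_basic) for child in children]
    if gate == "AND":                     # all children must fail
        result = 1.0
        for p in probs:
            result *= p
        return result
    # OR: at least one child fails
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

# Hypothetical system: it fails if both pumps fail, or the controller fails
tree = ("OR", [("AND", ["pump_a", "pump_b"]), "controller"])
p_top = evaluate(tree, {"pump_a": 0.1, "pump_b": 0.1, "controller": 0.01})
```

Here the redundant pump pair contributes 0.01 and the single-point controller 0.01, giving a top-event probability just under 0.02.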

  11. DFTCalc: a tool for efficient fault tree analysis

    OpenAIRE

    Arnold F.; Belinfante A.; Van Der Berg F.; Guck D.; Stoelinga M.

    2013-01-01

    Effective risk management is a key to ensure that our nuclear power plants, medical equipment, and power grids are dependable; and it is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ...

  12. Performance Analysis of Anti-Phishing Tools and Study of Classification Data Mining Algorithms for a Novel Anti-Phishing System

    Directory of Open Access Journals (Sweden)

    Rajendra Gupta

    2015-11-01

    Full Text Available The term phishing refers to a kind of spoofed website used for stealing sensitive and important information from web users, such as online banking passwords and credit card information. In a phishing attack, the attacker generates warning messages to the user about security issues, asks for confidential information through phishing emails, asks the user to update account information, etc. Several experimental designs have been proposed earlier to counter phishing attacks. The earlier systems do not give more than 90 percent successful results; in some cases, a tool gives only 50-60 percent successful results. In this paper, a novel algorithm is developed to check the performance of the anti-phishing system, and the resulting data set is compared with the data sets of existing anti-phishing tools. The performance of the novel anti-phishing system is studied with four different classification data mining algorithms, namely Class Imbalance Problem (CIP), Rule-based Classifier (Sequential Covering Algorithm, SCA), Nearest Neighbour Classification (NNC) and Bayesian Classifier (BC), on a data set of phishing and legitimate websites. The proposed system shows a lower error rate and better performance compared to other existing tools.
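    Of the classifiers compared, nearest-neighbour classification is the easiest to sketch: a site is labelled like its closest labelled example in feature space. The features below (normalised URL length, presence of an '@' symbol, normalised dot count) and the training points are invented for illustration; the paper's actual feature set is not reproduced here:

```python
def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def nearest_neighbour(training, features):
    """1-NN: return the label of the closest training example."""
    label, _ = min(
        ((lbl, euclidean(vec, features)) for vec, lbl in training),
        key=lambda pair: pair[1],
    )
    return label

# (normalised URL length, has '@' symbol, normalised dot count) -> label
training = [
    ((0.9, 1.0, 0.8), "phishing"),
    ((0.8, 0.0, 0.9), "phishing"),
    ((0.2, 0.0, 0.3), "legitimate"),
    ((0.3, 0.0, 0.2), "legitimate"),
]
verdict = nearest_neighbour(training, (0.85, 1.0, 0.7))
```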

  13. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
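    The forward query FEAT answers, "what happens to the system if this set of failure events occurs?", amounts to reachability over the digraph's cause-effect edges. A minimal sketch; the model below is invented, and FEAT's digraphs also encode redundancy (e.g. a node failing only when all its inputs fail), which a plain reachability pass does not capture:

```python
from collections import deque

def propagate(digraph, initial_failures):
    """Return every node that fails when the initial failures occur.

    digraph maps each node to the list of nodes whose failure it
    directly causes; propagation is a breadth-first traversal."""
    failed = set(initial_failures)
    queue = deque(initial_failures)
    while queue:
        node = queue.popleft()
        for downstream in digraph.get(node, []):
            if downstream not in failed:
                failed.add(downstream)
                queue.append(downstream)
    return failed

# Hypothetical spacecraft power digraph
digraph = {
    "battery": ["power_bus"],
    "power_bus": ["flight_computer", "radio"],
    "flight_computer": ["attitude_control"],
}
effects = propagate(digraph, {"battery"})
```

The reverse query (possible causes of an observed failure set) is the same traversal over the transposed digraph.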

  14. 3rd Annual Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Face-to-Face Meeting Report December 2013

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-02-21

    The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access and analysis to observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.

  15. Comparative guide to emerging diagnostic tools for large commercial HVAC systems

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, Hannah; Piette, Mary Ann

    2001-05-01

    This guide compares emerging diagnostic software tools that aid detection and diagnosis of operational problems for large HVAC systems. We have evaluated six tools for use with energy management control system (EMCS) or other monitoring data. The diagnostic tools summarize relevant performance metrics, display plots for manual analysis, and perform automated diagnostic procedures. Our comparative analysis presents nine summary tables with supporting explanatory text and includes sample diagnostic screens for each tool.

  16. Tool, weapon, or white elephant? A realist analysis of the five phases of a twenty-year programme of occupational health information system implementation in the health sector

    Directory of Open Access Journals (Sweden)

    Spiegel Jerry M

    2012-08-01

    Full Text Available Abstract Background Although information systems (IS) have been extensively applied in the health sector worldwide, few initiatives have addressed the health and safety of health workers, a group acknowledged to be at high risk of injury and illness, as well as in great shortage globally, particularly in low and middle-income countries. Methods Adapting a context-mechanism-outcome case study design, we analyze our team's own experience over two decades to address this gap in two different Canadian provinces and in two distinct South African settings. Applying a realist analysis within an adapted structuration theory framing sensitive to power relations, we explore contextual (socio-political and technological) characteristics and mechanisms affecting outcomes at micro, meso and macro levels. Results Technological limitations hindered IS usefulness in the initial Canadian locale, while staffing inadequacies amid pronounced power imbalances affecting governance restricted IS usefulness in the subsequent Canadian application. Implementation in South Africa highlighted the special care needed to address power dynamics regarding both worker-employer relations (relevant to all occupational health settings) and North-south imbalances (common to all international interactions). Researchers, managers and front-line workers all view IS implementation differently; relationships amongst the workplace parties and between community and academic partners have been pivotal in determining outcomes in all circumstances. Capacity building and applying creative commons and open source solutions are showing promise, as is international collaboration. Conclusions There is worldwide consensus on the need for IS use to protect the health workforce. However, IS implementation is a resource-intensive undertaking; regardless of how carefully designed the software, contextual factors and the mechanisms adopted to address these are critical to mitigate threats and achieve

  17. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte Carlo simulations (MC). Several tools are considered: a ROOT file filter, a JavaScript multivariable cross-filter, a JavaScript ROOT browser and JavaScript scatter-matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  18. DEVELOPING NEW TOOLS FOR POLICY ANALYSIS

    International Nuclear Information System (INIS)

    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U.S. Department of Energy (DOE) regulations and directives related to safeguards and security (S and S). An initial effort focused on areas where specific improvements could be made; this revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire S and S directives set will be conducted using software tools to analyze the present directives with a view toward (1) identifying areas of positive synergism among topical areas, (2) identifying areas of unnecessary duplication within and among topical areas, and (3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

  19. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users, with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
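    The heat-leak bookkeeping a tool like PFSAT performs can be illustrated with one-dimensional conduction summed over parasitic paths (supports, instrumentation penetrations), Q = kA(T_hot - T_cold)/L per path. Every geometry and conductivity value below is invented, and PFSAT itself uses real-fluid properties from REFPROP rather than this simplification:

```python
def conduction_leak(k, area, length, t_hot, t_cold):
    """Steady 1-D conduction heat leak [W]: Q = k * A * (t_hot - t_cold) / L."""
    return k * area * (t_hot - t_cold) / length

# Hypothetical cryogenic feed line at 90 K in a 300 K bay:
# (name, conductivity W/(m K), cross-section m^2, conduction length m)
paths = [
    ("g10_support", 0.6, 1.0e-4, 0.05),
    ("ss_instrument_tube", 15.0, 3.0e-6, 0.30),
]

total_leak_w = sum(
    conduction_leak(k, area, length, 300.0, 90.0)
    for _, k, area, length in paths
)
```

A TVS sizing step would then pick a vent orifice whose refrigeration effect at least matches this total leak.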

  20. Application Analysis of Live Line Tool Automatic Management System

    Institute of Scientific and Technical Information of China (English)

    曹国文; 蒋标

    2015-01-01

    In view of the complicated procedures, long turnaround time and low work efficiency involved in issuing and returning tools at the live line tool storehouse of the Bayannur Electric Power Bureau, an RFID-based live line tool automatic management system was adopted. The system simplifies the check-out and check-in procedures: staff simply carry tools fitted with radio-frequency tags through doors equipped with radio-frequency readers. The system records the staff member, the time, and the names and quantities of the tools taken out or brought back, and automatically uploads and stores this information on the storehouse computer. Staff can query check-out, return and inventory information on that computer, or current tool information from office computers over the private network, with no manual registration required. This not only saves working time and improves work efficiency, but also ensures the safe use of the tools.
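The record-keeping such an RFID portal performs can be sketched as a small event log keyed by tag reads: each read at the door toggles a tool between checked-out and on-shelf. Everything below (class name, tag IDs, tool names) is invented for illustration, not taken from the actual system.

```python
# Toy model of RFID portal record-keeping for a tool storehouse.
from datetime import datetime

class ToolLog:
    def __init__(self, inventory):
        self.inventory = dict(inventory)   # tag_id -> tool name
        self.checked_out = {}              # tag_id -> (staff, timestamp)
        self.history = []                  # full audit trail

    def portal_read(self, tag_id, staff, when=None):
        """A tag passing the door is either leaving or returning."""
        when = when or datetime.now()
        if tag_id in self.checked_out:     # tool coming back in
            self.history.append(("in", tag_id, staff, when))
            del self.checked_out[tag_id]
        else:                              # tool going out
            self.history.append(("out", tag_id, staff, when))
            self.checked_out[tag_id] = (staff, when)

    def on_shelf(self):
        return sorted(self.inventory[t] for t in self.inventory
                      if t not in self.checked_out)

log = ToolLog({"T1": "insulated gloves", "T2": "hot stick"})
log.portal_read("T1", "staff-07")   # gloves leave the storehouse
print(log.on_shelf())               # -> ['hot stick']
log.portal_read("T1", "staff-07")   # gloves return
print(log.on_shelf())               # -> ['hot stick', 'insulated gloves']
```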

  1. SABRE: A Tool for Stochastic Analysis of Biochemical Reaction Networks

    CERN Document Server

    Didier, Frederic; Mateescu, Maria; Wolf, Verena

    2010-01-01

    The importance of stochasticity within biological systems has been shown repeatedly during the last years and has raised the need for efficient stochastic tools. We present SABRE, a tool for stochastic analysis of biochemical reaction networks. SABRE implements fast adaptive uniformization (FAU), a direct numerical approximation algorithm for computing transient solutions of biochemical reaction networks. Biochemical reaction networks represent biological systems studied at a molecular level, and their reactions can be modeled as transitions of a Markov chain. SABRE accepts as input the formalism of guarded commands, which it interprets either as continuous-time or as discrete-time Markov chains. Besides operating in a stochastic mode, SABRE may also perform a deterministic analysis by directly computing a mean-field approximation of the system under study. We illustrate the different functionalities of SABRE by means of biological case studies.
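The core of plain (non-adaptive) uniformization, the family of algorithms FAU belongs to, fits in a few lines: the CTMC is converted to a DTMC subordinated to a Poisson process, and the transient distribution is accumulated as Poisson-weighted DTMC iterates. The two-state generator below is a made-up toy; SABRE's fast adaptive variant additionally truncates the state space on the fly.

```python
# Plain uniformization for a small CTMC: p(t) = sum_k Pois(k; lam*t) * p0 * P^k.
import math

def uniformize(Q, p0, t, eps=1e-10):
    """Transient distribution p(t) of the CTMC with generator Q, start p0."""
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n)) * 1.0001    # uniformization rate
    # DTMC kernel P = I + Q/lam
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)]
         for i in range(n)]
    v = list(p0)                  # v = p0 * P^k, starting at k = 0
    out = [0.0] * n
    k, poisson, remaining = 0, math.exp(-lam * t), 1.0
    while remaining > eps:        # stop when Poisson tail mass is negligible
        for i in range(n):
            out[i] += poisson * v[i]
        remaining -= poisson
        k += 1
        poisson *= lam * t / k    # next Poisson weight
        v = [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]
    return out

# Hypothetical two-state switch: on <-> off with rates 2 and 1
Q = [[-2.0, 2.0],
     [1.0, -1.0]]
p = uniformize(Q, [1.0, 0.0], t=5.0)
print(p)  # approaches the stationary distribution [1/3, 2/3]
```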

  2. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications is an important issue in high performance computing, and a performance tool, which collects and analyzes performance data, is an effective way of doing so. This paper explores both the collection and the analysis of performance data, and proposes two innovations: both kinds of runtime performance data, concerning system load as well as application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; and the performance analysis should be Web-oriented, exploiting the excellent portability and usability brought by the Internet. The paper presents a Web-oriented HPC (high performance computing) performance tool that collects information about both resource utilization, including the utilization ratio of CPU and memory, and program behavior during runtime, including states such as sending and computing, and visualizes the information in the user's browser window with Java applets, in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependency between components and provides an entry point for task scheduling. With this tool, programmers can monitor the runtime state of the application, analyze the relationship between program progress and system load, find the performance bottleneck, and ultimately improve the performance of the application.

  3. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis, intended mainly for small and medium enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research which identified the problem: companies still struggle with long setup times, yet many do nothing to reduce them, because a long setup alone is not a sufficient trigger for action. To encourage companies to implement SMED, it is essential to analyze changeovers in order to expose the underlying problems. The methodology proposed can genuinely encourage management to decide on a SMED implementation, as was verified in a production company. The setup analysis methodology consists of seven steps. Four of them concern setup analysis in a chosen area of a company, such as a work stand that is a bottleneck with many setups; the goal here is to convince management to begin setup improvement actions. The last three steps are related to a particular setup, where the goal is to reduce the setup time and the risk of problems that can appear during the setup. The methodology draws on tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others.
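The Pareto step of such a setup analysis is easy to sketch: rank changeover activities by the time they consume and isolate the small subset responsible for most of it. The activity names and durations below are invented for illustration.

```python
# Pareto analysis of setup (changeover) activities: find the activities
# that cumulatively account for ~80% of total setup time.
def pareto(activities, threshold=0.8):
    """Return the top activities covering `threshold` of total duration."""
    ranked = sorted(activities.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(activities.values())
    picked, cum = [], 0.0
    for name, minutes in ranked:
        picked.append(name)
        cum += minutes
        if cum / total >= threshold:
            break
    return picked

# Hypothetical changeover broken into timed activities (minutes)
setup = {"fetch tooling": 18, "adjust fixtures": 35, "trial runs": 25,
         "clean machine": 7, "paperwork": 5}
print(pareto(setup))
```

The handful of activities returned are the natural SMED candidates to convert from internal (machine stopped) to external (machine running) setup.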

  4. Sociology and Systems Analysis

    OpenAIRE

    Becker, H.A.

    1982-01-01

    The Management and Technology (MMT) Area of IIASA organizes, from time to time, seminars on topics that are of interest in connection with the work at the Institute. Since MMT sees the importance of investigating the broader management aspects when using systems analytical tools, it was of great interest to have Professor Henk Becker from the University of Utrecht give a seminar on "Sociology of Systems Analysis". As his presentation at this seminar should be of interest to a wider audie...

  5. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology, which is concerned with students' acquisition of culture encoded in symbols and with the way the sign consciousness formed in the context of learning affects their world cognition and interpersonal communication. The article introduces a set of tools that enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy, and with the formation of the younger generation's language picture of the world.

  6. Emulation tool of dynamic systems via internet

    Directory of Open Access Journals (Sweden)

    Daniel Ruiz Olaya

    2015-11-01

    Full Text Available The experimentation laboratories for control system courses can become expensive to acquire, operate and maintain. Remote laboratories have been an alternative resource, but complex systems are not always available. A solution to this problem are remote emulation laboratories. This paper describes the development of a Web application for the emulation of dynamic systems, using a freely distributed rapid-control-prototyping software tool based on Linux/RTAI. The application is aimed especially at experimentation with dynamic systems that are not easily available in a laboratory, where the model has been configured by the user. The design of the front-end and the back-end is presented. The latency times of the real-time operating system were verified, as was the ability of the system to reproduce, from an emulated model, signals similar to those of a real system. As an example, the model of an evaporator was used to test the functionality of the application. One of the advantages of the application is its work methodology, which is based on the development of blocks in Scicos: the user can reuse the parameters and code implemented to build a block in the Scicos toolbox within the Linux/RTAI/ScicosLab environment. Moreover, only a web browser and the Java Virtual Machine are required.

  7. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Directory of Open Access Journals (Sweden)

    Mihaela-Laura IVAN

    2016-05-01

    Full Text Available The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics to improve patient care, hospital administration, population growth monitoring and many other aspects. The Business Intelligence solutions applied in the present analysis demonstrate the benefits brought by the new tools, such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. As a detailed case, the birth rate is analyzed, together with the contribution of different factors worldwide.

  8. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Full text of publication follows: Several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and the authorities about the assessment methodology and the acceptance criteria. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  9. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  10. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  11. Analysis and processing tools for nuclear trade related data

    International Nuclear Information System (INIS)

    This paper describes the development of a system used by the Nuclear Trade Analysis Unit of the Department of Safeguards for handling, processing, analyzing, reporting and storing nuclear trade related data. The data handling and analysis part of the system is already functional, but several additional features are being added to optimize its use. The aim is to develop the system in a manner that actively contributes to the management of the Department's overall knowledge and supports the departmental State evaluation process. Much of the data originates from primary sources and comes in many different formats and languages; it also comes with diverse security needs. The design of the system has to meet the special challenges set by the large volume and the different types of data that need to be handled in a secure and reliable environment. Data is stored in a form appropriate for access and analysis in both structured and unstructured formats. The structured data is entered into a database (knowledge base) called the Procurement Tracking System (PTS). PTS allows effective linking, visualization and analysis of new data together with that already included in the system. The unstructured data is stored in text-searchable folders (information base) equipped with indexing and search capabilities. Several other tools are linked to the system, including a visual analysis tool for structured information and a system for visualizing unstructured data, all of which are designed to help the analyst locate the specific information required amongst a myriad of unrelated information. This paper describes the system's concept, design and evolution, highlighting its special features and capabilities, which include the need to standardize the data collection, entry and analysis processes. All this enables the analyst to approach tasks consistently and in a manner that both enhances teamwork and leads to the development of an institutional memory related to covert trade activities that can be

  12. Discovery and New Frontiers Project Budget Analysis Tool

    Science.gov (United States)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), the mission development and operations profile by phase (percent of total mission cost, and duration), the launch vehicle, and the launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real-year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect on the overall program budget of changes in future mission or launch vehicle costs, of differing development profiles or operational durations of a future mission, or of a replan of a current mission. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.
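The roll-up such a tool performs can be sketched simply: spread each mission's fixed-year cost over its phase profile, inflate each year's share to real-year dollars, and sum across missions per fiscal year. The inflation rate and mission profiles below are hypothetical, not program data.

```python
# Sketch of a multi-mission budget roll-up in real-year dollars.
def real_year_costs(missions, base_year, rate=0.03):
    """missions: list of (total_cost_fixed_year, start_year, phase_fractions).

    Returns {year: inflated cost summed over missions}.
    """
    budget = {}
    for total, start, fractions in missions:
        for offset, frac in enumerate(fractions):
            year = start + offset
            inflated = total * frac * (1 + rate) ** (year - base_year)
            budget[year] = budget.get(year, 0.0) + inflated
    return budget

# Two hypothetical missions ($M, fixed-year) with development-heavy profiles
missions = [
    (500.0, 2025, [0.2, 0.4, 0.3, 0.1]),
    (300.0, 2026, [0.3, 0.5, 0.2]),
]
for year, cost in sorted(real_year_costs(missions, base_year=2025).items()):
    print(year, round(cost, 1))
```

Comparing the per-year totals against the available budget line is then a simple subtraction, which is what makes the "what-if" exploration fast.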

  13. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single- and multiple-point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  14. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  15. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    Directory of Open Access Journals (Sweden)

    Chun-Hsiao Wu

    2016-05-01

    Full Text Available In this research, we propose a new concept for social media analysis called Social Sensor, an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media, with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and the appropriate social sensors for analysis. By adopting parameter templates, one can quickly apply the experience of other experts at the beginning of a new case, or even create one's own templates. We have also modularized the analysis tools into two social sensors: a Language Sensor and a Text Sensor. A user evaluation was conducted, and the results showed that the usefulness, modularity, reusability, and manageability of the system were all rated very positively. The results also show that this tool can greatly reduce the time needed to perform data analysis, solve problems encountered in the traditional analysis process, and obtain useful results. The experimental results reveal that the concept of a social sensor and the proposed system design are useful for big data analysis of social media.

  16. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  17. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened up for implementation of new features in the existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increasing number of tools and their various constructions and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems have been briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.

  18. ANN Based Tool Condition Monitoring System for CNC Milling Machines

    Directory of Open Access Journals (Sweden)

    Mota-Valtierra G.C.

    2011-10-01

    Full Text Available Most companies aim to manufacture high-quality products, which is achievable by optimizing costs and by reducing and controlling the variation in their production processes. Within the manufacturing industries, tool condition monitoring is a very important issue, since the tool state determines the quality of the products; moreover, a good monitoring system protects the machinery from severe damage. For determining the state of the cutting tools in a milling machine, there is a great variety of systems on the industrial market; however, these systems are not available to all companies because of their high cost and the requirement to modify the machine tool in order to attach the system's sensors. This paper presents an intelligent classification system which determines the status of cutters in a Computer Numerical Control (CNC) milling machine. The tool state is detected mainly through the analysis of the cutting forces drawn from the spindle motor currents. This monitoring system needs no additional sensors, so it is not necessary to modify the machine. The classification is made by advanced digital signal processing techniques. Just after acquiring a signal, a FIR digital filter is applied to the data to eliminate the undesired noisy components and to extract the embedded force components. A wavelet transformation is applied to the filtered signal in order to compress the data and to optimize the classifier structure. A multilayer perceptron-type neural network is then responsible for the classification of the signal. Achieving a reliability of 95%, the system is capable of detecting breakage and a worn cutter.
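The signal chain described above can be sketched end to end with toy data: FIR-filter a spindle-current trace, extract an energy feature, and compare it against a learned threshold. The real system uses wavelet compression and a multilayer perceptron; the traces and the threshold value here are synthetic.

```python
# Toy tool-condition-monitoring chain: FIR filter -> RMS feature -> threshold.
def fir_filter(signal, taps):
    """Causal FIR filter y[n] = sum_k taps[k] * x[n-k]."""
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:
                acc += h * signal[n - k]
        out.append(acc)
    return out

def rms(signal):
    """Root-mean-square energy of the trace."""
    return (sum(x * x for x in signal) / len(signal)) ** 0.5

# A 5-tap moving average stands in for the noise-rejection FIR stage
taps = [0.2] * 5
healthy = fir_filter([1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.0], taps)
worn = fir_filter([2.1, 2.0, 2.2, 1.9, 2.1, 2.0, 2.05, 2.1], taps)

THRESHOLD = 1.5  # hypothetical; in practice learned from labeled runs
for name, trace in [("healthy", healthy), ("worn", worn)]:
    print(name, "worn!" if rms(trace) > THRESHOLD else "ok")
```

A worn cutter draws more current for the same cut, so its filtered trace carries more energy; the MLP in the actual system learns a far richer decision boundary than this single threshold.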

  19. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    From 1983 - 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed to report in this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers taking into account the economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP)

  20. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  1. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  2. An integrated analytic tool and knowledge-based system approach to aerospace electric power system control

    Science.gov (United States)

    Owens, William R.; Henderson, Eric; Gandikota, Kapal

    1986-10-01

    Future aerospace electric power systems require new control methods because of increasing power system complexity, demands for power system management, greater system size and heightened reliability requirements. To meet these requirements, a combination of electric power system analytic tools and knowledge-based systems is proposed. The continual improvement in microelectronic performance has made it possible to envision the application of sophisticated electric power system analysis tools to aerospace vehicles. These tools have been successfully used in the measurement and control of large terrestrial electric power systems. Among these tools is state estimation, which has three main benefits. The estimator builds a reliable database for the system structure and states. Security assessment and contingency evaluation also require a state estimator. Finally, the estimator, combined with modern control theory, will improve power system control and stability. Bad data detection as an adjunct to state estimation identifies defective sensors and communications channels. Validated data from the analytic tools is supplied to a number of knowledge-based systems. These systems will be responsible for the control, protection, and optimization of the electric power system.
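In its simplest linear (DC) form, the state estimation referenced above reduces to weighted least squares over redundant measurements, x = (H'WH)^-1 H'Wz; the redundancy is also what makes bad data detectable. The three-measurement, two-state network below is hypothetical, and real power system estimators are nonlinear and iterate.

```python
# Linear weighted-least-squares state estimation, solved by hand for two
# states so no linear algebra library is needed.
def wls_2state(H, W, z):
    """Solve the 2-variable normal equations (H'WH) x = H'Wz."""
    m = len(H)
    # A = H'WH (2x2), b = H'Wz (length 2)
    A = [[sum(H[k][i] * W[k] * H[k][j] for k in range(m)) for j in range(2)]
         for i in range(2)]
    b = [sum(H[k][i] * W[k] * z[k] for k in range(m)) for i in range(2)]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

# Three noisy measurements of two states; the third row measures their
# difference, giving the redundancy that bad-data detection relies on.
H = [[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]]
W = [1.0, 1.0, 2.0]     # inverse measurement variances (weights)
z = [0.10, 0.04, 0.07]  # readings; underlying truth ~ [0.10, 0.04]
print(wls_2state(H, W, z))
```

The estimate blends all three readings according to their weights, landing near the truth even though the individual measurements disagree.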

  3. Validating and Verifying a New Thermal-Hydraulic Analysis Tool

    International Nuclear Information System (INIS)

    The Idaho National Engineering and Environmental Laboratory (INEEL) has developed a new analysis tool by coupling the Fluent computational fluid dynamics (CFD) code to the RELAP5-3DC/ATHENA advanced thermal-hydraulic analysis code. This tool enables researchers to perform detailed, three-dimensional analyses using Fluent's CFD capability while the boundary conditions required by the Fluent calculation are provided by the balance-of-system model created using RELAP5-3DC/ATHENA. Both steady-state and transient calculations can be performed, using many working fluids and neutronics models ranging from point kinetics to three-dimensional. A general description of the techniques used to couple the codes is given. The validation and verification (V and V) matrix is outlined. V and V is presently ongoing. (authors)

  4. Systems biology: A tool for charting the antiviral landscape.

    Science.gov (United States)

    Bowen, James R; Ferris, Martin T; Suthar, Mehul S

    2016-06-15

    The host antiviral programs that are initiated following viral infection form a dynamic and complex web of responses that we have collectively termed as "the antiviral landscape". Conventional approaches to studying antiviral responses have primarily used reductionist systems to assess the function of a single or a limited subset of molecules. Systems biology is a holistic approach that considers the entire system as a whole, rather than individual components or molecules. Systems biology based approaches facilitate an unbiased and comprehensive analysis of the antiviral landscape, while allowing for the discovery of emergent properties that are missed by conventional approaches. The antiviral landscape can be viewed as a hierarchy of complexity, beginning at the whole organism level and progressing downward to isolated tissues, populations of cells, and single cells. In this review, we will discuss how systems biology has been applied to better understand the antiviral landscape at each of these layers. At the organismal level, the Collaborative Cross is an invaluable genetic resource for assessing how genetic diversity influences the antiviral response. Whole tissue and isolated bulk cell transcriptomics serves as a critical tool for the comprehensive analysis of antiviral responses at both the tissue and cellular levels of complexity. Finally, new techniques in single cell analysis are emerging tools that will revolutionize our understanding of how individual cells within a bulk infected cell population contribute to the overall antiviral landscape.

  5. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
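    The filtering-and-triggering behavior described above can be sketched generically. The rule name, pattern, and threshold below are invented for illustration and are not NOvA's actual correlation rules:

```python
import re

class MessageAnalyzer:
    """Toy log-correlation engine: each rule is (name, regex, threshold);
    an alert is recorded once the pattern has matched `threshold` times."""
    def __init__(self, rules):
        self.rules = [(name, re.compile(pat), n) for name, pat, n in rules]
        self.counts = {name: 0 for name, _, _ in rules}
        self.alerts = []

    def feed(self, message):
        for name, pattern, threshold in self.rules:
            if pattern.search(message):
                self.counts[name] += 1
                if self.counts[name] == threshold:
                    self.alerts.append(name)

analyzer = MessageAnalyzer([("buffer-overrun", r"buffer (full|overrun)", 2)])
for line in ["run start", "DCM-03 buffer full",
             "event built", "DCM-03 buffer overrun"]:
    analyzer.feed(line)
```

A production engine would add per-rule time windows and the plugin hooks the paper describes; this sketch only shows the count-and-trigger core.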

  6. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic realtime correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and to minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  7. T4SP Database 2.0: An Improved Database for Type IV Secretion Systems in Bacterial Genomes with New Online Analysis Tools

    Science.gov (United States)

    Han, Na; Yu, Weiwen; Qiang, Yujun

    2016-01-01

    Type IV secretion system (T4SS) can mediate the passage of macromolecules across cellular membranes and is essential for virulent and genetic material exchange among bacterial species. The Type IV Secretion Project 2.0 (T4SP 2.0) database is an improved and extended version of the platform released in 2013 aimed at assisting with the detection of Type IV secretion systems (T4SS) in bacterial genomes. This advanced version provides users with web server tools for detecting the existence and variations of T4SS genes online. The new interface for the genome browser provides user-friendly access to the most complete and accurate resource of T4SS gene information (e.g., gene number, name, type, position, sequence, related articles, and quick links to other websites). Currently, this online database includes T4SS information of 5239 bacterial strains. Conclusions: T4SS is one of the most versatile secretion systems necessary for the virulence and survival of bacteria and the secretion of protein and/or DNA substrates from a donor to a recipient cell. This database on virB/D genes of the T4SS system will help scientists worldwide to improve their knowledge on secretion systems and also identify potential pathogenic mechanisms of various microbial species.

  8. A Collaborative Analysis Tool for Integrating Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Science.gov (United States)

    Stanley, Thomas Troy; Alexander, Reginald

    1999-01-01

    Presented is a computer-based tool that connects several disciplines that are needed in the complex and integrated design of high performance reusable single stage to orbit (SSTO) vehicles. Every system is linked to every other system, as is the case of SSTO vehicles with air breathing propulsion, which is currently being studied by NASA. The deficiencies in the scramjet powered concept led to a revival of interest in Rocket-Based Combined-Cycle (RBCC) propulsion systems. An RBCC propulsion system integrates airbreathing and rocket propulsion into a single engine assembly enclosed within a cowl or duct. A typical RBCC propulsion system operates as a ducted rocket up to approximately Mach 3. At this point the engine transitions to a ramjet mode for supersonic-to-hypersonic acceleration. Around Mach 8 the engine transitions to a scramjet mode. During the ramjet and scramjet modes, the integral rockets operate as fuel injectors. Around Mach 10-12 (the actual value depends on vehicle and mission requirements), the inlet is physically closed and the engine transitions to an integral rocket mode for orbit insertion. A common feature of RBCC propelled vehicles is the high degree of integration between the propulsion system and airframe. At high speeds the vehicle forebody is fundamentally part of the engine inlet, providing a compression surface for air flowing into the engine. The compressed air is mixed with fuel and burned. The combusted mixture must be expanded to an area larger than the incoming stream to provide thrust. Since a conventional nozzle would be too large, the entire lower afterbody of the vehicle is used as an expansion surface. Because of the high external temperatures seen during atmospheric flight, the design of an airbreathing SSTO vehicle requires delicate tradeoffs between engine design, vehicle shape, and thermal protection system (TPS) sizing in order to produce an optimum system in terms of weight (and cost) and maximum performance.
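    The mode schedule in the abstract maps naturally onto a simple lookup. The transition Mach numbers below follow the text, with the final transition hypothetically fixed at Mach 10 (the abstract notes it is vehicle- and mission-dependent):

```python
def rbcc_mode(mach, scramjet_to_rocket=10.0):
    """Return the RBCC operating mode for a flight Mach number, per the
    schedule in the abstract: ducted rocket to ~Mach 3, ramjet to ~Mach 8,
    scramjet to ~Mach 10-12, then integral rocket for orbit insertion."""
    if mach < 3.0:
        return "ducted rocket"
    if mach < 8.0:
        return "ramjet"
    if mach < scramjet_to_rocket:
        return "scramjet"
    return "integral rocket"

modes = [rbcc_mode(m) for m in (0.5, 5.0, 9.0, 15.0)]
```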

  9. 3D-Aided-Analysis Tool for Lunar Rover

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; LI Guo-peng; REN Xin; LIU Jian-jun; GAO Xing-ye; ZOU Xiao-duan

    2013-01-01

    3D-Aided-Analysis Tool (3DAAT), a virtual reality system, is built up in this paper. 3DAAT integrates kinematic and dynamic models of the rover as well as a real lunar surface terrain model. The modeling methods proposed in this paper include constructing the lunar surface, constructing 3D models of the lander and rover, and building up a kinematic model of the rover body. Photogrammetry techniques and remote sensing information are used to generate the terrain model of the lunar surface. According to the implementation results, 3DAAT is an effective assistance system for making exploration plans and analyzing the status of the rover.

  10. Bond graphs : an integrating tool for design of mechatronic systems

    International Nuclear Information System (INIS)

    Bond graph is a powerful tool well known for dynamic modelling of multiphysical systems: it is the only modelling technique that automatically generates state-space or non-linear models using dedicated software tools (CAMP-G, 20-Sim, Symbols, Dymola...). Recently, several fundamental theories have been developed for using a bond graph model not only for modelling but also as a truly integrated tool from conceptual ideas to optimal practical realization of a mechatronic system. This keynote presents a synthesis of those new theories, which exploit particular properties (causal, structural and behavioral) of this graphical methodology. Based on a pedagogical example, it will be shown how, starting from a physical system (not a transfer function or state equation) and using only one representation (the bond graph), the following results can be obtained: modelling (formal state equation generation), control analysis (observability, controllability, structural I/O decouplability, dynamic decoupling, ...), diagnostic analysis (automatic generation of robust fault indicators, sensor placement, structural diagnosability) and finally sizing of actuators. The presentation will be illustrated by real industrial applications. Limits and perspectives of bond graph theory conclude the keynote.
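    As an illustration of the "formal state equation generation" mentioned above, consider the bond graph of a mass-spring-damper (an I, C, and R element on a common 1-junction). A hand-coded forward-Euler integration of the state equations such a tool would emit, with arbitrary parameter values:

```python
def simulate_msd(m=1.0, k=4.0, b=0.5, p0=1.0, q0=0.0, dt=1e-3, steps=5000):
    """Integrate the bond-graph state equations of a mass-spring-damper:
    dq/dt = p/m (flow into the C element) and dp/dt = -k*q - (b/m)*p
    (effort balance at the 1-junction), using forward Euler."""
    p, q = p0, q0
    for _ in range(steps):
        dq = p / m
        dp = -k * q - (b / m) * p
        p, q = p + dt * dp, q + dt * dq
    return p, q

p_end, q_end = simulate_msd()
energy_start = 1.0 ** 2 / (2 * 1.0)                # initial kinetic energy
energy_end = p_end ** 2 / 2 + 4.0 * q_end ** 2 / 2  # kinetic + spring energy
```

With the R element (damping) present, the stored energy decays, which is the behavioral property a bond graph makes explicit.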

  11. System analysis and design

    International Nuclear Information System (INIS)

    This book deals with information technology and business processes; information system architecture; methods of system development; system development planning, including problem analysis and feasibility analysis; case studies in system development; understanding and analyzing user requirements, both with traditional analysis and with an integrated information system architecture; system design using an integrated information system architecture; system implementation; and system maintenance.

  12. KAOS: A Kinetic Theory Tool for Modeling Complex Social Systems

    Directory of Open Access Journals (Sweden)

    Bruneo Dario

    2016-01-01

    Full Text Available The kinetic theory approach is successfully used to model complex phenomena related to social systems, allowing prediction of the dynamics and emergent behavior of large populations of agents. In particular, kinetic theory for active particles (KTAP) models are usually analyzed by numerically solving the underlying Boltzmann-type differential equations through ad-hoc implementations. In this paper, we present KAOS: a kinetic theory of active particles modeling and analysis software tool. To the best of our knowledge, KAOS represents the first attempt to design and implement a comprehensive tool that assists the user in all the steps of the modeling process in the framework of kinetic theories, from model definition to the representation of transient solutions. To show the KAOS features, we present a new model capturing the competition/cooperation dynamics of a socio-economic system with welfare dynamics, under different socio-political conditions.

  13. Analysis and use of OLAP tools in the corporative information system of CFE; Analisis y empleo de herramientas OLAP en el sistema de informacion corporativa de la CFE

    Energy Technology Data Exchange (ETDEWEB)

    Jacome G, Norma E; Argotte R, Liliana P; Mejia L, Manuel [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2003-07-01

    The commercial tools Oracle Express and Oracle Discoverer are presented, applied, and compared. Both were applied in the context of the Data Warehouse that the Comision Federal de Electricidad (CFE) is developing, called the Corporative Information System (SICORP), which involves the handling of large volumes of historical and current information. Given the current importance and growth of the data warehousing field, the experiences described in the article are very useful for future developments.

  14. Chip breaking system for automated machine tool

    Science.gov (United States)

    Arehart, Theodore A.; Carey, Donald O.

    1987-01-01

    The invention is a rotary selectively directional valve assembly for use in an automated turret lathe for directing a stream of high pressure liquid machining coolant to the interface of a machine tool and workpiece for breaking up ribbon-shaped chips during the formation thereof so as to inhibit scratching or other marring of the machined surfaces by these ribbon-shaped chips. The valve assembly is provided by a manifold arrangement having a plurality of circumferentially spaced apart ports each coupled to a machine tool. The manifold is rotatable with the turret when the turret is positioned for alignment of a machine tool in a machining relationship with the workpiece. The manifold is connected to a non-rotational header having a single passageway therethrough which conveys the high pressure coolant to only the port in the manifold which is in registry with the tool disposed in a working relationship with the workpiece. To position the machine tools the turret is rotated and one of the tools is placed in a material-removing relationship of the workpiece. The passageway in the header and one of the ports in the manifold arrangement are then automatically aligned to supply the machining coolant to the machine tool workpiece interface for breaking up of the chips as well as cooling the tool and workpiece during the machining operation.

  15. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with reference predictions of bearing internal load distribution, stiffness, deflection and stresses.
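    One internal-load quantity of the kind ORBIS computes can be sanity-checked by hand: the classical Stribeck approximation estimates the most heavily loaded ball in a radially loaded bearing. This closed-form textbook check is not ORBIS's actual solution method:

```python
import math

def stribeck_max_ball_load(radial_load_n, n_balls, contact_angle_deg=0.0):
    """Classical Stribeck approximation for the maximum ball load in a
    radially loaded, zero-clearance ball bearing: Qmax ~= 5*Fr/(Z*cos(a))."""
    return 5.0 * radial_load_n / (
        n_balls * math.cos(math.radians(contact_angle_deg)))

q_max = stribeck_max_ball_load(1000.0, 10)  # 1 kN radial load, 10 balls
```

A full tool solves the ring deflection and per-ball Hertzian contacts simultaneously; the Stribeck factor of 5 is the rigid-ring shortcut.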

  16. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  17. Tool management in manufacturing systems equipped with CNC machines

    Directory of Open Access Journals (Sweden)

    Giovanni Tani

    1997-12-01

    Full Text Available This work has been carried out for the purpose of realizing an automated system for the integrated management of tools within a company. By integrating planning, inspection and tool-room functions, automated tool management can ensure optimum utilization of tools on the selected machines, guaranteeing their effective availability. The first stage of the work consisted of defining and developing a Tool Management System whose central nucleus is a unified Data Base for all of the tools, forming part of the company's Technological Files (files on machines, materials, equipment, methods, etc.), interfaceable with all of the company departments that require information on tools. The system assigns code numbers to the individual components of the tools and files them on the basis of their morphological and functional characteristics. The system is also designed to assemble tools, from which are obtained the "Tool Cards" required for compiling working cycles (CAPP), for CAM programming and for the tool-room where the tools are physically prepared. Methods for interfacing with suitable systems for the aforesaid functions have also been devised.

  18. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for quantifying the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for energy planners considering their electric power options in the future.
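    The accounting at the heart of net energy analysis reduces to a ratio of lifetime energy delivered to total energy invested. A minimal sketch with illustrative numbers, not taken from the author's studies:

```python
def net_energy_ratio(annual_output_gj, lifetime_years, invested_gj):
    """Net energy ratio: lifetime energy delivered / total energy invested.
    Values above 1 mean the plant returns more energy than it consumed."""
    return annual_output_gj * lifetime_years / invested_gj

def energy_payback_years(annual_output_gj, invested_gj):
    """Years of operation needed to pay back the invested energy."""
    return invested_gj / annual_output_gj

# Hypothetical plant: 600,000 GJ/yr for 30 years, 1,800,000 GJ invested
ratio = net_energy_ratio(600_000, 30, 1_800_000)
payback = energy_payback_years(600_000, 1_800_000)
```

Real studies differ mainly in what they count on each side of the ratio (fuel cycle, construction, decommissioning), not in the arithmetic.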

  19. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, Judo competition is an intriguing complex nonlinear system, with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed beforehand, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team), such as Judo, to aid in a useful way the difficult task of the coach, especially for national or Olympic coaching staffs. In this paper a short summary of the most important methodological achievements in judo match analysis is presented. Also presented, in light of the latest technological improvements, is the first systematization toward new fiel...

  20. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining one can perform multidimensional data analysis, detect fraud and discover knowledge hidden in data, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but they all boil down to the same idea: the process of extracting new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  1. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
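    The enrichment score underlying tools of this kind is typically a hypergeometric tail probability: given an n-gene set drawn from N genes of which K carry a GO term, how likely is an overlap of at least k by chance? A stdlib-only sketch (GOMA's actual module-level scoring is more elaborate):

```python
from math import comb

def enrichment_p_value(N, K, n, k):
    """Hypergeometric upper tail P(X >= k): probability of drawing at least
    k annotated genes when sampling n genes without replacement from a
    population of N genes containing K annotated ones."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Toy example: 10 genes, 5 annotated to a term; a 5-gene set overlaps 4
p = enrichment_p_value(N=10, K=5, n=5, k=4)  # 26/252, about 0.103
```

In practice the p-values are corrected for multiple testing across thousands of GO terms before ranking.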

  2. Interactive Graphics Tools for Analysis of MOLA and Other Data

    Science.gov (United States)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiter Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

  3. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  4. Advanced software tools for digital loose part monitoring systems

    International Nuclear Information System (INIS)

    The paper describes two software modules as analysis tools for digital loose part monitoring systems. The first module, called the acoustic module, utilizes the multi-media features of modern personal computers to replay the digitally stored short-time bursts with sufficient length and in good quality. This is possible due to the so-called puzzle technique developed at ISTec. The second module, called the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the backpropagation algorithm. (author). 7 refs, 7 figs

  5. Shot planning and analysis tools on the NIF project

    Energy Technology Data Exchange (ETDEWEB)

    Beeler, R. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Casey, A., E-mail: casey20@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA (United States); Conder, A.; Fallejo, R.; Flegel, M.; Hutton, M.; Jancaitis, K.; Lakamsani, V.; Potter, D.; Reisdorf, S.; Tappero, J.; Whitman, P.; Carr, W.; Liao, Z. [Lawrence Livermore National Laboratory, Livermore, CA (United States)

    2012-12-15

    Highlights: • Target shots in NIF, dozens a month, vary widely in laser and target configuration. • A planning tool helps select shot sequences that optimize valuable facility time. • Fabrication and supply of targets, diagnostics, etc. are integrated into the plan. • Predictive modeling of aging parts (e.g., optics) aids maintenance decision support. • We describe the planning/analysis tool and its use in NIF experimental operations. - Abstract: Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Campaign Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics

  6. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
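    The FDTD method named above advances interleaved E and H fields with the Yee leapfrog update. A one-dimensional Cartesian free-space sketch at Courant number 1; this simplified version is an illustration, not the paper's spherical-coordinate derivation:

```python
import math

def fdtd_1d(n_cells=200, n_steps=60, src=100):
    """1-D free-space FDTD in normalized units at Courant number 1:
    ez and hy live on grids staggered half a cell apart and are updated
    leapfrog-style; a soft Gaussian source drives ez at cell `src`."""
    ez = [0.0] * n_cells
    hy = [0.0] * (n_cells - 1)
    for t in range(n_steps):
        for i in range(n_cells - 1):        # H update (Faraday's law)
            hy[i] += ez[i + 1] - ez[i]
        for i in range(1, n_cells - 1):     # E update (Ampere's law)
            ez[i] += hy[i] - hy[i - 1]
        ez[src] += math.exp(-((t - 30) / 10.0) ** 2)  # Gaussian pulse
    return ez

ez = fdtd_1d()
```

At Courant number 1 the 1-D scheme is dispersion-free, which makes it a convenient reference before moving to CW versus UWB pulsed excitations.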

  7. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri;

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  8. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
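    A battery-degradation model of the kind BLAST wraps typically superimposes calendar fade (thermally activated, square-root-in-time) and cycling fade (proportional to throughput). The functional form and every coefficient below are generic illustrations, not NREL's actual model:

```python
import math

def capacity_fade(days, temp_c, cycles, dod,
                  a_cal=0.005, ea_over_r=6000.0, b_cyc=2e-5):
    """Fractional capacity loss: an Arrhenius-scaled sqrt(t) calendar term
    (referenced to 25 C) plus a cycling term proportional to cycle count
    and depth of discharge. All coefficients are hypothetical."""
    t_k = temp_c + 273.15
    arrhenius = math.exp(-ea_over_r * (1.0 / t_k - 1.0 / 298.15))
    calendar = a_cal * arrhenius * math.sqrt(days)
    cycling = b_cyc * cycles * dod
    return calendar + cycling

mild = capacity_fade(days=365, temp_c=25.0, cycles=300, dod=0.5)
hot = capacity_fade(days=365, temp_c=45.0, cycles=300, dod=0.5)
```

The qualitative behavior matches the sensitivities the abstract lists: hotter storage and deeper, more frequent cycling both accelerate fade.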

  9. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
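    Before any likelihood-ratio test, profilometry traces are typically aligned and scored; a common choice is the maximum normalized cross-correlation over integer lags. A stdlib sketch of that comparison step (the statistical test itself, and the tool-angle modeling, are beyond this fragment):

```python
def best_correlation(a, b, max_lag=20):
    """Slide trace b across trace a and return the highest Pearson
    correlation observed over all integer lags up to max_lag."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        sxx = sum((xi - mx) ** 2 for xi in x)
        syy = sum((yi - my) ** 2 for yi in y)
        return sxy / (sxx * syy) ** 0.5
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        xs = [a[i] for i in range(len(a)) if 0 <= i + lag < len(b)]
        ys = [b[i + lag] for i in range(len(a)) if 0 <= i + lag < len(b)]
        if len(xs) > 2:
            best = max(best, pearson(xs, ys))
    return best

# A striation-like profile and a shifted copy should score near 1.0
profile = [((i * 7) % 13) - 6 for i in range(100)]
shifted = profile[5:] + profile[:5]
score = best_correlation(profile, shifted)
```

A likelihood-ratio approach then compares the distribution of such scores for known same-tool pairs against known different-tool pairs.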

  10. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade, significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of the perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and a Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel is presented. The results are in the form of community noise metrics and

  11. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML captures with the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research surveys the abilities of modern UML modelling tools to lay out the UML sequence diagram automatically and analyses them against the criteria required for diagram perception.

  12. Expert Systems as Tools for Technical Communicators.

    Science.gov (United States)

    Grider, Daryl A.

    1994-01-01

    Discusses expertise, what an expert system is, what an expert system shell is, what expert systems can and cannot do, knowledge engineering and technical communicators, and planning and managing expert system projects. (SR)

  13. Dynamic drag force based on iterative density mapping: A new numerical tool for three-dimensional analysis of particle trajectories in a dielectrophoretic system.

    Science.gov (United States)

    Knoerzer, Markus; Szydzik, Crispin; Tovar-Lopez, Francisco Javier; Tang, Xinke; Mitchell, Arnan; Khoshmanesh, Khashayar

    2016-02-01

    Dielectrophoresis is a widely used means of manipulating suspended particles within microfluidic systems. In order to efficiently design such systems for a desired application, various numerical methods exist that enable particle trajectory plotting in two or three dimensions based on the interplay of hydrodynamic and dielectrophoretic forces. While various models are described in the literature, few are capable of modeling interactions between particles as well as their surrounding environment, as these interactions are complex, multifaceted, and computationally expensive to the point of being prohibitive when considering a large number of particles. In this paper, we present a numerical model designed to enable spatial analysis of the physical effects exerted upon particles within microfluidic systems employing dielectrophoresis. The model presents a means of approximating the effects of the presence of large numbers of particles through dynamically adjusting the hydrodynamic drag force based on particle density, thereby introducing a measure of emulated particle-particle and particle-liquid interactions. This model is referred to as "dynamic drag force based on iterative density mapping." The resultant numerical model is used to simulate and predict particle trajectory and velocity profiles within a microfluidic system incorporating curved dielectrophoretic microelectrodes. The simulated data compare favorably with experimental data gathered using microparticle image velocimetry, and are contrasted against simulated data generated using the traditional "effective moment Stokes-drag method," showing more accurate particle velocity profiles for areas of high particle density.
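
The baseline force model named above, Stokes drag, is simple to state; the density-dependent scale factor in this sketch only gestures at the paper's dynamically adjusted drag and is an assumed form, not the published mapping:

```python
import math

def stokes_drag(radius_m, fluid_velocity, particle_velocity,
                viscosity=1e-3, density_factor=1.0):
    """Stokes drag on a sphere: F = 6*pi*mu*r*(v_fluid - v_particle).
    `density_factor` rescales the drag with local particle density
    (a schematic stand-in for iterative density mapping)."""
    return (6.0 * math.pi * viscosity * radius_m
            * (fluid_velocity - particle_velocity) * density_factor)
```

The sign of the result tells whether the fluid drags the particle forward or holds it back.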

  14. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  15. Vulnerability assessment using two complementary analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weakness of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weakness, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  16. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  17. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
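
A one-dimensional transmission line model of the kind referenced reduces, for a single rigidly terminated channel of uniform cross-section, to a textbook cotangent formula. The lossless form and parameter names below are standard acoustics assumptions, not the patented tool's implementation:

```python
import cmath
import math

def channel_input_impedance(freq_hz, length_m, sound_speed=343.0):
    """Normalized input impedance of a rigidly terminated channel from the
    lossless 1-D transmission line model: Z / (rho * c) = -j * cot(k * L)."""
    k = 2.0 * math.pi * freq_hz / sound_speed  # acoustic wavenumber
    return -1j / cmath.tan(k * length_m)
```

At the quarter-wave frequency (k*L = pi/2) the channel is resonant and the normalized impedance magnitude collapses toward zero.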

  18. Using Visual Tools for Analysis and Learning

    OpenAIRE

    Burton, Rob; Barlow, Nichola; Barker, Caroline

    2010-01-01

    This pack is intended as a resource for lecturers and students to facilitate the further development of their learning and teaching strategies. Visual tools were initially introduced within a module of the Year 3 nursing curriculum within the University of Huddersfield by Dr Rob Burton. Throughout the period of 2007-2008 a small team of lecturers with a keen interest in this teaching and learning strategy engaged in exploring and reviewing the literature. They also attended a series of loc...

  19. Computer tools for systems engineering at LaRC

    Science.gov (United States)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  20. Hybrid-Electric Aircraft TOGW Development Tool with Empirically-Based Airframe and Physics-Based Hybrid Propulsion System Component Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Hybrid-Electric distributed propulsion (HEDP) is becoming widely accepted and new tools will be required for future development. This Phase I SBIR proposal creates...

  1. A Components Library System Model and the Support Tool

    Institute of Scientific and Technical Information of China (English)

    MIAO Huai-kou; LIU Hui; LIU Jing; LI Xiao-bo

    2004-01-01

    Component-based development needs a well-designed components library and a set of support tools.This paper presents the design and implementation of a components library system model and its support tool UMLCASE.A set of practical CASE tools is constructed.UMLCASE can use UML to design Use Case Diagram, Class Diagram etc.And it integrates with components library system.

  2. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for, modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.......This talk provides a thorough tutorial of the UPPAAL tool suite for, modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems....

  3. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  4. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there are urgent needs for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on geographic information system (GIS) platform to bridge this gap. This platform is called Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: data layer (that contains spatial data, socio-economic and environmental data, and analytic data), middle layer (that handles data processing, model management, and GIS operation), and application layer (that provides climate impacts forecast, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
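
The per-cell scoring step can be sketched as a min-max-normalized weighted sum over the metrics computed for each grid cell. This is a generic GIS scoring pattern assumed for illustration; it is not Urban-CAT's published vulnerability formula:

```python
def vulnerability_scores(cells, weights):
    """cells: one metric vector per grid cell (all the same length);
    weights: one weight per metric. Each metric is min-max normalized
    across cells, then combined as a weighted sum per cell."""
    n_metrics = len(weights)
    lo = [min(c[m] for c in cells) for m in range(n_metrics)]
    hi = [max(c[m] for c in cells) for m in range(n_metrics)]

    def norm(value, m):
        # A constant metric contributes nothing rather than dividing by zero.
        return 0.0 if hi[m] == lo[m] else (value - lo[m]) / (hi[m] - lo[m])

    return [sum(w * norm(c[m], m) for m, w in enumerate(weights))
            for c in cells]
```

The normalization puts flooding depth, slope, population density, and similarly incommensurate metrics on a common 0-1 scale before weighting.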

  5. Energy-Systems Economic Analysis

    Science.gov (United States)

    Doane, J.; Slonski, M. L.; Borden, C. S.

    1982-01-01

    The Energy Systems Economic Analysis (ESEA) program is a flexible analytical tool for rank ordering alternative energy systems. The basic ESEA approach derives an estimate of the costs incurred as a result of purchasing, installing, and operating an energy system. These costs, suitably aggregated into yearly costs over the lifetime of the system, are divided by the expected yearly energy output to determine busbar energy costs. ESEA, developed in 1979, is written in FORTRAN IV for batch execution.
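
The busbar cost computation described above (aggregated yearly costs divided by yearly energy output) can be sketched as follows. The capital-recovery-factor annualization and all parameter names are standard engineering economics conventions assumed for illustration, not ESEA's exact formulation:

```python
def busbar_energy_cost(capital_cost, yearly_om_cost, yearly_energy_kwh,
                       lifetime_years, discount_rate):
    """Levelized busbar energy cost in $/kWh: annualized capital plus
    yearly O&M, divided by the expected yearly energy output."""
    # Capital recovery factor spreads the up-front cost over the lifetime.
    growth = (1.0 + discount_rate) ** lifetime_years
    crf = discount_rate * growth / (growth - 1.0)
    yearly_cost = capital_cost * crf + yearly_om_cost
    return yearly_cost / yearly_energy_kwh
```

Stretching the same capital over a longer lifetime lowers the levelized cost, which is why the metric suits rank ordering of alternatives.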

  6. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  7. Reverse Engineering Tool Requirements for Real Time Embedded Systems

    OpenAIRE

    Govin, Brice; Anquetil, Nicolas; Etien, Anne; Monegier Du Sorbier, Arnaud; Ducasse, Stéphane

    2015-01-01

    For more than three decades, reverse engineering has been a major issue for industry wanting to capitalise on legacy systems. Many companies have developed reverse engineering tools in order to help developers in their work. However, those tools have been focusing on traditional information systems. Working on a time-critical embedded system, we found that the available solutions focus either on software behaviour structuring or on data extraction from the system. None of them seem to be cle...

  8. Fully Parallel MHD Stability Analysis Tool

    Science.gov (United States)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and it is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is achieved by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Results of MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.
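
The inverse iteration step mentioned is a standard algorithm; a dependency-free sketch for a small dense matrix (in no way MARS's parallel implementation) looks like this:

```python
def _solve(matrix, rhs):
    """Gaussian elimination with partial pivoting on an augmented copy."""
    n = len(rhs)
    aug = [row[:] + [b] for row, b in zip(matrix, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(col + 1, n):
            f = aug[r][col] / aug[col][col]
            for c in range(col, n + 1):
                aug[r][c] -= f * aug[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (aug[r][n] - sum(aug[r][c] * x[c]
                                for c in range(r + 1, n))) / aug[r][r]
    return x

def inverse_iteration(matrix, shift, iters=60):
    """Shifted inverse iteration: converges to the eigenpair of `matrix`
    whose eigenvalue lies closest to `shift`."""
    n = len(matrix)
    shifted = [[matrix[i][j] - (shift if i == j else 0.0) for j in range(n)]
               for i in range(n)]
    x = [1.0 / (i + 1) for i in range(n)]  # generic start vector
    for _ in range(iters):
        y = _solve(shifted, x)             # one "inverse" application per step
        norm = sum(v * v for v in y) ** 0.5
        x = [v / norm for v in y]
    # Rayleigh quotient of the converged unit vector estimates the eigenvalue.
    ax = [sum(matrix[i][j] * x[j] for j in range(n)) for i in range(n)]
    return sum(xi * axi for xi, axi in zip(x, ax)), x
```

Choosing the shift near an expected mode frequency selects which eigenmode the iteration converges to, which is the property that makes the method attractive for stability scans.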

  9. System for exchanging tools and end effectors on a robot

    Science.gov (United States)

    Burry, David B.; Williams, Paul M.

    1991-02-19

    A system and method for exchanging tools and end effectors on a robot permits exchange during a programmed task. The exchange mechanism is located off the robot, thus reducing the mass of the robot arm and permitting smaller robots to perform designated tasks. A simple spring/collet mechanism mounted on the robot is used, which permits the engagement and disengagement of the tool or end effector without the need for a rotational orientation of the tool to the end effector/collet interface. As the tool-changing system is not located on the robot arm, no umbilical cords are needed on the robot.

  10. Regional energy planning through SWOT analysis and strategic planning tools.

    Energy Technology Data Exchange (ETDEWEB)

    Terrados, J.; Almonacid, G.; Hontoria, L. [Research Group IDEA, Polytechnics School, Campus Las Lagunillas, Edificio A3, University of Jaen, 23071 Jaen (Spain)

    2007-08-15

    Strategic planning processes, which are commonly used as a tool for region development and territorial structuring, can be harnessed by politicians and public administrations, at the local level, to redesign the regional energy system and encourage renewable energy development and environmental preservation. In this sense, the province of Jaen, a southern Spanish region whose economy is mainly based on olive agriculture, has carried out its strategic plan aiming at a major socioeconomic development. Under the leadership of the provincial government and the University of Jaen, main provincial institutions joined to propose the elaboration of a participatory strategic plan for the whole province. Here, the elaboration of the energy part of the plan, which was directly focused on the exploitation of renewable resources, mainly solar and biomass energy, and which highlights the effectiveness of techniques from business management applied to a sustainable energy model design is presented. Renewable Energy development during the first years of plan execution is presented, and the impact of additional issues is discussed. It is concluded that, although multicriteria decision-making technologies (MCDA) are extensively used in energy planning, a different approach can be utilized to incorporate techniques from strategic analysis. Furthermore, SWOT (strengths, weaknesses, opportunities and threats) analysis has proved to be an effective tool and has constituted a suitable baseline to diagnose current problems and to sketch future action lines. (author)

  11. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  12. Safety management systems. Audit tools and reliability of auditing

    Energy Technology Data Exchange (ETDEWEB)

    Kuusisto, A. [VTT Automation, Espoo (Finland). Safety Engineering

    2000-12-01

    Safety auditing is a systematic method to evaluate a company's safety management system. This work concentrates on evaluating the reliability of some safety audit tools. Firstly, the factors affecting reliability in auditing are clarified. Secondly, the inter-observer reliability of one of the audit tools is tested. This was done using an audit method, known as the D and S method, in six industrial companies in the USA, and in three companies in Finland. Finally, a new improved audit method called MISHA was developed, and its reliability was tested in two industrial companies. The results of the work show that safety audit tools do not ensure reliable and valid audit results. The auditor's expertise in the field of health and safety is particularly important when the company's compliance with the legal requirements is evaluated. A reasonably high reliability in the use of the D and S can be achieved when the auditor is familiar with the audit tool, the national legislation, and the company's culture. The MISHA method gives more reliable results than D and S when the auditor is not trained. On the other hand, it seems that the D and S is more reliable when the auditor is a trained expert. Some differences were found between the companies in the USA and in Finland. The organization and administration of safety activities was at a somewhat higher level among the companies in the USA. Industrial hazard control, as well as the control of fire hazards and industrial hygiene were at a high level in all companies in both countries. Most dispersion occurred in supervision, participation, motivation, and training activities. Finally, accident investigation and analysis were significantly better arranged among the companies in the USA. The results are in line with the findings of the literature survey on national differences in safety management procedures. (orig.)

  13. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; McCracken, Albert G.

    1988-05-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
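
The link calculations such a model performs can be illustrated with a basic downlink budget in decibel form; the C/N0 expression below is the generic textbook relation, not the Lotus 123 model's internals:

```python
import math

def link_cn0_dbhz(eirp_dbw, gt_dbk, freq_hz, range_m):
    """Downlink carrier-to-noise-density ratio (dB-Hz):
    C/N0 = EIRP + G/T - FSPL - 10*log10(k_Boltzmann)."""
    c = 299792458.0  # speed of light, m/s
    fspl_db = 20.0 * math.log10(4.0 * math.pi * range_m * freq_hz / c)
    boltzmann_db = 10.0 * math.log10(1.380649e-23)  # about -228.6 dBW/K/Hz
    return eirp_dbw + gt_dbk - fspl_db - boltzmann_db
```

Doubling the slant range costs 20*log10(2), about 6 dB, of link margin, which is why channel capacity per terminal type is so sensitive to geometry.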

  15. Analysis of hybrid solar systems

    Science.gov (United States)

    Swisher, J.

    1980-10-01

    The TRNSYS simulation program was used to evaluate the performance of active charge/passive discharge solar systems with water as the working fluid. TRNSYS simulations are used to evaluate the heating performance and cooling augmentation provided by systems in several climates. The results of the simulations are used to develop a simplified analysis tool similar to the F-chart and Phi-bar procedures used for active systems. This tool, currently in a preliminary stage, should provide the designer with quantitative performance estimates for comparison with other passive, active, and nonsolar heating and cooling designs.

  16. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
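
The degradation sensitivities the abstract lists (temperature, cycle depth, throughput, storage time) can be caricatured with a toy fade model combining an Arrhenius calendar term with a cycling term. Every coefficient below is invented for illustration and bears no relation to NREL's calibrated degradation model:

```python
import math

def capacity_fade(days, avg_temp_c, cycles_per_day, depth_of_discharge):
    """Toy fraction of capacity lost: thermally activated calendar fade
    with square-root-of-time kinetics, plus cycle-depth-weighted wear.
    All coefficients are made up for illustration."""
    temp_k = avg_temp_c + 273.15
    # Calendar fade: Arrhenius temperature dependence, sqrt(time) growth.
    calendar = 2000.0 * math.exp(-4000.0 / temp_k) * math.sqrt(days)
    # Cycling fade: grows with total cycle count and cycle depth.
    cycling = 5e-5 * cycles_per_day * days * depth_of_discharge
    return min(1.0, calendar + cycling)
```

Even this caricature shows why use strategy matters: the same throughput delivered at a higher average temperature or deeper cycles retires the battery years earlier.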

  17. PSIM: A TOOL FOR ANALYSIS OF DEVICE PAIRING METHODS

    Directory of Open Access Journals (Sweden)

    Yasir Arfat Malkani

    2009-10-01

    Full Text Available Wireless networks are commonplace nowadays and almost all modern devices support wireless communication in some form. These networks differ from more traditional computing systems due to the ad-hoc and spontaneous nature of interactions among devices. These systems are prone to security risks, such as eavesdropping, and require different techniques as compared to traditional security mechanisms. Recently, secure device pairing in wireless environments has received substantial attention from many researchers. As a result, a significant set of techniques and protocols has been proposed to deal with this issue. Some of these techniques consider devices equipped with infrared, laser, ultrasound transceivers or 802.11 network interface cards; while others require embedded accelerometers, cameras and/or LEDs, displays, microphones and/or speakers. However, many of the proposed techniques or protocols have not been implemented at all, while others are implemented and evaluated in a stand-alone manner without being compared with other related work [1]. We believe that this is because of the lack of specialized tools that provide a common platform to test the pairing methods. As a consequence, we designed such a tool. In this paper, we present the design and development of the Pairing Simulator (PSim) that can be used to perform the analysis of device pairing methods.

  18. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  19. Languages and tools for hybrid systems design

    OpenAIRE

    Carloni, LP; Passerone, R.; Pinto, A.; Sangiovanni-Vincentelli, AL

    2006-01-01

    The explosive growth of embedded electronics is bringing information and control systems of increasing complexity to every aspect of our lives. The most challenging designs are safety-critical systems, such as transportation systems (e.g., airplanes, cars, and trains), industrial plants and health care monitoring. The difficulties reside in accommodating constraints both on functionality and implementation. The correct behavior must be guaranteed under diverse states of the environment and p...

  20. A tool for searching in information systems under uncertainty

    Science.gov (United States)

    Walek, Bogdan; Farana, Radim

    2016-06-01

    This article deals with the design of a tool for searching in information systems under uncertainty. During a search, the user often works with vague data, which may lead either to the desired result not being found or to a large number of results that the user must then evaluate. The main goal of the proposed tool is to process vague information and find the relevant data. The article describes the individual steps of the proposed tool in detail.

  1. Multilingual lexicon design tool and database management system for MT

    OpenAIRE

    Barisevičius, G.; Tamulynas, B.

    2011-01-01

    The paper presents the design and development of an English-Lithuanian-English dictionary lexicon tool and a lexicon database management system for MT. The system is oriented to support two main requirements: to be open to the user and to describe many more attributes of the parts of speech than a regular dictionary, as required for MT. The programming language Java and the database management system MySQL are used to implement the design tool and the lexicon database, respectively. This solution allows easil...

  2. Building Systems: Passing Fad or Basic Tool?

    Science.gov (United States)

    Rezab, Donald

    Building systems can be traced back to a 1516 A.D. project by Leonardo da Vinci and to a variety of prefabrication projects in every succeeding century. When integrated into large and repetitive spatial units through careful design, building systems can produce an architecture of the first order, as evidenced in the award winning design of…

  3. High power laser perforating tools and systems

    Science.gov (United States)

    Zediker, Mark S; Rinzler, Charles C; Faircloth, Brian O; Koblick, Yeshaya; Moxley, Joel F

    2014-04-22

    Systems, devices and methods for the transmission of 1 kW or more of laser energy deep into the earth and for the suppression of associated nonlinear phenomena. Systems, devices and methods for the laser perforation of a borehole in the earth. These systems can deliver high power laser energy down a deep borehole, while maintaining the high power to perforate such boreholes.

  4. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  5. Active ultrasound pattern injection system (AUSPIS) for interventional tool guidance.

    Directory of Open Access Journals (Sweden)

    Xiaoyu Guo

    Full Text Available Accurate tool tracking is a crucial task that directly affects the safety and effectiveness of many interventional medical procedures. Compared to CT and MRI, ultrasound-based tool tracking has many advantages, including low cost, safety, mobility and ease of use. However, surgical tools are poorly visualized in conventional ultrasound images, thus preventing effective tool tracking and guidance. Existing tracking methods have not yet provided a solution that effectively solves the tool visualization and mid-plane localization accuracy problem and fully meets the clinical requirements. In this paper, we present an active ultrasound tracking and guiding system for interventional tools. The main principle of this system is to establish a bi-directional ultrasound communication between the interventional tool and US imaging machine within the tissue. This method enables the interventional tool to generate an active ultrasound field over the original imaging ultrasound signals. By controlling the timing and amplitude of the active ultrasound field, a virtual pattern can be directly injected into the US machine B mode display. In this work, we introduce the time and frequency modulation, mid-plane detection, and arbitrary pattern injection methods. The implementation of these methods further improves the target visualization and guiding accuracy, and expands the system application beyond simple tool tracking. We performed ex vivo and in vivo experiments, showing significant improvements of tool visualization and accurate localization using different US imaging platforms. An ultrasound image mid-plane detection accuracy of ±0.3 mm and a detectable tissue depth over 8.5 cm was achieved in the experiment. The system performance is tested under different configurations and system parameters. We also report the first experiment of arbitrary pattern injection to the B mode image and its application in accurate tool tracking.

  6. Active ultrasound pattern injection system (AUSPIS) for interventional tool guidance.

    Science.gov (United States)

    Guo, Xiaoyu; Kang, Hyun-Jae; Etienne-Cummings, Ralph; Boctor, Emad M

    2014-01-01

    Accurate tool tracking is a crucial task that directly affects the safety and effectiveness of many interventional medical procedures. Compared to CT and MRI, ultrasound-based tool tracking has many advantages, including low cost, safety, mobility and ease of use. However, surgical tools are poorly visualized in conventional ultrasound images, thus preventing effective tool tracking and guidance. Existing tracking methods have not yet provided a solution that effectively solves the tool visualization and mid-plane localization accuracy problem and fully meets the clinical requirements. In this paper, we present an active ultrasound tracking and guiding system for interventional tools. The main principle of this system is to establish a bi-directional ultrasound communication between the interventional tool and US imaging machine within the tissue. This method enables the interventional tool to generate an active ultrasound field over the original imaging ultrasound signals. By controlling the timing and amplitude of the active ultrasound field, a virtual pattern can be directly injected into the US machine B mode display. In this work, we introduce the time and frequency modulation, mid-plane detection, and arbitrary pattern injection methods. The implementation of these methods further improves the target visualization and guiding accuracy, and expands the system application beyond simple tool tracking. We performed ex vivo and in vivo experiments, showing significant improvements of tool visualization and accurate localization using different US imaging platforms. An ultrasound image mid-plane detection accuracy of ±0.3 mm and a detectable tissue depth over 8.5 cm was achieved in the experiment. The system performance is tested under different configurations and system parameters. We also report the first experiment of arbitrary pattern injection to the B mode image and its application in accurate tool tracking.

  7. Knowledge-based decision support system for tool management in flexible manufacturing system

    Institute of Scientific and Technical Information of China (English)

    周炳海; 奚立峰; 蔡建国

    2004-01-01

    Tool management is not a single, simple activity; it comprises a complex set of functions, especially in a flexible manufacturing system (FMS) environment. The issues associated with tool management include tool requirement planning, tool real-time scheduling, tool crib management, tool inventory control, tool fault diagnosis, tool tracking and tool monitoring. In order to make tools flow into and out of an FMS efficiently, this work aims to design a knowledge-based decision support system (KBDSS) for tool management in FMS. First, an overview of tool management functions is given. Then the structure of the KBDSS for tool management and the essential agents in its design are presented. Finally, the individual agents of the KBDSS are discussed with respect to design and development.

  8. Structure Design and Analysis of the Spindle System of a Special Double-tool Vertical Lathe

    Institute of Scientific and Technical Information of China (English)

    应富强; 李良艺; 吴灵东; 汪意

    2011-01-01

    The arrangement of the spindle system of a special double-tool vertical lathe for machining flanges is analyzed. Based on the mechanical characteristics of the double-tool vertical lathe and the machining features of disc parts, a reasonable shaft-segment layout and bearing arrangement are provided. A mechanical analysis of the general stress on the spindle is carried out with the aid of the finite element method, leading to the conclusion that the design of the lathe spindle should first satisfy the design requirements. General ideas about the design of the spindle of a double-tool vertical lathe are presented.

  9. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
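    At its core, this kind of station financial analysis rests on standard discounted-cash-flow arithmetic. The sketch below shows the basic computation; the capital cost, annual revenue, horizon, and discount rate are placeholder numbers, not values from H2FAST.

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs in year 0 (undiscounted)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

capital_cost = -2_000_000        # year-0 station investment (placeholder)
annual_net_revenue = 350_000     # net cash flow in years 1..15 (placeholder)
flows = [capital_cost] + [annual_net_revenue] * 15

print(f"NPV at an 8% discount rate: ${npv(0.08, flows):,.0f}")
```

    A positive NPV at the chosen discount rate is the usual go/no-go signal; a tool like H2FAST layers depreciation, financing, and tax treatment on top of this core.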

  10. Geographical Information Systems: A Tool for Institutional Research.

    Science.gov (United States)

    Prather, James E.; Carlson, Christina E.

    This paper addresses the application of Geographical Information Systems (GIS), a computerized tool for associating key information by geographical location, to the institutional research function at institutions of higher education. The first section investigates the potential of GIS as an analytical and planning tool for institutional…

  11. Theoretical analysis tools in building business competitiveness

    Directory of Open Access Journals (Sweden)

    Yeisson Diego Tamayo

    2015-12-01

    Full Text Available Due to the internationalization of markets arising from firms' free trade agreements and the standardization process in Colombia, companies increasingly seek to satisfy the needs of their customers, so business management systems carry considerable weight in the phases of business development. In Colombia, management systems are booming in the financial sector, so much so that there is a quality manual for the financial supervision of Colombia; at the microeconomic level, however, firms have not developed this topic, or at least there is no evidence of such development. It is therefore necessary to analyze business management models at the international level in order to identify which elements or strategies can be applied at each stage of business development, based on the measurement of indicator variables of management compliance in the Colombian context.

  12. Industrial geospatial analysis tool for energy evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Alkadi, Nasr E.; Starke, Michael R.

    2016-06-28

    An industrial analytic system processes industrial data. A database engine provides access to a plurality of database management systems that serve energy consumption and product sales data. An input filter selectively passes filtered data streams comprising energy sales data, location data, and business classification code data, removing datasets that do not include energy information. A standard deviation filter removes datasets from the filtered data streams that fall outside a predetermined variation from an average value. A computation module analyzes the correlation between electrical energy consumption within a standard industrial classification code represented in the datasets and a programmable criterion.
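    The filter chain described above can be sketched in a few lines. The record fields, thresholds, and data are invented for illustration; they are not taken from the actual system.

```python
import statistics

def has_energy_info(record):
    """First filter: drop datasets with no energy information."""
    return record.get("energy_kwh") is not None

def stddev_filter(records, key, n_sigma=2.0):
    """Second filter: drop records outside n_sigma of the average value."""
    values = [r[key] for r in records]
    mean, sd = statistics.mean(values), statistics.pstdev(values)
    return [r for r in records if abs(r[key] - mean) <= n_sigma * sd]

# Synthetic datasets: an SIC code, energy consumption, and product sales.
records = [{"sic": "2611", "energy_kwh": 120.0 + i, "sales": 300.0 + i}
           for i in range(9)]
records.append({"sic": "2611", "energy_kwh": 9000.0, "sales": 310.0})  # outlier
records.append({"sic": "2611", "energy_kwh": None, "sales": 290.0})    # no energy data

clean = stddev_filter([r for r in records if has_energy_info(r)], "energy_kwh")
print(len(clean), "of", len(records), "records survive filtering")  # 9 of 11
```

    The surviving records would then feed the correlation step between energy consumption and the classification group.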

  13. Giyoo Hatano's Analysis of Psychological Tools

    Science.gov (United States)

    Cole, Michael

    2007-01-01

    This paper focuses on the relation between two areas of research which owe a great debt to the work of Giyoo Hatano: the ways in which the use of the abacus mediates arithmetic problem solving and the way in which the use of the kanji writing system mediates the interpretation of unfamiliar words. These examples are related to L. S. Vygotsky's…

  14. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks, such as, refactoring or code navigation, have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language...... is an obstacle for the development of these. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor....... In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type...
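    To make the idea concrete, here is a minimal Andersen-style points-to analysis over just two statement forms (allocation and copy). Real JavaScript pointer analyses handle field loads and stores, calls, prototypes, and much more; this fragment only illustrates the fixpoint flavor of the computation.

```python
def points_to(statements):
    """Compute a points-to map by iterating to a fixpoint.

    Statements are triples: ("new", x, site) for an allocation x = {}
    at the given abstract site, or ("copy", x, y) for x = y.
    """
    pts = {}  # variable -> set of abstract allocation sites
    changed = True
    while changed:
        changed = False
        for kind, lhs, rhs in statements:
            add = {rhs} if kind == "new" else pts.get(rhs, set())
            cur = pts.setdefault(lhs, set())
            if not add <= cur:
                cur |= add
                changed = True
    return pts

program = [("new", "a", "obj1"),   # a = {}   (allocation site obj1)
           ("copy", "b", "a"),     # b = a
           ("new", "b", "obj2"),   # b = {}   (allocation site obj2)
           ("copy", "c", "b")]     # c = b

print(points_to(program))  # c may point to obj1 or obj2
```

    A points-to map like this is the foundation on which code navigation and refactoring tools can resolve what a given expression may refer to.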

  15. A 3D image analysis tool for SPECT imaging

    Science.gov (United States)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
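    The intensity-based thresholding at the heart of the semi-automated analysis can be illustrated on a synthetic volume. The fraction-of-maximum threshold and the toy 3D image below are assumptions for the sketch, not the paper's actual parameters.

```python
def threshold_segment(volume, fraction=0.5):
    """Label voxels whose intensity is at least fraction * peak intensity."""
    peak = max(v for plane in volume for row in plane for v in row)
    thr = fraction * peak
    return [[[v >= thr for v in row] for row in plane] for plane in volume]

# Synthetic 16x16x16 "image": dim background with a bright 4x4x4 region.
N = 16
vol = [[[0.1] * N for _ in range(N)] for _ in range(N)]
for z in range(6, 10):
    for y in range(6, 10):
        for x in range(6, 10):
            vol[z][y][x] = 1.0

mask = threshold_segment(vol)
voxels = sum(v for plane in mask for row in plane for v in row)
print("Segmented voxels:", voxels)  # 64; volume = 64 * (voxel size)
```

    Multiplying the labeled voxel count by the physical voxel size gives the volume estimate used to quantify, for example, gastric volume variation.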

  16. Comparison of emerging diagnostic tools for large commercial HVAC systems

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, Hannah; Piette, Mary Ann

    2001-04-06

    Diagnostic software tools for large commercial buildings are being developed to help detect and diagnose energy and other performance problems with building operations. These software applications utilize energy management control system (EMCS) trend log data. Due to the recent development of diagnostic tools, there has been little detailed comparison among the tools and a limited awareness of tool capabilities by potential users. Today, these diagnostic tools focus mainly on air handlers, but the opportunity exists for broadening the scope of the tools to include all major parts of heating, cooling, and ventilation systems in more detail. This paper compares several tools in the following areas: (1) Scope, intent, and background; (2) Data acquisition, pre-processing, and management; (3) Problems detected; (4) Raw data visualization; (5) Manual and automated diagnostic methods and (6) Level of automation. This comparison is intended to provide practitioners and researchers with a picture of the current state of diagnostic tools. There is tremendous potential for these tools to help improve commercial building energy and non-energy performance.

  17. Integrated analysis environment for high impact systems

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, M. [Sandia National Labs., Albuquerque, NM (United States); Davis, J.; Scott, J.; Sztipanovits, J.; Karsai, G. [Vanderbilt Univ., Nashville, TN (United States). Measurement and Computing Systems Lab.

    1998-02-01

    Modeling and analysis of high consequence, high assurance systems requires special modeling considerations. System safety and reliability information must be captured in the models. Previously, high consequence systems were modeled using separate, disjoint models for safety, reliability, and security. The MultiGraph Architecture facilitates the implementation of a model integrated system for modeling and analysis of high assurance systems. Model integrated computing allows an integrated modeling technique to be applied to high consequence systems. Among the tools used for analyzing safety and reliability are a behavioral simulator and an automatic fault tree generation and analysis tool. Symbolic model checking techniques are used to efficiently investigate the system models. A method for converting finite state machine models to ordered binary decision diagrams allows the application of symbolic model checking routines to the integrated system models. This integrated approach to modeling and analysis of high consequence systems ensures consistency between the models and the different analysis tools.

  18. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  19. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul;

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database...... built up and developed during several national research projects, carried out at Risø DTU National Laboratory for Sustainable Energy and Aalborg University, in the period 2001-2007. The overall objective of these projects was to create a wind turbine model database able to support the analysis......-in models for the electrical components of a grid connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model...

  20. Precision Farming Tools. Global Positioning System (GPS)

    OpenAIRE

    Grisso, Robert D. (Robert Dwight), 1956-; Alley, Mark M.; Heatwole, Conrad D.

    2005-01-01

    By knowing location, farmers can look at the field as a group of small zones and determine whether or not the field is uniform. Computers and geographical information systems (GIS) enable producers to record location and other information. With this information, practices that may improve efficiency and increase profitability can be considered.

  1. Comparative Analysis of Condition Monitoring Schemes for CNC Machine Tool Servo Feed Systems

    Institute of Scientific and Technical Information of China (English)

    韩军; 常瑞丽

    2014-01-01

    In order to obtain methods for accurately acquiring machine tool condition monitoring signals, three schemes for acquiring machine tool condition signals were studied: external sensor signals, internal sensor signals, and servo drive monitoring port signals. All three signals are reliable information sources, but they differ in acquisition difficulty and in signal quality. The three schemes were compared and analyzed. The results provide a reference for research on online monitoring technology for the servo feed system of CNC machine tools and for improving machine tool reliability and machining quality.

  2. Online Analysis of Wind and Solar Part II: Transmission Tool

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  3. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
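    The core idea, sizing extra ramping capacity from forecast uncertainty, can be sketched with synthetic data. The Gaussian forecast errors, scenario count, and percentile choices below are illustrative assumptions, not the tool's actual statistical method.

```python
import random

random.seed(1)
hours, n_scenarios = 24, 500

# Synthetic net-load forecast errors (MW) for each hour of each scenario.
scenarios = [[random.gauss(0.0, 50.0) for _ in range(hours)]
             for _ in range(n_scenarios)]

# The extra ramp implied by forecast error is the hour-to-hour change.
ramps = sorted(s[h + 1] - s[h]
               for s in scenarios for h in range(hours - 1))
p5 = ramps[int(0.05 * len(ramps))]
p95 = ramps[int(0.95 * len(ramps))]
print(f"Extra ramp-down/up requirement (5th/95th pct): "
      f"{p5:.0f} MW / {p95:.0f} MW")
```

    An operator would then hold at least this much additional up- and down-ramping capability in reserve for the corresponding confidence level.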

  4. Enterprise KM System: IT based Tool for Nuclear Malaysia

    International Nuclear Information System (INIS)

    Implementing the right and suitable tool for an enterprise Knowledge Management (KM) system in an organization is not an easy task. Everything needs to be taken into account before the implementation can come true. One requirement is ensuring full cooperation from the entire organization to make the knowledge sharing culture built around the tool succeed. From the selection of potential tools to the deployment and implementation strategies, everything must be thoroughly and carefully organized. A study of choosing the suitable tools and strategies has been done in Nuclear Malaysia as a result of the Process Oriented Knowledge Management (POKM) project. As far as an enterprise KM system is concerned, Microsoft SharePoint technology is one of the potential tools in this context. This paper articulates the approach and methodology of choosing the technology, including its planning, deployment and implementation strategies. (author)

  5. DEVELOPMENT OF A WIRELINE CPT SYSTEM FOR MULTIPLE TOOL USAGE

    Energy Technology Data Exchange (ETDEWEB)

    Stephen P. Farrington; Martin L. Gildea; J. Christopher Bianchi

    1999-08-01

    The first phase of development of a wireline cone penetrometer system for multiple tool usage was completed under DOE award number DE-AR26-98FT40366. Cone penetrometer technology (CPT) has received widespread interest and is becoming more commonplace as a tool for environmental site characterization activities at several Department of Energy (DOE) facilities. Although CPT already offers many benefits for site characterization, the wireline system can improve CPT technology by offering greater utility and increased cost savings. Currently the use of multiple CPT tools during a site characterization (i.e. piezometric cone, chemical sensors, core sampler, grouting tool) must be accomplished by withdrawing the entire penetrometer rod string to change tools. This results in multiple penetrations being required to collect the data and samples that may be required during characterization of a site, and to subsequently seal the resulting holes with grout. The wireline CPT system allows multiple CPT tools to be interchanged during a single penetration, without withdrawing the CPT rod string from the ground. The goal of the project is to develop and demonstrate a system by which various tools can be placed at the tip of the rod string depending on the type of information or sample desired. Under the base contract, an interchangeable piezocone and grouting tool was designed, fabricated, and evaluated. The results of the evaluation indicate that success criteria for the base contract were achieved. In addition, the wireline piezocone tool was validated against ASTM standard cones, the depth capability of the system was found to compare favorably with that of conventional CPT, and the reliability and survivability of the system were demonstrated.

  6. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.;

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...... provides a description of the wind turbine modelling, both at a component level and at a system level....

  7. Analyzing Real-Time Systems: Theory and Tools

    DEFF Research Database (Denmark)

    Hune, Thomas Seidelin

    The main topic of this dissertation is the development and use of methods for formal reasoning about the correctness of real-time systems, in particular methods and tools to handle new classes of problems. In real-time systems the correctness of the system does not only depend on the order in which actions take place, but also on the timing of the actions. The formal reasoning presented here is based on (extensions of) the model of timed automata and tools supporting this model, mainly UPPAAL. Real-time systems are often part of safety-critical systems, e.g. control systems for planes, trains, or factories, though everyday electronics such as audio/video equipment and (mobile) phones are also considered real-time systems. Often these systems are concurrent systems with a number of components interacting, and reasoning about such systems is notoriously difficult. However, since most of the systems...

  8. Analysis of Facial Injuries Caused by Power Tools.

    Science.gov (United States)

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of the use of power tools and its economic impact, and the characteristics of the hand injuries caused by power saws have been described. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data in relation to facial injuries caused by power saws that were gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of the facial injuries caused by power saws. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurements and functional disabilities. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  9. Metabolic engineering with systems biology tools to optimize production of prokaryotic secondary metabolites

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Charusanti, Pep; Lee, Sang Yup;

    2016-01-01

    Metabolic engineering using systems biology tools is increasingly applied to overproduce secondary metabolites for their potential industrial production. In this Highlight, recent relevant metabolic engineering studies are analyzed with emphasis on host selection and engineering approaches...... for the optimal production of various prokaryotic secondary metabolites: native versus heterologous hosts (e.g., Escherichia coli) and rational versus random approaches. This comparative analysis is followed by discussions on systems biology tools deployed in optimizing the production of secondary metabolites....... The potential contributions of additional systems biology tools are also discussed in the context of current challenges encountered during optimization of secondary metabolite production....

  10. Metabolic engineering with systems biology tools to optimize production of prokaryotic secondary metabolites.

    Science.gov (United States)

    Kim, Hyun Uk; Charusanti, Pep; Lee, Sang Yup; Weber, Tilmann

    2016-08-27

    Covering: 2012 to 2016. Metabolic engineering using systems biology tools is increasingly applied to overproduce secondary metabolites for their potential industrial production. In this Highlight, recent relevant metabolic engineering studies are analyzed with emphasis on host selection and engineering approaches for the optimal production of various prokaryotic secondary metabolites: native versus heterologous hosts (e.g., Escherichia coli) and rational versus random approaches. This comparative analysis is followed by discussions on systems biology tools deployed in optimizing the production of secondary metabolites. The potential contributions of additional systems biology tools are also discussed in the context of current challenges encountered during optimization of secondary metabolite production. PMID:27072921

  11. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work — not only with textual representations of Petri nets — but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area. It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools.

  12. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each dataset is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  13. Method and Tools for Development of Advanced Instructional Systems

    NARCIS (Netherlands)

    Arend, J. van der; Riemersma, J.B.J.

    1994-01-01

    The application of advanced instructional systems (AISs), like computer-based training systems, intelligent tutoring systems and training simulators, is widespread within the Royal Netherlands Army. As a consequence there is a growing interest in methods and tools to develop effective and efficient AISs.

  14. Communication and control tools, systems, and new dimensions

    CERN Document Server

    MacDougall, Robert; Cummings, Kevin

    2015-01-01

    Communication and Control: Tools, Systems, and New Dimensions advocates a systems view of human communication in a time of intelligent, learning machines. This edited collection sheds new light on things as mundane yet still profoundly consequential (and seemingly "low-tech") today as push buttons, pagers and telemarketing systems. Contributors also investigate aspects of "remote control" related to education, organizational design, artificial intelligence, cyberwarfare...

  15. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and designers.

  16. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
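
    The control-flow checks this report describes (unreachable code, loop detection) boil down to graph reachability over the program's basic blocks. As a rough illustration only (the block names and edges below are invented, not output of the tool set), unreachable code can be found by a breadth-first search from the entry point:

```python
from collections import deque

# Hypothetical control-flow graph of an assembler module:
# each basic block maps to the blocks it can branch or fall through to.
cfg = {
    "entry":  ["init", "error"],
    "init":   ["loop"],
    "loop":   ["loop", "done"],   # self-edge: a syntactically infinite loop candidate
    "done":   [],
    "error":  [],
    "orphan": ["done"],           # no path from entry -> unreachable code
}

def unreachable_blocks(cfg, entry="entry"):
    """Return blocks that no path from the entry point can reach."""
    seen, work = {entry}, deque([entry])
    while work:
        for succ in cfg[work.popleft()]:
            if succ not in seen:
                seen.add(succ)
                work.append(succ)
    return sorted(set(cfg) - seen)

print(unreachable_blocks(cfg))  # -> ['orphan']
```

    The same graph supports the report's other analyses: call dependency is reachability over a call graph, and loop detection is cycle finding on these edges.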

  17. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high speed milling of titanium alloys, the high rate of tool failure is the main reason for high manufacturing cost. In this study, fractured tools used in a titanium alloy 5-axis milling process were observed both at the macro scale, using a PG-1000 light microscope, and at the micro scale, using a scanning electron microscope (SEM). These observations indicate that most of these tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed. Future analysis regarding determining the underlying causes of the fatigue phenomenon is then outlined.

  18. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  19. Nanocoatings for High-Efficiency Industrial and Tooling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Blau, P; Qu, J.; Higdon, C. (Eaton Corporation)

    2011-02-01

    tests on process variants and developed tests to better simulate the applications of interest. ORNL also employed existing lubrication models to better understand hydraulic pump frictional behavior and test results. Phase III, “Functional Testing” focused on finalizing the strategy for commercialization of AlMgB14 coatings for both hydraulic and tooling systems. ORNL continued to provide tribology testing and analysis support for hydraulic pump applications. It included both laboratory-scale coupon testing and the analysis of friction and wear data from full component-level tests performed at Eaton Corp. Laboratory-scale tribology test methods are used to characterize the behavior of nanocomposite coatings prior to running them in full-sized hydraulic pumps. This task also includes developing tribosystems analyses, both to provide a better understanding of the performance of coated surfaces in alternate hydraulic fluids, and to help design useful laboratory protocols. Analysis also includes modeling the lubrication conditions and identifying the physical processes by which wear and friction of the contact interface changes over time. This final report summarizes ORNL’s portion of the nanocomposite coatings development effort and presents both generated data and the analyses that were used in the course of this effort.

  20. Micropollutants in urban watersheds : substance flow analysis as management tool

    Science.gov (United States)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, the pollutants emitted from traffic, like copper or PAHs, get to surface water during rain events often without any treatment. Pharmaceuticals resulting from human medical treatments get to surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination by these compounds in urban areas is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to get flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we propose to test the application of SFA for a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and the water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment
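
    Substance flow analysis is, at its core, mass bookkeeping over compartments. The sketch below is a toy illustration of that idea; the compartments and all flow values are invented, not the Lausanne data:

```python
# Toy substance flow analysis: yearly copper flows (kg) between urban
# compartments. All numbers are illustrative, not measured values.
flows = [
    ("roofs/traffic", "stormwater",    900.0),
    ("households",    "sewer",         700.0),
    ("stormwater",    "surface water", 850.0),
    ("sewer",         "WWTP",          650.0),
    ("sewer",         "surface water",  50.0),   # combined sewer overflow
    ("WWTP",          "surface water", 120.0),   # incomplete elimination
    ("WWTP",          "sludge",        530.0),
]

def balance(flows, compartment):
    """Inflow, outflow and accumulation for one compartment."""
    inflow  = sum(m for _, dst, m in flows if dst == compartment)
    outflow = sum(m for src, _, m in flows if src == compartment)
    return inflow, outflow, inflow - outflow

print(balance(flows, "surface water"))  # -> (1020.0, 0, 1020.0)
```

    Checking that every intermediate compartment balances (inflow equals outflow, as for the WWTP above) is how an SFA model is validated before it is used for management scenarios.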

  1. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning; Mullins, Michael

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and designers. The paper contains a case study of three possible approaches for working with digital tectonics by means of acoustics: the architects, the architect-engineer or hybrid practitioner, and finally a prototype for a possible digital tectonic tool. For the third approach in the case study, a prototype digital tectonic tool is tested on the design...

  2. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  3. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of
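
    The "pipeline of tool building blocks" idea can be sketched abstractly as function composition; the stage names and sample data below are invented for illustration and are not the Open|SpeedShop API:

```python
from functools import reduce

# Invented building blocks; a real tool would acquire samples from the
# target program and aggregate them online across many nodes.
def acquire(_):
    # Pretend these are (function, seconds) profiling samples.
    return [("foo", 4.0), ("bar", 1.0), ("foo", 3.0)]

def aggregate(samples):
    # Online aggregation step: sum time per function.
    totals = {}
    for func, secs in samples:
        totals[func] = totals.get(func, 0.0) + secs
    return totals

def report(totals):
    # Presentation step: hottest functions first.
    return sorted(totals.items(), key=lambda kv: -kv[1])

def pipeline(*stages):
    """Compose reusable building blocks into one performance tool."""
    return lambda x: reduce(lambda acc, f: f(acc), stages, x)

profiler = pipeline(acquire, aggregate, report)
print(profiler(None))  # -> [('foo', 7.0), ('bar', 1.0)]
```

    The point of the composition is that each stage is independently replaceable, which mirrors the project's goal of assembling different tools from a common set of interoperable blocks.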

  4. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association used to process simulation data during software validation, which is an important part of the development of the physics analysis tools.

  5. IBZM tool: a fully automated expert system for the evaluation of IBZM SPECT studies

    Energy Technology Data Exchange (ETDEWEB)

    Buchert, Ralph; Wilke, Florian; Martin, Brigitte; Borczyskowski, Daniel von; Mester, Janos; Brenner, Winfried; Clausen, Malte [University Medical Center Hamburg-Eppendorf, Department of Nuclear Medicine, Hamburg (Germany); Berding, Georg [University School of Medicine Hannover, Department of Nuclear Medicine, Hannover (Germany)

    2006-09-15

    Visual reading of [123I]IBZM SPECT scans depends on the experience of the interpreter. Therefore, semi-quantification of striatal IBZM uptake is commonly considered mandatory. However, semi-quantification is time consuming and prone to error, particularly if the volumes of interest (VOIs) are positioned manually. Therefore, the present paper proposes a new software tool ("IBZM tool") for fully automated and standardised processing, evaluation and documentation of [123I]IBZM SPECT scans. The IBZM tool is an easy-to-use SPM toolbox. It includes automated procedures for realignment and summation of multiple frames (motion correction), stereotactic normalisation, scaling, VOI analysis of the striatum-to-reference ratio R, classification of R and standardised display. In order to evaluate the tool, which was developed at the University of Hamburg, it was transferred to the University of Hannover. There it was applied to 27 well-documented subjects: eight patients with multi-system atrophy (MSA), 12 patients with Parkinson's disease (PD) and seven controls. The IBZM tool was compared with manual VOI analysis. The sensitivity and specificity of the IBZM tool for the differentiation of the MSA subjects from the controls were 100% and 86%, respectively. The IBZM tool provided improved statistical power compared with manual VOI analysis. The IBZM tool is an expert system for the detection of reduced striatal D2 availability on [123I]IBZM SPECT scans. The standardised documentation supports visual and semi-quantitative evaluation, and it is useful for presenting the findings to the referring physician. The IBZM tool has the potential for widespread use, since it appears to be fairly independent of the performance characteristics of the particular SPECT system used. The tool is available free of charge. (orig.)

  6. Technology Assessment Tool - An Application of Systems Engineering to USDOE Technology Proposals

    Energy Technology Data Exchange (ETDEWEB)

    Rynearson, Michael Ardel

    1999-06-01

    This paper discusses the system design for a Technology Assessment (TA) tool that can be used to quantitatively evaluate new and advanced technologies, products, or processes. Key features of the tool include organization of information in an indentured hierarchy; questions and categories derived from the decomposition of technology performance; segregation of life-cycle issues into six assessment categories; and scoring, relative impact, and sensitivity analysis capability. An advantage of the tool's use is its ability to provide decision analysis data, based on incomplete or complete data.
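
    The scoring and sensitivity features described here amount to a weighted roll-up of category scores. A minimal sketch follows, with invented categories, scores, and weights (the actual tool organizes questions in an indentured hierarchy across six assessment categories):

```python
# Invented assessment categories, raw scores (0-10) and weights.
scores  = {"cost": 6.0, "safety": 9.0, "maturity": 4.0}
weights = {"cost": 0.5, "safety": 0.3, "maturity": 0.2}

def weighted_score(scores, weights):
    """Roll category scores up into one overall technology score."""
    return sum(scores[c] * weights[c] for c in scores)

def sensitivity(scores, weights, category, delta=1.0):
    """Change in the overall score if one category's raw score shifts by delta."""
    bumped = dict(scores, **{category: scores[category] + delta})
    return weighted_score(bumped, weights) - weighted_score(scores, weights)

print(round(weighted_score(scores, weights), 3))       # -> 6.5
print(round(sensitivity(scores, weights, "cost"), 3))  # -> 0.5
```

    A sensitivity sweep like this shows which categories dominate the decision, which is useful precisely when the input data are incomplete, as the paper notes.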

  7. Technology Assessment Tool - An Application of Systems Engineering to USDOE Technology Proposals

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Rynearson

    1999-06-01

    This paper discusses the system design of a Technology Assessment (TA) tool that can be used to quantitatively evaluate new and advanced technologies, products, or processes. Key features of the tool include organization of information in an indentured hierarchy; questions and categories derived from the decomposition of technology performance; segregation of life-cycle issues into six assessment categories; and scoring, relative impact, and sensitivity analysis capability. An advantage of the tool's use is its ability to provide decision analysis data, based on incomplete or complete data.

  8. IIS--Integrated Interactome System: a web-based platform for the annotation, analysis and visualization of protein-metabolite-gene-drug interactions by integrating a variety of data sources and tools.

    Directory of Open Access Journals (Sweden)

    Marcelo Falsarella Carazzolle

    High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives in the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and result in a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) Annotation module, which assigns annotations from several databases for the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather novel identified interactions, protein and metabolite expression/concentration levels, subcellular localization and computed topological metrics, GO biological processes and KEGG pathways enrichment. This module generates an XGMML file that can be imported into Cytoscape or be visualized directly on the web. We have developed IIS by the integration of diverse databases following the need of appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with

  9. Eating tools in hand activate the brain systems for eating action: a transcranial magnetic stimulation study.

    Science.gov (United States)

    Yamaguchi, Kaori; Nakamura, Kimihiro; Oga, Tatsuhide; Nakajima, Yasoichi

    2014-07-01

    There is increasing neuroimaging evidence suggesting that visually presented tools automatically activate the human sensorimotor system coding learned motor actions relevant to the visual stimuli. Such crossmodal activation may reflect a general functional property of the human motor memory and thus can be operating in other, non-limb effector organs, such as the orofacial system involved in eating. In the present study, we predicted that somatosensory signals produced by eating tools in hand covertly activate the neuromuscular systems involved in eating action. In Experiments 1 and 2, we measured motor evoked response (MEP) of the masseter muscle in normal humans to examine the possible impact of tools in hand (chopsticks and scissors) on the neuromuscular systems during the observation of food stimuli. We found that eating tools (chopsticks) enhanced the masseter MEPs more greatly than other tools (scissors) during the visual recognition of food, although this covert change in motor excitability was not detectable at the behavioral level. In Experiment 3, we further observed that chopsticks overall increased MEPs more greatly than scissors and this tool-driven increase of MEPs was greater when participants viewed food stimuli than when they viewed non-food stimuli. A joint analysis of the three experiments confirmed a significant impact of eating tools on the masseter MEPs during food recognition. Taken together, these results suggest that eating tools in hand exert a category-specific impact on the neuromuscular system for eating.

  10. Eating tools in hand activate the brain systems for eating action: a transcranial magnetic stimulation study.

    Science.gov (United States)

    Yamaguchi, Kaori; Nakamura, Kimihiro; Oga, Tatsuhide; Nakajima, Yasoichi

    2014-07-01

    There is increasing neuroimaging evidence suggesting that visually presented tools automatically activate the human sensorimotor system coding learned motor actions relevant to the visual stimuli. Such crossmodal activation may reflect a general functional property of the human motor memory and thus can be operating in other, non-limb effector organs, such as the orofacial system involved in eating. In the present study, we predicted that somatosensory signals produced by eating tools in hand covertly activate the neuromuscular systems involved in eating action. In Experiments 1 and 2, we measured motor evoked response (MEP) of the masseter muscle in normal humans to examine the possible impact of tools in hand (chopsticks and scissors) on the neuromuscular systems during the observation of food stimuli. We found that eating tools (chopsticks) enhanced the masseter MEPs more greatly than other tools (scissors) during the visual recognition of food, although this covert change in motor excitability was not detectable at the behavioral level. In Experiment 3, we further observed that chopsticks overall increased MEPs more greatly than scissors and this tool-driven increase of MEPs was greater when participants viewed food stimuli than when they viewed non-food stimuli. A joint analysis of the three experiments confirmed a significant impact of eating tools on the masseter MEPs during food recognition. Taken together, these results suggest that eating tools in hand exert a category-specific impact on the neuromuscular system for eating. PMID:24835403

  11. Integration between a sales support system and a simulation tool

    OpenAIRE

    Wahlström, Ola

    2005-01-01

    InstantPlanner is a sales support system for the material handling industry, visualizing and calculating designs faster and more correctly than other tools on the market. AutoMod is a world-leading simulation tool used in the material handling industry to optimize and calculate appropriate configuration designs. Both applications are favorable in their own areas and provide a great platform for integration, combining the properties of fast designing, correct product calculations, and great simulation capabilities...

  12. Topics in expert system design methodologies and tools

    CERN Document Server

    Tasso, C

    1989-01-01

    Expert Systems are so far the most promising achievement of artificial intelligence research. Decision making, planning, design, control, supervision and diagnosis are areas where they are showing great potential. However, the establishment of expert system technology and its actual industrial impact are still limited by the lack of a sound, general and reliable design and construction methodology. This book has a dual purpose: to offer concrete guidelines and tools to the designers of expert systems, and to promote basic and applied research on methodologies and tools. It is a coordinated collection...

  13. Information systems project management: methods, tools, and techniques

    OpenAIRE

    McManus, John; Wood-Harper, Trevor

    2004-01-01

    Information Systems Project Management offers a clear and logical exposition of how to plan, organise and monitor projects effectively in order to deliver quality information systems within time, to budget and quality. This new book by John McManus and Trevor Wood-Harper is suitable for upper level undergraduates and postgraduates studying project management and Information Systems. Practising managers will also find it to be a valuable tool in their work. Managing information systems pro...

  14. Tool, weapon, or white elephant? A realist analysis of the five phases of a twenty-year programme of occupational health information system implementation in the health sector

    OpenAIRE

    Spiegel Jerry M; Lockhart Karen; Dyck Carmen; Wilson Andrea; O’Hara Lyndsay; Yassi Annalee

    2012-01-01

    Abstract Background Although information systems (IS) have been extensively applied in the health sector worldwide, few initiatives have addressed the health and safety of health workers, a group acknowledged to be at high risk of injury and illness, as well as in great shortage globally, particularly in low and middle-income countries. Methods Adapting a context-mechanism-outcome case study design, we analyze our team’s own experience over two decades to address this gap: in two different Ca...

  15. Decision Making Support in Wastewater Management - Comparative analysis of techniques and tools used in centralized and decentralized system layouts, UDK 628.2

    OpenAIRE

    Harmony Musiyarira; Cornelius Chris Reynders; Prvoslav Marjanovic

    2012-01-01

    Wastewater management has been seen primarily as a technical and economic issue, but it is now recognised that these are only some of the elements in an array of other factors that affect the sustainability of wastewater systems. Literature studies point out that municipal authorities have a general and long-standing tradition of using indicators in monitoring performance, reviewing progress and reporting the state of the environment as part of regulatory compliance. However, they have neglected...

  16. ISWHM: Tools and Techniques for Software and System Health Management

    Science.gov (United States)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN&C); Selected GN&C testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next steps.

  17. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  18. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    Science.gov (United States)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.

  19. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

Full Text Available The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of AUC, the density of events in probability bins and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
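The core ROC statistics this record lists (the curve itself and the area under it) can be computed from first principles. The Python sketch below is illustrative only and is independent of the Dinamica EGO implementation the record describes; the labels and scores are made-up example data:

```python
def roc_curve(labels, scores):
    """Compute ROC points (FPR, TPR) by sweeping the score threshold."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _score, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Area under the ROC curve by the trapezoidal rule."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

labels = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
pts = roc_curve(labels, scores)
print(round(auc(pts), 4))  # 0.8889 -- one misranked negative costs 1/9 of the area
```

A perfect ranking would give AUC = 1.0; the single negative scored above a positive here reduces it to 8/9.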

  20. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  1. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
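The combination this record describes can be illustrated with a toy hybrid model. The sketch below is a hypothetical construction, not the authors' model: a discrete-event queue schedules task completions, while a "productivity" state variable is integrated continuously (Euler steps) between events, so schedule pressure feeds back into task duration. All coefficients are invented:

```python
import heapq

def simulate(n_tasks=20, base_hours=10.0, dt=0.5):
    productivity = 1.0                          # continuous state (system dynamics part)
    backlog = n_tasks
    clock = 0.0
    events = [(base_hours / productivity, 1)]   # (completion time, tasks finished)
    while backlog > 0 and events:
        t_next, done = heapq.heappop(events)    # discrete-event part
        while clock < t_next:                   # integrate productivity between events
            pressure = backlog / n_tasks
            productivity = max(0.1, productivity - dt * 0.005 * pressure * productivity)
            clock += dt
        backlog -= done
        if backlog > 0:                         # remaining tasks take longer as productivity erodes
            heapq.heappush(events, (clock + base_hours / productivity, 1))
    return clock

print(f"project finished at t = {simulate():.1f} hours (ideal: 200.0)")
```

With the feedback switched off, 20 tasks at 10 hours each would finish at exactly t = 200; the eroding productivity stretches the schedule, which is the kind of interaction effect the paper's combined model is meant to expose.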

  2. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2002-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures.Distribution System Modeling and Analysis helps prevent those errors. It gives re

  3. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species-specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large- and small-scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  4. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  5. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  6. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  7. Raman spectroscopy as an effective tool for high-resolution heavy-mineral analysis: examples from major Himalayan and Alpine fluvio-deltaic systems.

    Science.gov (United States)

    Andò, Sergio; Bersani, Danilo; Vignola, Pietro; Garzanti, Eduardo

    2009-08-01

Raman spectroscopy represents a new way to obtain detailed comprehensive information on heavy-mineral assemblages. This work presents several examples from major Alpine (Po River) and Himalayan (Ganga and Brahmaputra Rivers) fluvio-deltaic sands. Our attention was focused on the chemical properties of garnet, which is a widespread mineral in orogenic sediments, easy to identify, and relatively stable during both equatorial weathering and intrastratal dissolution. Garnet grains were studied in different samples representative of various depositional environments (fluvial bar, fluvial levee, shoreface, beach berm, eolian dune), in order to investigate the hydraulic behaviour of grains of different density under different hydrodynamic conditions. Raman spectra and semi-quantitative analysis of Raman shifts allowed us to rapidly determine the distribution of garnet types in each sample, to obtain their chemical composition, to calculate the density of each garnet, and finally to infer their respective provenance. This manuscript presents one possible application of the "MIRAGEM" method described by Bersani et al. in this volume. References, data sets and details of the analytical routine are explained at length in the above-mentioned work. PMID:19111499

  8. DESIGN AND CAD SYSTEM OF THE TOOL FOR DRILL FLUTE

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Based on the principles of differential geometry and kinematics, a mathematical model is developed to describe the grinding wheel axial cross-section corresponding to the radial cross-section of the flute in a given drill, under the basic engagement condition between the generating flute and the generated grinding wheel (or disk milling tool). The mathematical model applies to flutes whose radial cross-section consists of three arcs. Furthermore, a CAD system is also developed to represent the axial cross-section of the grinding wheel (or disk milling tool). With the system, the grinding wheel (or disk milling tool) axial cross-section that corresponds to the three-arc flute cross-section can be conveniently simulated. Through grinding experiments on drill flutes, the method and the CAD system are proved to be feasible and reasonable.

  9. High-Performance Integrated Virtual Environment (HIVE Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  10. An Automated Tool for Supporting FMEAs of Digital Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.; Lehner, J.

    2008-09-07

Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was proposed for such a purpose by decomposing the system into a level of the generic digital components and propagating failure modes to the system level, which generally is time-consuming and difficult to implement. To overcome the associated issues of implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is essentially a simulation platform, developed by reusing or recreating the original source code of the different software modules, which are interfaced through input and output variables representing the physical signals exchanged between the modules, the system, and the controlled process. For any given failure mode, its impacts on the associated signals are determined first, and the variables that correspond to these signals are modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
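The failure-mode-propagation idea can be sketched generically. The toy model below is not the DFWCS code: it runs a crude proportional level controller, overrides one signal per postulated failure mode, and applies a loss-of-control criterion analogous to the system-failure definition in the record. All signal names and coefficients are invented:

```python
def run_loop(fault=None, steps=200, setpoint=50.0):
    """Return True if the automatic control function is maintained, False = system failure."""
    level = 40.0
    for _ in range(steps):
        # failure-mode injection: override the sensor signal
        sensor = 100.0 if fault == "sensor_stuck_high" else level
        # crude proportional control of a fill valve, clamped to [0, 1]
        valve = min(1.0, max(0.0, 0.5 + 0.02 * (setpoint - sensor)))
        # failure-mode injection: override the actuator signal
        inflow = 0.0 if fault == "valve_fail_closed" else 2.0 * valve
        level = max(0.0, level + inflow - 1.0)   # constant outflow of 1.0
    # system-failure criterion: did the level stay near the setpoint?
    return abs(level - setpoint) < 5.0

print(run_loop())                      # True  -- no fault, control maintained
print(run_loop("sensor_stuck_high"))   # False -- valve closes, tank drains
```

Each candidate failure mode maps to one signal override; sweeping the list of modes and recording which ones trip the criterion yields the FMEA table the record describes.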

  11. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    Science.gov (United States)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30° sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15° of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
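The arc geometry described in this record is simple to reproduce. The sketch below is a hypothetical reconstruction, not AMU code: it places points along each one-, two- and three-hour arc in the upwind direction (meteorological convention: wind direction is where the wind blows from), spanning the 30-degree sector, on a flat-earth approximation in nautical miles:

```python
import math

def arc_points(site_xy, wind_dir_deg, wind_speed_kt, hours=(1, 2, 3),
               half_width_deg=15, n=7):
    """Return {hours: [(x, y), ...]} in n mi; compass bearing 0 = north, 90 = east."""
    x0, y0 = site_xy
    arcs = {}
    for h in hours:
        radius = wind_speed_kt * h              # kt x hours = nautical miles travelled
        step = 2 * half_width_deg / (n - 1)
        pts = []
        for i in range(n):
            brg = math.radians(wind_dir_deg - half_width_deg + i * step)
            pts.append((x0 + radius * math.sin(brg), y0 + radius * math.cos(brg)))
        arcs[h] = pts
    return arcs

# westerly flow at 40 kt: the three-hour arc lies 120 n mi west of the site
arcs = arc_points((0.0, 0.0), wind_dir_deg=270, wind_speed_kt=40)
print(round(arcs[3][3][0], 1))   # -120.0 (arc midpoint, due west of the site)
```

Plotting the sector edges from the site out to the three-hour arc reproduces the wedge-shaped overlay the tool draws.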

  12. Multilingual lexicon design tool and database management system for MT

    CERN Document Server

    Barisevičius, G

    2011-01-01

The paper presents the design and development of an English-Lithuanian-English dictionary lexicon tool and a lexicon database management system for MT. The system is oriented to support two main requirements: to be open to the user, and to describe many more attributes of the parts of speech than a regular dictionary, as required for MT. The Java programming language and the MySQL database management system are used to implement the design tool and the lexicon database, respectively. This solution allows the system to be easily deployed on the Internet. The system is able to run on any OS where the Java Virtual Machine is supported, such as Windows, Linux, Mac and others. Since a modern lexicon database management system is used, several users can access the same database without difficulty.

  13. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  14. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations by predicting where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  15. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

Full Text Available The paper presents models which may be applied as tools of analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of network organisation analysis. Then the study characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  16. DYNAMICS ANALYSIS OF SPECIAL STRUCTURE OF MILLING-HEAD MACHINE TOOL

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

The milling-head machine tool is a sophisticated and high-quality machine tool of which the spindle system is made up of a special multi-element structure. Two special mechanical configurations degrade the cutting performance of the machine tool. One is the milling-head spindle supported on two sets of complex bearings. The mechanical dynamic rigidity of the milling-head structure is researched on a designed digital prototype with finite element analysis (FEA) and modal synthesis analysis (MSA) to identify the weak structures. The other is the ram structure from which the milling head hangs. The structure is researched to obtain its dynamic performance in cutting at different ram extension positions. The analysis results on the spindle and ram are used to improve the mechanical configurations and structure in design. The machine tool built with the modified structure achieves better dynamic rigidity than before.

  17. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as material science, biology, geochemistry and so on. However, given the advantages of NAA, samples available only in small amounts, or precious samples, are the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a vast amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, while carbonic materials and heavy metals are concentrated in PM in urban areas because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air change with the season. For example, crustal elements increase in the dry season, and sea-salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine portion of PM, and increase in concentration during the winter season, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples, and source apportionment techniques are useful. (author)
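One standard concentration-ratio technique alluded to in this record is the crustal enrichment factor, EF = (X/Al)_sample / (X/Al)_crust: values near 1 suggest a crustal origin, values well above 10 suggest anthropogenic enrichment. The sketch below uses illustrative round numbers for crustal abundances and an invented PM sample, not reference data:

```python
# Illustrative crustal abundances (ppm); round numbers, not reference values.
CRUST_PPM = {"Al": 80000.0, "Fe": 50000.0, "Pb": 15.0, "Zn": 70.0}

def enrichment_factor(sample_ng_m3, element, reference="Al"):
    """EF relative to a crustal reference element (conventionally Al)."""
    ratio_sample = sample_ng_m3[element] / sample_ng_m3[reference]
    ratio_crust = CRUST_PPM[element] / CRUST_PPM[reference]
    return ratio_sample / ratio_crust

pm = {"Al": 500.0, "Fe": 400.0, "Pb": 30.0, "Zn": 60.0}   # ng/m3, invented sample
for el in ("Fe", "Pb", "Zn"):
    print(el, round(enrichment_factor(pm, el), 1))
# Fe 1.3   -> crustal; Pb 320.0 and Zn 137.1 -> anthropogenic enrichment
```

The same ratio logic applied across many elements is the starting point for the source apportionment the record mentions.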

  18. A Spreadsheet Teaching Tool For Analysis Of Pipe Networks

    OpenAIRE

    El Bahrawy, Aly N.

    1997-01-01

    Spreadsheets are used widely in engineering to perform several analysis and design calculations. They are also very attractive as educational tools due to their flexibility and efficiency. This paper demonstrates the use of spreadsheets in teaching the analysis of water pipe networks, which involves the calculation of pipe flows or nodal heads given the network layout, pipe characteristics (diameter, length, and roughness), in addition to external flows. The network performance is better und...
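The loop-balancing calculation such a spreadsheet automates is classically done with the Hardy Cross method: head losses around each loop are summed and a flow correction is applied until the loop balances. A minimal single-loop sketch in Python, with invented resistance coefficients:

```python
def hardy_cross(K, Q, tol=1e-8, max_iter=100):
    """Balance one loop: K are pipe resistance coefficients, Q are signed flows."""
    for _ in range(max_iter):
        h = [k * q * abs(q) for k, q in zip(K, Q)]      # head loss per pipe, h = K*Q|Q|
        dh = [2 * k * abs(q) for k, q in zip(K, Q)]     # d(head loss)/dQ
        dQ = -sum(h) / sum(dh)                          # loop correction
        Q = [q + dQ for q in Q]                         # same correction for every pipe
        if abs(dQ) < tol:
            break
    return Q

# three pipes around one loop, signed flows from an initial guess
Q = hardy_cross(K=[2.0, 5.0, 1.0], Q=[1.0, -0.5, -0.5])
print([round(q, 4) for q in Q])   # [0.951, -0.549, -0.549]
```

At convergence the head losses around the loop sum to zero, which is exactly the condition the spreadsheet's iterative recalculation enforces; real networks repeat the correction over several interlocking loops.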

  19. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic...... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.

  20. Towards a new tool for measuring Safety Management Systems performance

    OpenAIRE

    Cambon, Julien; Guarnieri, Franck; Groeneweg, Jop

    2006-01-01

Available on: http://www.resilience-engineering.org/REPapers/Cambon_Guarnieri_Groeneweg_P.pdf International audience This paper deals with the assessment of Safety Management Systems performance and presents a new tool developed for that purpose. It recognizes two dimensions in an SMS: a structural facet corresponding to the formal description of the system, and an operational one focused on the system's influence on the working environment and practices of people. Building up the operati...

  1. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology, a...

  2. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  3. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    Science.gov (United States)

    Dove, Edwin; Hughes, Steve

    2007-01-01

The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  4. Dynamic wind turbine models in power system simulation tool DIgSILENT

    OpenAIRE

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar; Iov, F.; Blaabjerg, F.

    2004-01-01

The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The repo...

  5. A Covariance Analysis Tool for Assessing Fundamental Limits of SIM Pointing Performance

    Science.gov (United States)

    Bayard, David S.; Kang, Bryan H.

    2007-01-01

    This paper presents a performance analysis of the instrument pointing control system for NASA's Space Interferometer Mission (SIM). SIM has a complex pointing system that uses a fast steering mirror in combination with a multirate control architecture to blend feed forward information with feedback information. A pointing covariance analysis tool (PCAT) is developed specifically to analyze systems with such complexity. The development of PCAT as a mathematical tool for covariance analysis is outlined in the paper. PCAT is then applied to studying performance of SIM's science pointing system. The analysis reveals and clearly delineates a fundamental limit that exists for SIM pointing performance. The limit is especially stringent for dim star targets. Discussion of the nature of the performance limit is provided, and methods are suggested to potentially improve pointing performance.
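As a rough illustration of the covariance-analysis machinery a tool like PCAT builds on, the steady-state covariance of a stable discrete-time linear system can be obtained by iterating the Lyapunov recursion. The sketch below uses a hypothetical two-state pointing-error model, not SIM's actual dynamics:

```python
import numpy as np

def steady_state_covariance(A, Q, tol=1e-12, max_iter=10000):
    """Iterate the discrete Lyapunov recursion P <- A P A^T + Q until it
    converges; valid when A is stable (spectral radius < 1)."""
    P = np.zeros_like(Q)
    for _ in range(max_iter):
        P_next = A @ P @ A.T + Q
        if np.max(np.abs(P_next - P)) < tol:
            return P_next
        P = P_next
    raise RuntimeError("did not converge; is A stable?")

# Hypothetical 2-state model: slow drift coupled to fast jitter.
A = np.array([[0.95, 0.10],
              [0.00, 0.50]])
Q = np.diag([1e-4, 1e-2])          # process-noise covariance
P = steady_state_covariance(A, Q)  # steady-state pointing-error covariance
```

The diagonal of `P` gives the steady-state variance of each error state, which is the quantity a performance-limit analysis compares against pointing requirements.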

  6. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit – SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity, applications.
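The combination of finite state machines and rule-based recovery described above can be illustrated with a minimal sketch. The states, events, and class names below are hypothetical and bear no relation to the SMI++ API:

```python
class SubSystem:
    """A decentralized entity: a finite state machine whose transitions
    are driven by a rule table reacting to reported events."""
    def __init__(self, name):
        self.name = name
        self.state = "READY"
        # rule table: (current state, event) -> next state
        self.rules = {
            ("READY", "start"): "RUNNING",
            ("RUNNING", "error"): "ERROR",
            ("ERROR", "recover"): "READY",  # automatic error recovery
        }

    def handle(self, event):
        # unknown (state, event) pairs leave the state unchanged
        self.state = self.rules.get((self.state, event), self.state)
        return self.state

class Hierarchy:
    """Parent node: propagates commands down and reacts to children's states."""
    def __init__(self, children):
        self.children = children

    def command(self, event):
        for c in self.children:
            c.handle(event)

    def recover_errors(self):
        # rule: any child found in ERROR is automatically told to recover
        for c in self.children:
            if c.state == "ERROR":
                c.handle("recover")

subs = [SubSystem("tracker"), SubSystem("calo")]
top = Hierarchy(subs)
top.command("start")     # both children -> RUNNING
subs[0].handle("error")  # one child fails
top.recover_errors()     # hierarchy restores it to READY
```

The essential point the sketch captures is that recovery is a rule evaluated at the parent level, not a procedure hard-wired into each child.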

  7. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit - SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity, applications.

  8. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
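The core idea of combining two diagnostics that measure the same quantity can be sketched as follows: for independent Gaussian measurements and a flat prior, the Bayesian posterior reduces to an inverse-variance weighted mean, with a posterior uncertainty smaller than either input. The temperatures and uncertainties below are hypothetical, and the real tool handles far more structure (including the MCMC sampling mentioned above):

```python
import numpy as np

def combine_gaussian(mu1, sigma1, mu2, sigma2):
    """Posterior mean and standard deviation for a common quantity given
    two independent Gaussian measurements and a flat prior."""
    w1, w2 = 1.0 / sigma1**2, 1.0 / sigma2**2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    sigma = (w1 + w2) ** -0.5
    return mu, sigma

# Hypothetical SXR and TS electron-temperature estimates (eV)
mu, sigma = combine_gaussian(820.0, 40.0, 790.0, 60.0)
```

The combined estimate lands between the two inputs, pulled toward the more precise one, and its uncertainty is below both; this is the "more precise than either" property the abstract describes.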

  9. An integrated data analysis tool for improving measurements on the MST RFP

    International Nuclear Information System (INIS)

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method

  10. Danish heat atlas as a support tool for energy system models

    DEFF Research Database (Denmark)

    Petrovic, Stefan; Karlsson, Kenneth Bernard

    2014-01-01

infrastructure investments, such as the expansion of district heating networks and the introduction of significant heat saving measures, require a highly detailed decision-support tool. A Danish heat atlas provides a highly detailed database with extensive information about more than 2.5 million buildings in Denmark....... Energy system analysis tools incorporate environmental, economic, energy and engineering analysis of future energy systems and are considered crucial for the quantitative assessment of transitional scenarios towards future milestones, such as EU 2020 goals and Denmark’s goal of achieving fossil free...

  11. A Scheduling System Based on Rules of the Machine Tools in FMS

    Institute of Scientific and Technical Information of China (English)

    LI De-xin; ZHAO Hua-qun; JIA Jie; LU Yan-jun

    2003-01-01

In this paper, a model of the scheduling of machine tools in the flexible manufacturing line is presented through intensive analysis and research of the mathematical methods of traditional scheduling. The various factors relating to machine tools in the flexible manufacturing line are fully considered in this system. Based on this model, an intelligent decision system integrating rules and simulation technology is constructed using the OOP (Object-Oriented Programming) method, and a simulation experiment analysis is carried out. The results show that the model performs well in practice.

  12. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  13. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  14. A study of an intelligent FME system for SFCR tools

    International Nuclear Information System (INIS)

In the nuclear field, the accurate identification, tracking and history documentation of every nuclear tool, equipment or component is a key to safety, operational and maintenance excellence, and security of the nuclear reactor. This paper offers a study of the possible development of the present Foreign Material Exclusion (FME) system using an Intelligent Nuclear Tools Identification System, (INTIS), that was created and customized for the Single Fuel Channel Replacement (SFCR) Tools. The conceptual design of the INTIS was presented comparing the current and the proposed systems in terms of the time, the cost and the radiation doses received by the employees during the SFCR maintenance jobs. A model was created to help better understand and analyze the effects of deployment of the INTIS on the time, performance, accuracy, received dose and finally the total cost. The model may be also extended to solve other nuclear applications problems. The INTIS is based on Radio Frequency Identification (RFID) Smart Tags which are networked with readers and service computers. The System software was designed to communicate with the network to provide the coordinate information for any component at any time. It also allows digital signatures for use and/or approval to use the components and automatically updates their Data Base Management Systems (DBMS) history in terms of the person performing the job, the time period and date of use. This feature together with the information of part's life span could be used in the planning process for the predictive and preventive maintenance. As a case study, the model was applied to a pilot project for SFCR Tools FME. The INTIS automatically records all the tools to be used inside the vault and make real time tracking of any misplaced tool. It also automatically performs a continuous check of all tools, sending an alarm if any of the tools was left inside the vault after the job is done. Finally, a discussion of the results of the system

  15. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  16. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  17. Toward the development of an image quality tool for active millimeter wave imaging systems

    Science.gov (United States)

    Barber, Jeffrey; Weatherall, James C.; Greca, Joseph; Smith, Barry T.

    2015-05-01

    Preliminary design considerations for an image quality tool to complement millimeter wave imaging systems are presented. The tool is planned for use in confirming operating parameters; confirmation of continuity for imaging component design changes, and analysis of new components and detection algorithms. Potential embodiments of an image quality tool may contain materials that mimic human skin in order to provide a realistic signal return for testing, which may also help reduce or eliminate the need for mock passengers for developmental testing. Two candidate materials, a dielectric liquid and an iron-loaded epoxy, have been identified and reflection measurements have been performed using laboratory systems in the range 18 - 40 GHz. Results show good agreement with both laboratory and literature data on human skin, particularly in the range of operation of two commercially available millimeter wave imaging systems. Issues related to the practical use of liquids and magnetic materials for image quality tools are discussed.

  18. A thermodynamic evaluation of chilled water central air conditioning systems using artificial intelligence tools

    Directory of Open Access Journals (Sweden)

    Juan Carlos Armas

    2011-05-01

An analysis of a chilled water central air conditioning system is presented. The objective was to calculate the irreversibility of the main cycle components, as well as to evaluate this indicator's sensitivity to operational variations. Artificial neural networks (ANN), genetic algorithms (GA) and Matlab tools were used to calculate refrigerant thermodynamic properties during each cycle stage. These tools interacted with equations describing the system's thermodynamic behaviour. The refrigerant temperature at compressor discharge was determined by a hybrid model combining the neural model with a simple genetic algorithm used as an optimisation tool; the cycle components most sensitive to changes in working conditions were identified. It was concluded that the compressor, evaporator and expansion mechanism (in that order) represented significant exergy losses, reaching 85.62% of total system irreversibility. A very useful tool was thus developed for evaluating these systems.

  19. Power electronic systems Walsh analysis with Matlab

    CERN Document Server

    Deb, Anish

    2014-01-01

A Totally Different Outlook on Power Electronic System Analysis: Power Electronic Systems: Walsh Analysis with MATLAB® builds a case for Walsh analysis as a powerful tool in the study of power electronic systems. It considers the application of Walsh functions in analyzing power electronic systems, and the advantages offered by Walsh domain analysis of power electronic systems. Solving power electronic systems in an unconventional way, this book successfully integrates power electronics as well as systems and control, incorporating a complete orthonormal function set very much unlike the sine-cosin
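A quick way to see the orthogonal function set involved: Walsh functions sampled at 2^n points form the rows of a Hadamard matrix (up to ordering), which the Sylvester construction generates recursively. This is a minimal sketch of the construction and its orthogonality property, unrelated to the book's MATLAB® code:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of a 2^n x 2^n Hadamard matrix; its rows,
    suitably reordered, are sampled Walsh functions taking values +/-1."""
    H = np.array([[1.0]])
    for _ in range(n):
        H = np.block([[H, H],
                      [H, -H]])
    return H

H = hadamard(3)   # 8 x 8 matrix of sampled Walsh functions
G = H @ H.T       # orthogonality: rows are mutually orthogonal, norm^2 = 8
```

Because `H @ H.T` is a scaled identity, expanding a signal in Walsh functions and inverting the expansion are both single matrix products, which is what makes Walsh-domain analysis computationally attractive.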

  20. SMART: Statistical Metabolomics Analysis-An R Tool.

    Science.gov (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^(-4) in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^(-4) in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm . PMID:27248514
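The association analysis described, an ANCOVA-style adjusted comparison, can be sketched in miniature: regress the metabolite level on treatment plus a covariate and read off the treatment coefficient. The data below are synthetic with a known effect of 2.0; SMART itself is an R tool with far richer adjustments (latent groups, hidden substructures):

```python
import numpy as np

def adjusted_effect(y, treatment, covariate):
    """OLS estimate of the treatment effect on y, adjusting for a covariate:
    the treatment coefficient in the model y ~ 1 + treatment + covariate."""
    X = np.column_stack([np.ones_like(y), treatment, covariate])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Synthetic example: hypothetical drug indicator and one covariate.
rng = np.random.default_rng(0)
n = 200
drug = rng.integers(0, 2, n).astype(float)            # on/off medication
age = rng.normal(size=n)                              # confounding covariate
metabolite = 2.0 * drug + 0.5 * age + rng.normal(scale=0.1, size=n)

effect = adjusted_effect(metabolite, drug, age)       # recovers ~2.0
```

The same regression framing underlies the p-values quoted in the abstract; a full ANCOVA additionally tests the coefficient against its standard error.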

  1. An Intelligent Tool to support Requirements Analysis and Conceptual Design of Database Design

    Institute of Scientific and Technical Information of China (English)

    王能斌; 刘海青

    1991-01-01

As an application of artificial intelligence and expert system technology to database design, this paper presents an intelligent design tool NITDT, which comprises a requirements specification language NITSL, a knowledge representation language NITKL, and an inference engine with uncertainty reasoning capability. NITDT now covers the requirements analysis and conceptual design of database design. However, it is possible to integrate it with another database design tool, NITDBA, also developed at NIT, to become an integrated design tool supporting the whole process of database design.

  2. Spotlighting fantasy literature with the tools of Frame Semantics and Systemic Functional Linguistics: A case study

    OpenAIRE

    Luporini, Antonella

    2016-01-01

    This paper presents a dual approach to the stylistic analysis of literary texts, focusing on fantasy literature, and deploying the tools provided by two arguably complementary theoretical-descriptive models: Frame Semantics (FS; Fillmore 1985; 2006 [1982]; Fillmore and Baker 2010) and the system of TRANSITIVITY as developed within Systemic Functional Linguistics (SFL; Halliday and Matthiessen 1999; 2014). The frameworks are applied to the analysi...

  3. A DVD authoring tool for GNU/Linux desktop systems

    OpenAIRE

    Mas i Hernàndez, Jordi

    2009-01-01

Project called Mistelix, an open source DVD authoring tool for GNU/Linux systems.

  4. Faculty Usage of Library Tools in a Learning Management System

    Science.gov (United States)

    Leeder, Chris; Lonn, Steven

    2014-01-01

    To better understand faculty attitudes and practices regarding usage of library-specific tools and roles in a university learning management system, log data for a period of three semesters was analyzed. Academic departments with highest rates of usage were identified, and faculty users and nonusers within those departments were surveyed regarding…

  5. ASAP methodology and tools of the company EPAM Systems

    OpenAIRE

    Loginovskaja, Anna

    2008-01-01

The thesis answers the following questions: why do companies that implement SAP software use the Accelerated SAP methodology rather than international project management standards; why has SAP described its own methodology, and what benefits does a company gain from having its own methodology and managing projects according to it; and what benefits does EPAM Systems derive from its own project management tools?

  6. Developing a Decision Support System: The Software and Hardware Tools.

    Science.gov (United States)

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  7. Monitoring SOA Applications with SOOM Tools: A Competitive Analysis

    OpenAIRE

    Ivan Zoraja; Goran Trlin; Marko Matijević

    2013-01-01

Background: Monitoring systems decouple monitoring functionality from application and infrastructure layers and provide a set of tools that can invoke operations on the application to be monitored. Objectives: Our monitoring system is a powerful yet agile solution that can observe and manipulate SOA (Service-oriented Architecture) applications online. The basic monitoring functionality is implemented via lightweight components inserted into SOA frameworks, thereby keeping the monitoring...

  8. Film analysis systems and applications

    Energy Technology Data Exchange (ETDEWEB)

    Yonekura, Y.; Brill, A.B.

    1981-01-01

The different components that can be used in modern film analysis systems are reviewed. TV camera and charge-coupled device sensors coupled to computers provide low cost systems for applications such as those described. The autoradiography (ARG) method provides an important tool for medical research and is especially useful for the development of new radiopharmaceutical compounds. Biodistribution information is needed for estimation of radiation dose, and for interpretation of the significance of observed patterns. The need for such precise information is heightened when one seeks to elucidate physiological principles/factors in normal and experimental models of disease. The poor spatial resolution achieved with current PET-imaging systems limits the information on radioreceptor mapping, neurotransmitter, and neuroleptic drug distribution that can be achieved from patient studies. The artful use of ARG in carefully-controlled animal studies will be required to provide the additional information needed to fully understand results obtained with this new important research tool. (ERB)

  9. Film analysis systems and applications

    International Nuclear Information System (INIS)

The different components that can be used in modern film analysis systems are reviewed. TV camera and charge-coupled device sensors coupled to computers provide low cost systems for applications such as those described. The autoradiography (ARG) method provides an important tool for medical research and is especially useful for the development of new radiopharmaceutical compounds. Biodistribution information is needed for estimation of radiation dose, and for interpretation of the significance of observed patterns. The need for such precise information is heightened when one seeks to elucidate physiological principles/factors in normal and experimental models of disease. The poor spatial resolution achieved with current PET-imaging systems limits the information on radioreceptor mapping, neurotransmitter, and neuroleptic drug distribution that can be achieved from patient studies. The artful use of ARG in carefully-controlled animal studies will be required to provide the additional information needed to fully understand results obtained with this new important research tool

  10. Simulation as a decision support tool in maintenance float systems

    OpenAIRE

    Pereira, Guilherme; Peito, Francisco; Leitão, Armando; Dias, Luís M. S.

    2011-01-01

This paper is concerned with the use of simulation as a decision support tool in maintenance systems, specifically in MFS (Maintenance Float Systems). For this purpose and due to its high complexity, in this paper the authors explore and present a possible way to construct an MFS model using Arena® simulation language, where some of the most common performance measures are identified, calculated and analysed.

  11. Simulation as a decision support tool in maintenance float systems

    OpenAIRE

    Peito, Francisco; Pereira, Guilherme; Leitão, Armando; Dias, Luís M. S.

    2011-01-01

    This paper is concerned with the use of simulation as a decision support tool in maintenance systems, specifically in MFS (Maintenance Float Systems). For this purpose and due to its high complexity, in this paper the authors explore and present a possible way to construct a MFS model using Arena® simulation language, where some of the most common performance measures are identified, calculated and analysed.

  12. AstroStat - A VO Tool for Statistical Analysis

    CERN Document Server

    Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

    2015-01-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu driven interface. Behind the scenes, all analysis is done using the public domain statistical software - R and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features - as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing power statistical analysis on ...

  13. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
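The kind of ANOVA comparison used in the study can be sketched as a one-way F statistic across vendors' measurements of the same nodules. The measurements below are hypothetical, not the study data:

```python
import numpy as np

def one_way_anova_F(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square, for k groups of measurements."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    data = np.concatenate(groups)
    grand = data.mean()
    k, N = len(groups), data.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# Hypothetical maximum-diameter measurements (mm) of the same four
# nodules as reported by three vendors' automated tools.
vendor_a = np.array([6.1, 8.3, 10.2, 12.5])
vendor_b = np.array([6.8, 9.0, 11.1, 13.4])
vendor_c = np.array([5.9, 8.1, 10.0, 12.2])

F = one_way_anova_F([vendor_a, vendor_b, vendor_c])
```

A large F relative to the F distribution's critical value indicates that the vendors' mean measurements differ by more than within-vendor scatter explains, which is the pattern the study reports for diameters but not for volumes.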

  14. NC flame pipe cutting machine tool based on open architecture CNC system

    Institute of Scientific and Technical Information of China (English)

    Xiaogen NIE; Yanbing LIU

    2009-01-01

Based on an analysis of the principle and flame movement of a pipe cutting machine tool, a retrofit NC flame pipe cutting machine tool (NFPCM) that can meet the demands of cutting various pipes is proposed. The paper deals with the design and implementation of an open architecture CNC system for the NFPCM, which is similar in many respects to a milling machine but differs in its machining process and control strategy. The paper emphasizes the NC system structure and the method for directly creating the NC file according to the cutting type and parameters. Further, the paper develops the program and sets up the open, modular NC system.

  15. The study of opened CNC system of turning-grinding composite machine tool based on UMAC

    Science.gov (United States)

    Wang, Hongjun; Han, Qiushi; Wu, Guoxin; Ma, Chao

    2011-05-01

A general function analysis of a turning-grinding composite machine tool (TGCM) is carried out, and the structure of the TGCM, based on the theory of 'process integration with one setup', is presented. The CNC system functions of the TGCM are analyzed and its CNC framework is discussed. Finally, an open CNC system for this machine tool, comprising both hardware and software, is developed based on UMAC (Universal Motion and Automation Controller). The hardware structure layout is put forward, with the hardware composed of an IPC and the UMAC, and the software system is implemented using VC++ 6.0. The resulting control system meets the requirements of integrated machining and matches the hardware structure of the TGCM. Practical machining experiments showed that the system is valid, with high accuracy and high reliability.

  16. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
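The slope-break idea behind such a tool can be sketched in a few lines. This is an illustrative toy, not the actual Knickpoint Finder algorithm: it flags points on a longitudinal stream profile where the channel steepens abruptly downstream, and the `ratio` threshold is an invented parameter.

```python
def find_knickpoints(distance, elevation, ratio=2.0):
    """Return indices where the downstream slope exceeds `ratio` times
    the upstream slope, a crude proxy for a relief breakpoint."""
    knicks = []
    for i in range(1, len(distance) - 1):
        up = (elevation[i - 1] - elevation[i]) / (distance[i] - distance[i - 1])
        down = (elevation[i] - elevation[i + 1]) / (distance[i + 1] - distance[i])
        if up > 0 and down > ratio * up:
            knicks.append(i)
    return knicks

# Synthetic profile (m): gentle gradient, then a sharp step after 300 m
dist = [0, 100, 200, 300, 400, 500]
elev = [50.0, 49.0, 48.0, 47.0, 30.0, 29.0]
print(find_knickpoints(dist, elev))  # -> [3]
```

A real implementation would additionally walk the DEM-derived drainage network and normalize slopes by drainage area, but the detection step reduces to a comparison like the one above.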

  17. AUTOMATIC TOOL-CHANGING WITHIN THE RECONFIGURABLE MANUFACTURING SYSTEMS PARADIGM

    Directory of Open Access Journals (Sweden)

    J.E.T. Collins

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer requirements present in today’s global market. The systems are designed to offer adaptability in machining functions and processes. This adaptive capability requires access to a selection of tools. The development of reconfigurable manufacturing systems has mainly been focused on the machine tools themselves. Methods of supplying tools to these machines need to be researched. This paper does so, presenting a tool-changing unit that offers a solution to this need. It then discusses the enabling technologies that would allow for automatic integration and diagnostic abilities of the unit.

AFRIKAANSE OPSOMMING: Reconfigurable manufacturing systems were developed as a proposed solution to the varying market and customer needs in today's global market. The systems were developed to offer adaptability with respect to machining functions and processes. These adaptive capabilities, however, require access to a variety of tools. The development of reconfigurable manufacturing systems has nevertheless focused mainly on the tools themselves. The way in which these tools are made available to the machinery still needs to be researched. This article does exactly that and proposes a unit for the exchange of tools. The technologies that enable automatic integration and provide diagnostic capabilities are then discussed.

  18. Physics Analysis Tools for the CMS experiment at LHC

    CERN Document Server

    Fabozzi, Francesco; Hegner, Benedikt; Lista, Luca

    2008-01-01

The CMS experiment is expected to start data taking during 2008, and large data samples, on the petabyte scale, will be produced each year. The CMS Physics Tools package provides the CMS physicist with a powerful and flexible software layer for the analysis of these huge datasets that is well integrated into the CMS experiment software. A core part of this package is the Candidate Model, which provides a coherent interface to different types of data. Standard tasks such as combinatorial analyses, generic cuts, MC truth matching, and constrained fitting are supported. Advanced template techniques enable the user to add missing features easily. We explain the underlying model and certain details of the implementation, and present some use cases showing how the tools are currently used in generator and full simulation studies in preparation for the analysis of real data.

  19. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of the framework. The issue of commercial simulators or software providing the necessary features for product-process synthesis-design as opposed to their development by the academic PSE community will also be discussed. An example of a successful collaboration between academia-industry for the development...

  20. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Science.gov (United States)

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were (1) a checklist without risk estimation (Tool A), (2) a checklist with a risk scale (Tool B), (3) a risk calculation without a formal hazard identification stage (Tool C), and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of differing natures. Tools C and D utilized more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350
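The risk-matrix step of a tool like Tool D can be sketched as follows. The 1-4 scales and the band thresholds are invented for the example and are not taken from the tools studied in the paper.

```python
def risk_level(severity, likelihood):
    """Map severity and likelihood scores (1-4 each, hypothetical
    scales) to a risk band via the product score of a risk matrix."""
    score = severity * likelihood
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Example: oxygen-deficient atmosphere judged severe (4) and likely (3)
print(risk_level(4, 3))   # -> high
print(risk_level(2, 2))   # -> low
```

The checklist tools (A and B) skip this scoring step entirely, which is why they run faster but leave more of the judgment to the user.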

  1. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, an intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources and some applications is given.

  2. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

In this paper, we introduce the design and implementation of ParaVT, a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottlenecks of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  3. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel s...... and tools are crucial for genome mining. In this review, a comprehensive overview is given of programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data.

  4. Validation of retrofit analysis simulation tool: Lessons learned

    OpenAIRE

    Trcka, Marija; Pasini, Jose Miguel; Oggianu, Stella Maris

    2014-01-01

It is well known that residential and commercial buildings account for about 40% of the overall energy consumed in the United States, and about the same percentage of CO2 emissions. Retrofitting existing old buildings, which account for 99% of the building stock, represents the best opportunity for achieving challenging energy and emission targets. United Technologies Research Center (UTC) has developed a methodology and tool that provides computational support for analysis and decision-making...

  5. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

Magill, Joseph; Dreher, Raymond; Soti, Zsolt; Lasche, George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  6. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
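One of the stoichiometric-network computations mentioned above, finding steady-state flux modes, can be illustrated outside the SBRT itself. The toy pathway, the numpy-based approach, and all matrix entries below are assumptions for the sketch, not the SBRT's own code.

```python
import numpy as np

# Toy pathway  A -> B -> C  with an inflow to A and an outflow from C.
# Rows of S: internal metabolites A, B; columns: fluxes v_in, v1, v2.
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

# Steady state requires S @ v = 0, so flux modes span the null space
# of S; the SVD gives that space as the rows of Vt beyond the rank of S.
_, _, vt = np.linalg.svd(S)
rank = np.linalg.matrix_rank(S)
v = vt[rank]           # basis vector of the 1-D null space
v = v / v[0]           # scale so the inflow flux equals 1
print(np.round(v, 6))  # all three fluxes are equal at steady state
```

For this linear chain the only steady-state mode carries the same flux through every reaction, which the computed vector confirms.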

  7. Geographic Information System Tools for Conservation Planning: User's Manual

    Science.gov (United States)

    Fox, Timothy J.; Rohweder, Jason J.; Kenow, K.P.; Korschgen, C.E.; DeHaan, H.C.

    2003-01-01

Public and private land managers desire better ways to incorporate landscape, species, and habitat relations into their conservation planning processes. We present three tools, developed for the Environmental Systems Research Institute's ArcView 3.x platform, applicable to many types of wildlife conservation management and planning efforts. These tools provide managers and planners with the ability to rapidly assess landscape attributes and link these attributes with species-habitat information. To use the tools, the user provides a detailed land cover spatial database and develops a matrix to identify species-habitat relations for the landscape of interest. The tools are applicable to any taxa or suite of taxa for which the required data are available. The user also has the ability to interactively make polygon-specific changes to the landscape and re-examine species-habitat relations. The development of these tools has given resource managers the means to evaluate the merits of proposed landscape management scenarios and to choose the scenario that best fits the goals of the managed area.

  8. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  9. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, where present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods; in particular, a large event in 1987 affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  10. AstroStat-A VO tool for statistical analysis

    Science.gov (United States)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analyses are done using the public domain statistical software R, and the output returned is presented in a neatly formatted form to the user. The available analyses include exploratory tests, visualizations, distribution fitting, correlation and causation, hypothesis testing, multivariate analysis, and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run in any standard browser, and as an offline application. AstroStat will provide an easy-to-use interface allowing users both to fetch data and to perform powerful statistical analysis on them.

  11. Judo match analysis,a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

In this second paper on match analysis, we analyze the competition steps in depth, showing the evolution of this tool at the National Federation level on the basis of our first classification. Match analysis is the most important source of technical assessment: studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, supporting in a useful way the difficult task of the coach, especially for National or Olympic coaching teams. In this paper a deeper study of judo competitions at a high level is presented, from both the male and female points of view, explaining in the light of biomechanics not only the evolution of throws in time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  12. Graphical and Normative Analysis of Binocular Vision by Mini Computer: A Teaching Aid and Clinical Tool.

    Science.gov (United States)

    Kees, Martin; Schor, Clifton

    1981-01-01

An inexpensive computer graphics system (the Commodore PET), used as a video aid for teaching students advanced case analysis, is described. The course provides students with the analytical tools for evaluating various anomalies of binocular vision with graphical and statistical techniques, and for treating them with lenses, prisms, and orthoptics. (MLW)

  13. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  14. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  15. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams can predict chatter by providing graphical representations of the stable combinations of axial depth of cut and spindle speed. In this article, an analytical model of a spindle system is constructed, including a Timoshenko-beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. The stability lobe diagram of the model is then developed according to its dynamic properties. The Monte Carlo method is applied to analyse the influence of bearing preload on system stability, with uncertainty taken into account.
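The Monte Carlo step can be sketched with a deliberately crude surrogate model. The linear preload-to-depth relation and every number below are invented for illustration, standing in for the paper's full Timoshenko-beam spindle model.

```python
import random

random.seed(1)

def critical_depth(preload_n):
    """Toy chatter model: the limiting stable depth of cut is taken as
    proportional to bearing preload (a hypothetical relation), with
    2.0 mm at the nominal 500 N preload."""
    return 2.0 * preload_n / 500.0   # mm

# Propagate a Gaussian preload uncertainty (500 +/- 50 N) through the
# model and report a conservative 5th-percentile stability limit
samples = sorted(critical_depth(random.gauss(500.0, 50.0))
                 for _ in range(10000))
p5 = samples[int(0.05 * len(samples))]
print(f"5th-percentile stable depth of cut: {p5:.2f} mm")
```

In the real analysis each sample would require re-solving the spindle dynamics and rebuilding the stability lobe diagram; the sampling and percentile logic, however, are the same.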

  16. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools...... will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis...... such networking systems are modelled in the process calculus LySa. On top of this programming language based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which makes them an ideal basis for tools targeted at non...

  17. The ALICE analysis train system

    CERN Document Server

    Zimmermann, Markus

    2015-01-01

    In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.

  18. Development of meso-scale milling machine tool and its performance analysis

    Institute of Scientific and Technical Information of China (English)

    Hongtao LI; Xinmin LAI; Chengfeng LI; Zhongqin LIN; Jiancheng MIAO; Jun NI

    2008-01-01

To overcome the shortcomings of current technologies for meso-scale manufacturing such as MEMS and ultra precision machining, this paper focuses on investigations of the meso milling process with a miniaturized machine tool. First, the related technologies for studying the process mechanism are investigated based on an analysis of the characteristics of the meso milling process. An overview of the key issues is presented and research approaches are proposed. Then, a meso-scale milling machine tool system is developed. The subsystems and their specifications are described in detail. Finally, some tests are conducted to evaluate the performance of the system. These tests consist of precision measurement of the positioning subsystem, a test for machining precision evaluation, and experiments on machining mechanical parts with complex features. Through test analysis, the meso milling process with a miniaturized machine tool is proved to be feasible and applicable for meso manufacturing.

  19. Graphical tools for network meta-analysis in STATA.

    Science.gov (United States)

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.
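The first of these graphical tools, the evidence network plot, is easy to illustrate outside STATA. The sketch below uses hypothetical trials and Python instead of the paper's STATA routines; it tallies the direct comparisons that form the network's edges.

```python
from collections import Counter

# Hypothetical two-arm trials; each tuple is one direct comparison
trials = [("A", "B"), ("A", "B"), ("A", "C"), ("B", "C"), ("A", "D")]

# Edge weights of the evidence network: number of trials per comparison
edges = Counter(tuple(sorted(t)) for t in trials)
for (t1, t2), n in sorted(edges.items()):
    print(f"{t1} -- {t2}: {n} trial(s)")
```

Treatment D is linked only through A, so D versus B and D versus C can be estimated only indirectly, which is exactly the kind of structure the network plot makes visible before any model is fitted.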

  20. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Full Text Available Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  1. Controlling open quantum systems: tools, achievements, and limitations

    Science.gov (United States)

    Koch, Christiane P.

    2016-06-01

    The advent of quantum devices, which exploit the two essential elements of quantum physics, coherence and entanglement, has sparked renewed interest in the control of open quantum systems. Successful implementations face the challenge of preserving relevant nonclassical features at the level of device operation. A major obstacle is decoherence, which is caused by interaction with the environment. Optimal control theory is a tool that can be used to identify control strategies in the presence of decoherence. Here we review recent advances in optimal control methodology that allow typical tasks in device operation for open quantum systems to be tackled and discuss examples of relaxation-optimized dynamics. Optimal control theory is also a useful tool to exploit the environment for control. We discuss examples and point out possible future extensions.

  2. Marketing intelligence system a "smart tool" for the companies

    OpenAIRE

    Grigorut Cornel; Grigorut Lavinia-Maria; Surugiu Felicia

    2012-01-01

    Marketing Intelligence Systems are tools that allow organizations to conduct business in a new way, with a new integrative vision that includes the customers’ needs, requirements and desires. The activity of the organization should focus on meeting them. The marketing knowledge and information held by the organization about customers, the market, competition, suppliers, distribution channels, and generally about the environment in which it operates, can be easily processed using those technologies specific to ...

  3. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for a program predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method for calculating the first-order sensitivity coefficients, applying sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
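
    As a rough illustration of what such sensitivity coefficients mean (not the paper's direct method, which uses analytic Jacobians and sparse matrices), the sketch below estimates d[B]/dk1 for a toy mechanism A -> B -> C by central finite differences around an RK4 integration:

    ```python
    import numpy as np

    # Toy mechanism A -> B -> C with assumed rate constants k1, k2.
    def rhs(y, k1, k2):
        a, b, c = y
        return np.array([-k1 * a, k1 * a - k2 * b, k2 * b])

    def integrate(k1, k2, t_end=1.0, n=1000):
        y = np.array([1.0, 0.0, 0.0])     # initial concentrations
        h = t_end / n
        for _ in range(n):                # classical RK4
            s1 = rhs(y, k1, k2)
            s2 = rhs(y + h / 2 * s1, k1, k2)
            s3 = rhs(y + h / 2 * s2, k1, k2)
            s4 = rhs(y + h * s3, k1, k2)
            y = y + h / 6 * (s1 + 2 * s2 + 2 * s3 + s4)
        return y

    # First-order sensitivity coefficient d[B]/dk1 at t = 1 by central difference.
    k1, k2, dk = 2.0, 1.0, 1e-6
    b_plus = integrate(k1 + dk, k2)[1]
    b_minus = integrate(k1 - dk, k2)[1]
    sens_b_k1 = (b_plus - b_minus) / (2 * dk)
    print(sens_b_k1)
    ```

    A direct method computes the same quantity by integrating auxiliary sensitivity equations alongside the model equation, which is far cheaper when many parameters are involved.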

  4. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.). A .NET and C# version is used for development, leveraging code samples shared by the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  5. Multidimensional Analysis: A Management Tool for Monitoring HIPAA Compliance and Departmental Performance

    OpenAIRE

    Coleman, Robert M.; Ralston, Matthew D.; Szafran, Alexander; Beaulieu, David M.

    2004-01-01

    Most RIS and PACS systems include extensive auditing capabilities as part of their security model, but inspecting those audit logs to obtain useful information can be a daunting task. Manual analysis of audit trails, though cumbersome, is often resorted to because of the difficulty of constructing queries to extract complex information from the audit logs. The approach proposed by the authors uses standard off-the-shelf multidimensional analysis software tools to assist the PACS/RIS administrato...

  6. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  7. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
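
    The statistical idea can be shown in a few lines (this is only the generic Monte Carlo scheme, not PVeStA's parallel engine or its PCTL/CSL/QuaTEx front-ends): simulate independent runs, estimate the probability of a time-bounded property, and bound the estimation error:

    ```python
    import random, math

    random.seed(1)

    # Estimate p = P(failure before time T) for a component with exponential
    # failure rate lam by simulating N independent runs (assumed toy model).
    lam, T, N = 0.5, 1.0, 20000

    def fails_before(T):
        return random.expovariate(lam) < T   # sample one trajectory

    hits = sum(fails_before(T) for _ in range(N))
    p_hat = hits / N

    # Hoeffding bound: P(|p_hat - p| > eps) <= 2 exp(-2 N eps^2);
    # solve for eps at 99% confidence.
    eps = math.sqrt(math.log(2 / 0.01) / (2 * N))
    print(round(p_hat, 3), "+/-", round(eps, 3))
    ```

    Because the runs are independent, they parallelize trivially, which is exactly the property PVeStA exploits.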

  8. Power system analysis

    CERN Document Server

    Murty, PSR

    2007-01-01

    Power system analysis is a prerequisite course for electrical engineering students. This book introduces the concepts of a power system, network models, faults and analysis, and the primitive network. It also deals with graph theory relevant to various incidence matrices, the building of network matrices and power flow studies. It further discusses short circuit analysis, unbalanced fault analysis and power system stability problems, such as steady-state stability, transient stability and dynamic stability. Salient Features: a number of worked examples follow the explanation of theory
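
    As a taste of the power flow studies mentioned above, here is a minimal DC power-flow sketch for an assumed three-bus network; the linearized model P = B'θ is standard textbook material rather than anything specific to this book:

    ```python
    import numpy as np

    # DC power flow: lossless lines, small angles (radians), bus 0 as slack.
    # Assumed line susceptances in per unit for a 3-bus triangle network.
    b01, b02, b12 = 10.0, 10.0, 10.0

    # Bus susceptance matrix B: off-diagonal -b_ij, diagonal = sum of b_ij.
    B = np.array([
        [b01 + b02, -b01,       -b02      ],
        [-b01,       b01 + b12, -b12      ],
        [-b02,      -b12,        b02 + b12],
    ])

    P = np.array([0.0, 1.0, -1.0])   # net injections (pu): bus 1 generates, bus 2 loads

    # Remove the slack bus row/column and solve P = B' * theta.
    theta = np.zeros(3)
    theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

    # Line flow from bus i to bus j: P_ij = b_ij * (theta_i - theta_j)
    flow_12 = b12 * (theta[1] - theta[2])
    print(theta, flow_12)
    ```

    By symmetry of the example, two thirds of the injected power flows directly from bus 1 to bus 2 and one third takes the detour through the slack bus.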

  9. Basic statistical tools in research and data analysis

    Science.gov (United States)

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into a lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  10. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into a lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
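
    A small worked example (with invented measurements) contrasting a parametric test statistic with its non-parametric counterpart, as surveyed in the article:

    ```python
    from statistics import mean, variance

    # Two small samples of a continuous measurement (hypothetical data).
    x = [5.1, 4.9, 5.6, 5.2, 5.0]
    y = [4.5, 4.7, 4.4, 4.9, 4.6]
    nx, ny = len(x), len(y)

    # Parametric: pooled two-sample t statistic (assumes normality, equal variance).
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    t = (mean(x) - mean(y)) / (sp2 * (1 / nx + 1 / ny)) ** 0.5

    # Non-parametric: Mann-Whitney U counts how often an x exceeds a y
    # (ties count one half), making no normality assumption.
    U = sum((xi > yi) + 0.5 * (xi == yi) for xi in x for yi in y)

    print(round(t, 3), U)   # compare t to a t-table with nx+ny-2 df; U to its null table
    ```

    Choosing between the two comes down to whether the distributional assumptions of the parametric test are defensible for the data at hand.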

  11. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
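
    The optimization pattern the CEM coordinates (an objective built from performance indices, plus constraints and side limits on design variables) can be sketched as follows; the weight and stress indices below are hypothetical stand-ins for the tool's analysis modules, and a crude penalty-plus-grid search replaces a real optimizer:

    ```python
    # Hypothetical performance indices for a panel sized by its thickness t (mm).
    def weight(t):                 # index 1: structural weight (kg)
        return 12.0 * t

    def stress(t):                 # index 2: max stress (MPa) falls with thickness
        return 300.0 / t

    STRESS_LIMIT = 120.0           # constraint: stress(t) <= 120 MPa

    def penalized(t):
        # Quadratic exterior penalty for constraint violation.
        penalty = max(0.0, stress(t) - STRESS_LIMIT) ** 2 * 1e3
        return weight(t) + penalty

    # Discrete design variable: thickness in 0.1 mm steps between side constraints.
    candidates = [1.0 + 0.1 * i for i in range(51)]       # 1.0 .. 6.0 mm
    t_best = min(candidates, key=penalized)
    print(t_best, weight(t_best), stress(t_best))
    ```

    In the real tool each index would be an external analysis code invoked through the CEM, and the user would select an actual optimization methodology instead of the grid search.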

  12. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  13. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant, considering the constraints of other parameters. The analysis result so obtained gives a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant, which leads to the design of an efficient productive plant.
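
    The energy balance that drives such an analysis can be checked by hand. Assuming rough air enthalpies (illustrative values, not the paper's HYSYS results), the ideal Linde-Hampson liquid yield follows from a balance on the cold box:

    ```python
    # Ideal Linde-Hampson liquefier yield:  y = (h1 - h2) / (h1 - hf)
    # where h1 = enthalpy of low-pressure gas returning at the warm end,
    #       h2 = enthalpy of the compressed gas entering the cold box,
    #       hf = enthalpy of the saturated liquid drawn off.
    # Assumed air enthalpies (kJ/kg): ~300 K at 1 bar / 200 bar, liquid at 1 bar.
    h1, h2, hf = 426.0, 391.0, 92.0

    y = (h1 - h2) / (h1 - hf)           # fraction of compressed gas liquefied
    work_per_kg_gas = 450.0             # assumed isothermal compression work, kJ/kg
    work_per_kg_liquid = work_per_kg_gas / y
    print(round(y, 4), round(work_per_kg_liquid, 1))
    ```

    A yield on the order of ten percent is typical of the simple cycle, which is why a simulator is needed to explore pressure levels, pre-cooling and heat exchanger effectiveness before committing to a plant design.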

  14. Anaphe - OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O.Couet; B.Ferrero-Merlino; 等

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component, in combination with the use of shared libraries for their implementation, provides an easy integration of existing libraries into modern scripting languages, thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. This paper will give an overview of the architecture and design choices and will present the current status and future developments of the project.

  15. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff`s Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  16. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant, considering the constraints of other parameters. The analysis result so obtained gives a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant, which leads to the design of an efficient productive plant

  17. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of the existing problems of reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  18. Intelligent Electric Power Systems with Active-Adaptive Electric Networks: Challenges for Simulation Tools

    Directory of Open Access Journals (Sweden)

    Ufa Ruslan A.

    2015-01-01

    Full Text Available The motivation for the presented research is the need to develop new methods and tools for the adequate simulation of intelligent electric power systems with active-adaptive electric networks (IES), including Flexible Alternating Current Transmission System (FACTS) devices. The key requirements for such simulation are formulated. The presented analysis of IES simulation results confirms the need for a hybrid modelling approach.

  19. Marketing residential grid-connected PV systems using a balanced scorecard as a marketing tool

    International Nuclear Information System (INIS)

    A strategic analysis of the electricity market in Western Australia yields a market potential for renewable energy in Western Australia. However, from a purely financial viewpoint the installation of grid-connected PV systems is still not economically viable. In this paper a balanced scorecard (BSC) is developed to capture and visualize benefits other than the purely financial ones. The BSC can therefore be used as a marketing tool to communicate the benefits of a privately owned GCPV system to potential customers. (author)

  20. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  1. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U;

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quantitative phylogenetic and functional compositions of metagenomes, to compare compositions of multiple metagenomes and to produce intuitive visual representations of such analyses. AVAILABILITY: SmashCommunity is freely available at http://www.bork.embl.de/software/smash CONTACT: bork@embl.de

  2. Reliability review of the remote tool delivery system locomotor

    Energy Technology Data Exchange (ETDEWEB)

    Chesser, J.B.

    1999-04-01

    The locomotor being built by RedZone Robotics is designed to serve as a remote tool delivery (RTD) system for waste retrieval, tank cleaning, viewing, and inspection inside the high-level waste tanks 8D-1 and 8D-2 at West Valley Nuclear Services (WVNS). The RTD system is to be deployed through a tank riser. The locomotor portion of the RTD system is designed to be inserted into the tank and to be capable of moving around the tank by supporting itself on the tank's internal structural columns. The locomotor will serve as a mounting platform for a dexterous manipulator arm. The complete RTD system consists of the locomotor, dexterous manipulator arm, cameras, lights, cables, hoses, cable/hose management system, power supply, and operator control station.

  3. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.
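
    The sizing trade-off SAVANT informs can be caricatured with a generic fixed-rate degradation model (the percentages below are assumptions, not SAVANT outputs):

    ```python
    # If a PV array loses a fixed fraction d of its power per year, the
    # beginning-of-life (BOL) size needed to still meet a required
    # end-of-life (EOL) power after n years is:  P_bol = P_eol / (1 - d)**n
    d = 0.025                 # assumed 2.5 %/year degradation (radiation, contamination)
    n = 15                    # mission life, years
    p_eol_required = 5.0      # kW required at end of life

    p_bol = p_eol_required / (1 - d) ** n
    oversize = p_bol / p_eol_required - 1
    print(round(p_bol, 2), f"{oversize:.1%}")
    ```

    Even a modest annual loss compounds into a large oversizing requirement, which is why a predictive tool that narrows the degradation estimate translates directly into cost savings.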

  4. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Energy Technology Data Exchange (ETDEWEB)

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many Oil and Gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial amount of production upsets. Flow Assurance issues are complex and hard to quantify in a production forecast. However, without taking them into account, the RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, which is a method and a tool for integrating RAM and Flow Assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both Discrete Event and Thermo-Hydraulic Simulation. The method is currently applied as a decision support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)
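
    The effect FAMUS captures can be mimicked with a toy Monte Carlo (all rates invented): adding a flow-assurance failure mode lowers the predicted production efficiency relative to a RAM-only model:

    ```python
    import random

    random.seed(7)

    # Daily production simulation: ordinary equipment failures plus
    # flow-assurance upsets (e.g. hydrate plugs) that stop production.
    DAYS = 365 * 50
    P_EQUIP_FAIL = 0.004      # per-day equipment failure probability, 2-day repair
    P_HYDRATE = 0.002         # per-day hydrate-plug probability, 4-day remediation

    def simulate(include_flow_assurance):
        produced = down = 0
        for _ in range(DAYS):
            if down:
                down -= 1
                continue
            if random.random() < P_EQUIP_FAIL:
                down = 2
                continue
            if include_flow_assurance and random.random() < P_HYDRATE:
                down = 4
                continue
            produced += 1
        return produced / DAYS    # availability-weighted production efficiency

    eff_ram_only = simulate(False)
    eff_with_fa = simulate(True)
    print(round(eff_ram_only, 3), round(eff_with_fa, 3))
    ```

    The gap between the two numbers is the forecasting bias of a RAM model that ignores flow assurance; FAMUS closes it by coupling the discrete-event model to thermo-hydraulic simulation rather than to fixed probabilities as here.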

  5. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
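
    The report's definition of risk lends itself to a direct cost-benefit sketch (all numbers invented):

    ```python
    # Risk = P(successful attack) x loss. Compare mitigation actions for a
    # control-system network by expected annual risk reduction versus cost.
    p_attack = 0.30          # annual probability of a successful cyber attack
    loss = 2_000_000.0       # consequence in dollars

    baseline_risk = p_attack * loss   # expected annual loss

    mitigations = {                   # name: (annual cost, attack-probability multiplier)
        "network segmentation": (150_000, 0.40),
        "patch program":        (60_000,  0.70),
    }

    for name, (cost, factor) in mitigations.items():
        residual = p_attack * factor * loss
        benefit = baseline_risk - residual   # risk reduction in $/year
        print(name, benefit, benefit - cost)
    ```

    A quantitative tool earns its keep precisely here: without a credible estimate of the attack-probability multipliers, the cost-benefit comparison cannot be made.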

  6. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
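
    The first processing step such a tool automates, turning raw spot intensities into centred log2 ratios, can be sketched generically (invented intensities; CGHPRO's own pipeline adds proper normalization and CBS/HMM segmentation):

    ```python
    import math
    from statistics import median

    # Test/reference intensities for six hypothetical clones along a chromosome.
    test = [820, 790, 1650, 1580, 810, 400]
    ref = [800, 805, 790, 810, 795, 805]

    # Log2 ratios, median-centred so unchanged clones sit near 0.
    log2r = [math.log2(t / r) for t, r in zip(test, ref)]
    med = median(log2r)
    normalized = [x - med for x in log2r]

    # In a diploid genome a single-copy gain is expected near log2(3/2) ~ 0.58
    # and a single-copy loss near log2(1/2) = -1; a naive threshold call:
    calls = ["gain" if x > 0.3 else "loss" if x < -0.3 else "normal" for x in normalized]
    print([round(x, 2) for x in normalized], calls)
    ```

    Real data are far noisier, which is why CGHPRO replaces the naive threshold with Circular Binary Segmentation or an HMM.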

  7. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
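
    Task 3, finding paths from hazard sources to vulnerable entities, is at heart a graph search. A toy version (the component graph below is invented, not drawn from the Orion models):

    ```python
    from collections import deque

    # Directed component graph: edges point along possible propagation paths.
    edges = {
        "battery": ["power_bus"],
        "power_bus": ["heater", "flight_computer"],
        "heater": ["cabin_air"],
        "flight_computer": ["guidance"],
    }

    def find_paths(edges, source, target):
        """Breadth-first enumeration of all simple paths source -> target."""
        paths, queue = [], deque([[source]])
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node == target:
                paths.append(path)
                continue
            for nxt in edges.get(node, []):
                if nxt not in path:            # avoid cycles
                    queue.append(path + [nxt])
        return paths

    print(find_paths(edges, "battery", "guidance"))
    ```

    Each returned path is a candidate hazard-propagation scenario; in the described workflow these become inputs to simulation and to software integration test planning.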

  8. A tool for finite element deflection analysis of wings

    Energy Technology Data Exchange (ETDEWEB)

    Carlen, Ingemar

    2005-03-01

    A first version (ver 0.1) of a new tool for finite element deflection analysis of wind turbine blades is presented. The software is called SOLDE (SOLid blaDE), and was developed as a Matlab shell around the free finite element codes CGX (GraphiX - pre-processor) and CCX (CrunchiX - solver). In the present report a brief description of SOLDE is given, followed by a basic user's guide. The main features of SOLDE are: - Deflection analysis of wind turbine blades, including 3D effects and warping. - Accurate prediction of eigenmodes and eigenfrequencies. - Derivation of 2-node slender elements for use in various aeroelastic analyses. The main differences between SOLDE and other similar tools can be summarised as: - SOLDE was developed without a graphical user interface or a traditional text-file input deck. Instead the input is organised as Matlab data structures that have to be formed by a user-provided pre-processor. - SOLDE uses a solid representation of the geometry instead of a thin-shell approximation. The benefit is that the bending-torsion couplings will automatically be correctly captured. However, a drawback with the current version is that the equivalent orthotropic shell idealisation violates the local bending characteristics, which makes the model useless for buckling analyses. - SOLDE includes the free finite element solver CCX, and thus no expensive commercial software (e.g. Ansys or Nastran) is required to produce results.

  9. Energy Signal Tool for Decision Support in Building Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Henze, G. P.; Pavlak, G. S.; Florita, A. R.; Dodier, R. H.; Hirsch, A. I.

    2014-12-01

    A prototype energy signal tool is demonstrated for operational whole-building and system-level energy use evaluation. The purpose of the tool is to give a summary of building energy use which allows a building operator to quickly distinguish normal and abnormal energy use. Toward that end, energy use status is displayed as a traffic light, which is a visual metaphor for energy use that is either substantially different from expected (red and yellow lights) or approximately the same as expected (green light). Which light to display for a given energy end use is determined by comparing expected to actual energy use. Because expected energy use is necessarily uncertain, we cannot choose the appropriate light with certainty. Instead, the energy signal tool chooses the light by minimizing the expected cost of displaying the wrong light. The expected energy use is represented by a probability distribution. Energy use is modeled by a low-order lumped-parameter model. Uncertainty in energy use is quantified by a Monte Carlo exploration of the influence of model parameters on energy use. Distributions over model parameters are updated over time via Bayes' theorem. The simulation study was devised to assess whole-building energy signal accuracy in the presence of uncertainty and faults at the submetered level, which may lead to tradeoffs at the whole-building level that are not detectable without submetering.
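    The light-selection rule described above — display the light with minimum expected cost under the posterior over energy-use states — can be sketched in a few lines. The states, cost matrix, and posterior probabilities here are illustrative assumptions, not values from the tool.

    ```python
    # cost[light][state]: penalty for showing `light` when `state` is true
    cost = {
        "green":  {"normal": 0, "high": 4, "very_high": 10},
        "yellow": {"normal": 1, "high": 0, "very_high": 4},
        "red":    {"normal": 5, "high": 1, "very_high": 0},
    }

    # posterior probability of each state, e.g. from a Monte Carlo /
    # Bayesian update of the lumped-parameter model
    p = {"normal": 0.2, "high": 0.7, "very_high": 0.1}

    def pick_light(p, cost):
        # choose the light with minimum expected cost under the posterior
        return min(cost, key=lambda light: sum(p[s] * cost[light][s] for s in p))

    print(pick_light(p, cost))
    ```

    With these numbers the expected costs are 3.8 (green), 0.6 (yellow) and 1.7 (red), so the tool would display yellow.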

  10. System Dynamics in Medical Education: A Tool for Life

    Science.gov (United States)

    Rubin, David M.; Richards, Christopher L.; Keene, Penelope A. C.; Paiker, Janice E.; Gray, A. Rosemary T.; Herron, Robyn F. R.; Russell, Megan J.; Wigdorowitz, Brian

    2012-01-01

    A course in system dynamics has been included in the first year of our university's six-year medical curriculum. System Dynamics is a discipline that facilitates the modelling, simulation and analysis of a wide range of problems in terms of two fundamental concepts viz. rates and levels. Many topics encountered in the medical school curriculum,…
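    The "rates and levels" formulation mentioned above can be illustrated with a minimal stock-and-flow model of the kind used in medical teaching examples: first-order drug elimination, where the level (drug amount) is drained at a rate proportional to itself. Parameter values are invented for illustration.

    ```python
    # A minimal level-and-rate sketch integrated with explicit Euler steps.
    def simulate(level=100.0, k=0.1, dt=0.5, steps=4):
        """level: drug amount (the level); outflow rate = k * level."""
        history = [level]
        for _ in range(steps):
            rate_out = k * level           # the rate drains the level
            level = level - rate_out * dt  # Euler update of the level
            history.append(level)
        return history

    print(simulate())
    ```

    The same two ingredients — levels accumulating the net effect of rates — generalize to compartment models, epidemic models, and the other systems the course covers.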

  11. Disposal systems evaluations and tool development : Engineered Barrier System (EBS) evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Rutqvist, Jonny (LBNL); Liu, Hui-Hai (LBNL); Steefel, Carl I. (LBNL); Serrano de Caro, M. A. (LLNL); Caporuscio, Florie Andre (LANL); Birkholzer, Jens T. (LBNL); Blink, James A. (LLNL); Sutton, Mark A. (LLNL); Xu, Hongwu (LANL); Buscheck, Thomas A. (LLNL); Levy, Schon S. (LANL); Tsang, Chin-Fu (LBNL); Sonnenthal, Eric (LBNL); Halsey, William G. (LLNL); Jove-Colon, Carlos F.; Wolery, Thomas J. (LLNL)

    2011-01-01

    Key components of the nuclear fuel cycle are short-term storage and long-term disposal of nuclear waste. The latter encompasses the immobilization of used nuclear fuel (UNF) and radioactive waste streams generated by various phases of the nuclear fuel cycle, and the safe and permanent disposition of these waste forms in geological repository environments. The engineered barrier system (EBS) plays a very important role in the long-term isolation of nuclear waste in geological repository environments. EBS concepts and their interactions with the natural barrier are inherently important to the long-term performance assessment of the safety case, where nuclear waste disposition needs to be evaluated for time periods of up to one million years. Making the safety case needed in the decision-making process for the recommendation, and the eventual acceptance, of a disposal system concept requires a multi-faceted integration of knowledge and evidence-gathering to demonstrate the required confidence level in a deep geological disposal site and to evaluate long-term repository performance. The focus of this report is the following: (1) Evaluation of EBS in long-term disposal systems in deep geologic environments, with emphasis on the multi-barrier concept; (2) Evaluation of key parameters in the characterization of EBS performance; (3) Identification of key knowledge gaps and uncertainties; and (4) Evaluation of tools and modeling approaches for EBS processes and performance. The above topics will be evaluated through the analysis of the following: (1) Overview of EBS concepts for various NW disposal systems; (2) Natural and man-made analogs, room chemistry, hydrochemistry of deep subsurface environments, and EBS material stability in near-field environments; (3) Reactive transport and coupled thermal-hydrological-mechanical-chemical (THMC) processes in EBS; and (4) Thermal analysis toolkit, metallic barrier degradation mode survey, and development of a Disposal Systems

  12. Nanocoatings for High-Efficiency Industrial Hydraulic and Tooling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Clifton B. Higdon III

    2011-01-07

    Industrial manufacturing in the U.S. accounts for roughly one third of the 98 quadrillion Btu total energy consumption. Motor system losses amount to 1.3 quadrillion Btu, which represents the largest proportional loss of any end-use category, while pumps alone represent over 574 trillion BTU (TBTU) of energy loss each year. The efficiency of machines with moving components is a function of the amount of energy lost to heat because of friction between contacting surfaces. The friction between these interfaces also contributes to downtime and the loss of productivity through component wear and subsequent repair. The production of new replacement parts requires additional energy. Among efforts to reduce energy losses, wear-resistant, low-friction coatings on rotating and sliding components offer a promising approach that is fully compatible with existing equipment and processes. In addition to lubrication, one of the most desirable solutions is to apply a protective coating or surface treatment to rotating or sliding components to reduce their friction coefficients, thereby leading to reduced wear. Historically, a number of materials such as diamond-like carbon (DLC), titanium nitride (TiN), titanium aluminum nitride (TiAlN), and tungsten carbide (WC) have been examined as tribological coatings. The primary objective of this project was the development of a variety of thin film nanocoatings, derived from the AlMgB14 system, with a focus on reducing wear and friction in both industrial hydraulics and cutting tool applications. Proof-of-concept studies leading up to this project had shown that the constituent phases, AlMgB14 and TiB2, were capable of producing low-friction coatings by pulsed laser deposition. These coatings combine high hardness with a low friction coefficient, and were shown to substantially reduce wear in laboratory tribology tests. Selection of the two applications was based largely on the concept of improved mechanical interface efficiencies for

  13. Pulsatile microfluidics as an analytical tool for determining the dynamic characteristics of microfluidic systems

    DEFF Research Database (Denmark)

    Vedel, Søren; Olesen, Laurits Højgaard; Bruus, Henrik

    2010-01-01

    An understanding of all fluid dynamic time scales is needed to fully understand and hence exploit the capabilities of fluid flow in microfluidic systems. We propose the use of harmonically oscillating microfluidics as an analytical tool for the deduction of these time scales. Furthermore, we suggest the use of system-level equivalent circuit theory as an adequate theory of the behavior of the system. A novel pressure source capable of operation in the desired frequency range is presented for this generic analysis. As a proof of concept, we study the fairly complex system of water...
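    In the equivalent-circuit view suggested above, a microchannel acts as a hydraulic resistance feeding a compliance, and their product sets an RC-like time constant that governs the response to an oscillating pressure source. The geometry, viscosity and compliance values below are illustrative assumptions, not parameters from the paper.

    ```python
    import math

    eta = 1.0e-3   # water viscosity [Pa s]
    a = 10e-6      # channel radius [m]
    L = 10e-3      # channel length [m]

    # Hagen-Poiseuille resistance of a circular channel
    R_hyd = 8 * eta * L / (math.pi * a**4)

    C_hyd = 1e-16  # assumed hydraulic compliance [m^3/Pa]

    tau = R_hyd * C_hyd            # characteristic time scale [s]
    f_c = 1 / (2 * math.pi * tau)  # corner frequency [Hz]
    print(f"tau = {tau:.3e} s, corner frequency = {f_c:.2f} Hz")
    ```

    Sweeping the driving frequency through such corner frequencies is what lets the oscillating-flow measurement deduce the system's time scales.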

  14. National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion

    Science.gov (United States)

    Follen, G.; Naiman, C.; Evans, A.

    1999-01-01

    Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

  15. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2014-02-01

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggests that IPMP 2013 can be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology.
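    One of the secondary models listed above, the Ratkowsky square-root model, relates growth rate to temperature via sqrt(mu) = b (T - Tmin) and can be fitted with ordinary least squares after a square-root transform. The growth-rate data below are synthetic, for illustration only, not data from IPMP 2013.

    ```python
    import math

    T  = [10, 15, 20, 25, 30, 35]              # temperature [C]
    mu = [0.04, 0.16, 0.36, 0.64, 1.00, 1.44]  # growth rate [1/h]

    y = [math.sqrt(m) for m in mu]             # sqrt(mu) is linear in T

    n = len(T)
    Tbar = sum(T) / n
    ybar = sum(y) / n
    # slope and intercept of the least-squares line y = a + b*T
    b = sum((t - Tbar) * (v - ybar) for t, v in zip(T, y)) / \
        sum((t - Tbar) ** 2 for t in T)
    a = ybar - b * Tbar
    Tmin = -a / b                              # x-intercept estimates Tmin

    print(f"b = {b:.3f}, Tmin = {Tmin:.1f} C")
    ```

    The synthetic data were constructed to lie exactly on a square-root line, so the fit recovers b = 0.04 and Tmin = 5 C; real data would scatter around the line.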

  16. An intelligent condition monitoring system for on-line classification of machine tool wear

    Energy Technology Data Exchange (ETDEWEB)

    Fu Pan; Hope, A.D.; Javed, M. [Systems Engineering Faculty, Southampton Institute (United Kingdom)

    1997-12-31

    The development of intelligent tool condition monitoring systems is a necessary requirement for successful automation of manufacturing processes. This presentation introduces a tool wear monitoring system for milling operations. The system utilizes power, force, acoustic emission and vibration sensors to monitor tool condition comprehensively. Features relevant to tool wear are drawn from time- and frequency-domain signals, and a fuzzy pattern recognition technique is applied to combine the multisensor information and provide reliable classification of tool wear states. (orig.) 10 refs.
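    The multisensor fusion step described above can be sketched as a toy fuzzy classifier: each sensor feature yields a fuzzy membership in the "worn tool" class, and the memberships are fused by a weighted average. The membership shapes, thresholds and weights below are invented for illustration and are not the paper's method.

    ```python
    def ramp(x, lo, hi):
        """Piecewise-linear membership: 0 below lo, 1 above hi."""
        if x <= lo:
            return 0.0
        if x >= hi:
            return 1.0
        return (x - lo) / (hi - lo)

    def classify(force, ae_rms, vib):
        # per-sensor membership in the "worn" class
        m = [ramp(force, 100.0, 300.0),   # cutting force [N]
             ramp(ae_rms, 0.1, 0.5),      # acoustic emission RMS [V]
             ramp(vib, 1.0, 4.0)]         # vibration amplitude [g]
        w = [0.5, 0.3, 0.2]               # fusion weights
        score = sum(wi * mi for wi, mi in zip(w, m))
        return ("worn" if score > 0.5 else "sharp"), round(score, 2)

    print(classify(force=250.0, ae_rms=0.45, vib=3.5))
    ```

    The weighted-average fusion makes the decision robust to a single noisy sensor, which is the motivation for combining four sensor types in the first place.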

  17. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, avoiding the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, a functionality rarely available in similar programs. Given its characteristics of rapid prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
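    The central quantity the program computes, the classical (Matheron) empirical semivariogram, is half the mean squared difference between sample pairs separated by lag h. A minimal one-dimensional sketch with unit lag spacing and illustrative data:

    ```python
    def semivariogram(z, max_lag):
        """Empirical semivariogram of equally spaced samples z:
        gamma(h) = mean of (z[i+h] - z[i])^2 over all pairs, divided by 2."""
        gamma = {}
        for h in range(1, max_lag + 1):
            diffs = [(z[i + h] - z[i]) ** 2 for i in range(len(z) - h)]
            gamma[h] = sum(diffs) / (2 * len(diffs))
        return gamma

    z = [1.0, 2.0, 4.0, 3.0, 5.0, 4.0]  # toy transect of soil measurements
    print(semivariogram(z, max_lag=2))
    ```

    Plotting gamma(h) against h and fitting a model to it is the graphical semivariogram-modelling step the abstract highlights.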

  18. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  19. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge of the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than those imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
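    For reference, the plane one-dimensional conservation equations that the method integrates can be written in the Lagrangian coordinate h in a standard textbook form (compression taken positive; sign conventions vary between authors, and this form is not copied from the work itself):

    ```latex
    % Plane 1D motion, Lagrangian coordinate h, initial density \rho_0,
    % specific volume v, particle velocity u, stress \sigma (compression
    % positive), specific internal energy E:
    \begin{aligned}
    \frac{\partial v}{\partial t} &= \frac{1}{\rho_0}\,\frac{\partial u}{\partial h}
    && \text{(mass)}\\
    \frac{\partial u}{\partial t} &= -\frac{1}{\rho_0}\,\frac{\partial \sigma}{\partial h}
    && \text{(momentum)}\\
    \frac{\partial E}{\partial t} &= -\sigma\,\frac{\partial v}{\partial t}
    && \text{(energy)}
    \end{aligned}
    ```

    Given gauge records of either sigma(h, t) or u(h, t), spatial derivatives are formed across the gauge positions and the remaining quantities are obtained by time integration, which is why a single group of data suffices in the plane case.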

  20. Clinical Decision Support Systems: A Useful Tool in Clinical Practice

    Directory of Open Access Journals (Sweden)

    Kolostoumpis G.

    2012-01-01

    Full Text Available The possibilities for supporting decision-making have increased in recent years, based on mathematical simulation tools, knowledge databases, processing methods, medical data and artificial intelligence methods for coding the available knowledge and for resolving complex problems arising in clinical practice. Aim: the aim of this review is to present the development of new methods and modern services in clinical practice and the issues emerging in their implementation. Data and methods: the methodology followed included a search of articles referring to the health sector and modern technologies in the electronic databases “pubmed” and “medline”. Results: Clinical decision support systems are a useful tool for medical experts, using characteristics and medical data employed by doctors. They constitute an innovation for the medical community and ensure comprehensive support of clinical decisions by integrating computational decision support systems into clinical practice. Conclusions: Decision support systems contribute to improving the quality of health services while containing costs (i.e. avoiding medical errors

  1. The antibody mining toolbox: an open source tool for the rapid analysis of antibody repertoires.

    Science.gov (United States)

    D'Angelo, Sara; Glanville, Jacob; Ferrara, Fortunato; Naranjo, Leslie; Gleasner, Cheryl D; Shen, Xiaohong; Bradbury, Andrew R M; Kiss, Csaba

    2014-01-01

    In vitro selection has been an essential tool in the development of recombinant antibodies against various antigen targets. Deep sequencing has recently been gaining ground as an alternative and valuable method to analyze such antibody selections. The analysis provides a novel and extremely detailed view of selected antibody populations, and allows the identification of specific antibodies using only sequencing data, potentially eliminating the need for expensive and laborious low-throughput screening methods such as enzyme-linked immunosorbent assay. The high cost and the need for bioinformatics experts and powerful computer clusters, however, have limited the general use of deep sequencing in antibody selections. Here, we describe the AbMining ToolBox, an open source software package for the straightforward analysis of antibody libraries sequenced by the three main next generation sequencing platforms (454, Ion Torrent, MiSeq). The ToolBox is able to identify heavy chain CDR3s as effectively as more computationally intense software, and can be easily adapted to analyze other portions of antibody variable genes, as well as the selection outputs of libraries based on different scaffolds. The software runs on all common operating systems (Microsoft Windows, Mac OS X, Linux), on standard personal computers, and sequence analysis of 1-2 million reads can be accomplished in 10-15 min, a fraction of the time of competing software. Use of the ToolBox will allow the average researcher to incorporate deep sequence analysis into routine selections from antibody display libraries. PMID:24423623
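    The heavy-chain CDR3 identification that the ToolBox automates is, at its simplest, motif-based: the CDR3 lies between a conserved framework-3 cysteine and the framework-4 "WG.G" tryptophan motif. The regex and the toy sequence below are a simplified illustration, not the AbMining implementation.

    ```python
    import re

    # conserved Cys ... CDR3 ... WGxG framework-4 motif (simplified)
    CDR3_PATTERN = re.compile(r"C([A-Z]{3,30})WG.G")

    def find_cdr3(aa_seq):
        """Return the putative CDR3 of an amino-acid sequence, or None."""
        m = CDR3_PATTERN.search(aa_seq)
        return m.group(1) if m else None

    # toy heavy-chain amino-acid sequence with an embedded CDR3-like region
    seq = "EVQLVESGGGLVQPGGSLRLSCAKDRGYSSGWYPDYWGQGTLVTVSS"
    print(find_cdr3(seq))
    ```

    Applying such an extractor to millions of reads and tallying unique CDR3s gives the population-level view of a selection output that the abstract describes.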

  2. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    OpenAIRE

    Chun-Hsiao Wu; Tsai-Yen Li

    2016-01-01

    In this research, we propose a new concept for social media analysis called Social Sensor, which is an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and appropriate social sensors for analysis. By adopting paramet...

  3. PRIST: a fourth-generation tool for medical information systems.

    Science.gov (United States)

    Cristiani, P; Larizza, C

    1990-04-01

    PRIST is a fourth-generation software package purposely oriented to the development and management of medical applications, running under MS-DOS on IBM-compatible personal computers. The tool was developed on top of the DBIII Plus language, utilizing the Clipper compiler's networking features for integration in a LAN environment. Several routines written in the C and Microsoft BASIC languages were integrated into this DBMS-kernel system, providing I/O, graphics, statistics and retrieval utilities. To increase the interactivity of the system, both menu-driven and windowing interfaces have been implemented. PRIST has been utilized to develop a wide variety of small medical applications, ranging from research laboratories to intensive care units. The great majority of reactions from the use of these applications were positive, confirming that PRIST is able to assist in practice management and patient care as well as research purposes. PMID:2345045

  4. Information systems as a tool to improve legal metrology activities

    Science.gov (United States)

    Rodrigues Filho, B. A.; Soratto, A. N. R.; Gonçalves, R. F.

    2016-07-01

    This study explores the importance of information systems applied to legal metrology as a tool to improve the control of measuring instruments used in trade. The information system implanted in Brazil has also helped in understanding and appraising measurement control through the behavior of the errors and deviations of instruments used in trade, allowing resources to be allocated wisely and leading to more effective planning and control in the legal metrology field. A case study analyzing the fuel sector is carried out in order to show the conformity of fuel dispensers with the maximum permissible errors. The statistics of measurement errors of 167,310 fuel dispensers of gasoline, ethanol and diesel used in the field were analyzed, demonstrating the conformity of the fuel market in Brazil with the legal requirements.
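    The conformity assessment described above reduces to checking each instrument's error against a maximum permissible error (MPE). The MPE value and the sample errors below are invented for the example; the actual limits are set by the Brazilian regulations.

    ```python
    MPE = 0.5  # assumed maximum permissible error, percent of delivered volume

    # measured dispenser errors from field verification [%]
    errors = [-0.42, 0.10, 0.48, -0.55, 0.31, 0.02, -0.12, 0.61]

    conforming = [e for e in errors if abs(e) <= MPE]
    rate = len(conforming) / len(errors)
    print(f"{len(conforming)}/{len(errors)} dispensers conform ({rate:.0%})")
    ```

    Aggregating such per-instrument checks over the 167,310 dispensers is what yields the market-wide conformity statistics reported in the study.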

  5. Phronesis, a diagnosis and recovery tool for system administrators

    CERN Document Server

    Haen, C; Bonaccorsi, E; Neufeld, N

    2014-01-01

    The LHCb experiment relies on the Online system, which includes a very large and heterogeneous computing cluster. Ensuring the proper behavior of the different tasks running on the more than 2000 servers represents a huge workload for the small operator team and is a 24/7 task. At CHEP 2012, we presented a prototype of a framework that we designed in order to support the experts. The main objective is to provide them with steadily improving diagnosis and recovery solutions in case of misbehavior of a service, without having to modify the original applications. Our framework is based on adapted principles of the Autonomic Computing model, on Reinforcement Learning algorithms, as well as innovative concepts such as Shared Experience. While the submission at CHEP 2012 showed the validity of our prototype on simulations, we here present an implementation with improved algorithms and manipulation tools, and report on the experience gained with running it in the LHCb Online system.

  6. The development of a two-component force dynamometer and tool control system for dynamic machine tool research

    Science.gov (United States)

    Sutherland, I. A.

    1973-01-01

    The development is presented of a tooling system that makes a controlled sinusoidal oscillation simulating a dynamic chip removal condition. It also measures the machining forces in two mutually perpendicular directions without any cross sensitivity.

  7. Safeguards system analysis, (1)

    International Nuclear Information System (INIS)

    A system analysis of an implemented safeguards system based on traditional materials accountancy was performed. This report describes the verification methods applied to operators' measurement data, the MUF (material unaccounted for) evaluation method, theories on deciding the PIT (physical inventory taking) frequency, and the design of inspection plans. (author)
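    The materials-balance quantity at the heart of materials accountancy, MUF (material unaccounted for), is the book inventory minus the measured ending inventory for a balance period. The numbers below are illustrative, not from the report.

    ```python
    # MUF = beginning inventory + receipts - shipments - ending inventory
    def muf(beginning, receipts, shipments, ending):
        """Material unaccounted for over one balance period (same units
        for all arguments, e.g. kg of nuclear material)."""
        return beginning + receipts - shipments - ending

    print(muf(beginning=120.0, receipts=45.0, shipments=40.0, ending=124.2))
    ```

    A nonzero MUF does not by itself indicate diversion; evaluating it against measurement uncertainty is precisely the statistical problem the report's MUF evaluation method addresses.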

  8. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improved cutting economics and cutting-tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (high-speed steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting-tool surface and at the interface between the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  9. Development to integrate conceptual design tools and a CAD system

    Science.gov (United States)

    Torres, V. H.; Ríos, J.; Vizán, A.; Pérez, J. M.

    2012-04-01

    The information supported by PLM/CAD systems is mainly related to the Embodiment and Detail Design phases. Information related to the Conceptual Design phase is mainly limited to requirement specification documents and system architecture diagram documents. This work aims to help integrate the Conceptual Design process and its associated information flow into a commercial software system. It proposes a development framework to integrate Quality Function Deployment, Axiomatic Design, and Failure Mode and Effects Analysis into a PLM/CAD system. This communication presents the methodology used in the development, the software development environment, the modeling of the proposed application and the first results of a pilot implementation.

  10. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic system that each represents, and the necessity to accurately predict their behavior. The path to successfully developing and implementing an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful, relevant analysis to assist stakeholders in decision making. Therefore a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization, was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized, allowing the system to be modified in order to provide analysis for other systems with similar attributes. By utilizing this approach, the framework for simulating many different fuel cycle options is provided. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios, leading to minimized waste inventories.

  11. Computational Aeroacoustic Analysis System Development

    Science.gov (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment, and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance of characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high-fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide the high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high-order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set-up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high-order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values onto the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed

  12. Risk analysis of the Vidaa River System

    OpenAIRE

    Vinyals i Patón, Miquel

    2011-01-01

    The main goal of this hydraulic study is the assessment of flooding in the Vidaa River system. In order to realize a complete assessment, the main water contributions to the river system have been included in the flood analysis. Three kinds of software have been used: • Flooding simulation (MIKE11). • Stochastic weather generator (RainSim). • Extreme value analysis (EVA tool from MIKE11). MIKE11 offers the possibility, based upon twenty years of observed rainfall data (main head catch...

  13. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first places a qualitative focus on reviewing web analytics tools, exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes to highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  14. A Sonification Tool For The Analysis of Large Databases of Expressive Gesture

    Directory of Open Access Journals (Sweden)

    R. Michael Winters

    2012-10-01

    Full Text Available Expert musical performance is rich with movements that facilitate performance accuracy and expressive communication. Studying these movements quantitatively using high-resolution motion capture systems has been fruitful, but analysis is arduous due to the size of the data sets and performance idiosyncrasies. Compared to visual-only methods, sonification provides an interesting alternative that can ease the process of data analysis and provide additional insights. To this end, a sonification tool was designed in Max/MSP that provides interactive access to synthesis mappings and data preprocessing functions that are specific to expressive movement. The tool is evaluated in terms of its ability to fulfil the goals of sonification in this domain and the goals of expressive movement analysis more generally. Additional benefits of sonification are discussed in light of the expressive and musical context.

  15. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second is the test input to be attempted for each field. The third is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid input. It is very important to note that the tests below must be performed for both the graphical user interface and the script. The examples are illustrated from a scripting perspective, because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.
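The accept/deny pattern this test plan describes can be illustrated with a small harness. The field names and validators below are hypothetical stand-ins, not actual GMAT fields:

```python
# Hypothetical input-range harness in the spirit of the GMAT test plan:
# for each object field, try valid and invalid values and check that the
# validator accepts or denies them as expected.

FIELDS = {
    # field name: (validator, valid samples, invalid samples) -- illustrative
    "Spacecraft.DryMass": (lambda v: isinstance(v, (int, float)) and v > 0,
                           [850.0, 1], [-5, "heavy"]),
    "Propagator.StepSize": (lambda v: isinstance(v, (int, float)) and 0 < v <= 86400,
                            [60, 86400], [0, -1, 1e9]),
}

def run_input_tests(fields):
    """Return a list of (field, value, reason) for every misbehaving case."""
    failures = []
    for name, (is_valid, good, bad) in fields.items():
        for v in good:
            if not is_valid(v):
                failures.append((name, v, "should have been accepted"))
        for v in bad:
            if is_valid(v):
                failures.append((name, v, "should have been denied"))
    return failures

print(run_input_tests(FIELDS))  # [] means every field behaved as specified
```

In the plan's terms, `FIELDS` plays the role of the allowed-field list plus its test inputs, and the empty failure list is the expected response for both the GUI and the script interface.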

  16. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    International Nuclear Information System (INIS)

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
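As an illustration of the kind of quantity such a fuel cycle code computes, the standard enrichment cascade mass balance and separative work (SWU) formulas can be sketched as follows. This is a textbook calculation, not FAST's actual implementation, and the assay values are illustrative:

```python
import math

def value_fn(x):
    """Standard separative potential V(x) = (2x - 1) * ln(x / (1 - x))."""
    return (2 * x - 1) * math.log(x / (1 - x))

def enrichment_requirements(product_kg, x_product, x_feed, x_tails):
    """Natural-uranium feed mass and SWU needed for a given enriched product."""
    feed_kg = product_kg * (x_product - x_tails) / (x_feed - x_tails)
    tails_kg = feed_kg - product_kg
    swu = (product_kg * value_fn(x_product)
           + tails_kg * value_fn(x_tails)
           - feed_kg * value_fn(x_feed))
    return feed_kg, swu

# 1 kg of 4.5 w/o product from 0.711 w/o natural feed at 0.25 w/o tails:
feed, swu = enrichment_requirements(1.0, 0.045, 0.00711, 0.0025)
print(round(feed, 2), round(swu, 2))  # 9.22 6.87
```

A scenario code like FAST chains such component calculations (mining, conversion, enrichment, fabrication) per scenario; this block shows only the enrichment step.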

  17. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
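The idea behind robust outlier detection can be shown in a minimal univariate sketch (the package itself works in the multivariate setting and in R; this Python analogue only illustrates the principle of replacing mean/SD with median/MAD):

```python
import numpy as np

# Flag points whose robust z-score, based on the median and the median
# absolute deviation (MAD) rather than the outlier-sensitive mean and SD,
# exceeds a cutoff. The 0.6745 factor makes the MAD consistent with the
# standard deviation for normally distributed data.

def robust_outliers(x, cutoff=3.5):
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    z = 0.6745 * (x - med) / mad
    return np.flatnonzero(np.abs(z) > cutoff)

print(robust_outliers([1, 2, 3, 2, 1, 100]))  # [5]
```

A classical z-score based on the mean would be dragged toward the outlier itself; the median/MAD version flags index 5 cleanly.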

  18. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which PET is an estimate. The FAT-PET MRA is a meta-regression analysis...
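The FAT-PET regression can be sketched on simulated data: each reported estimate is modeled as b_i = β0 + β1·SE_i + ε_i, where a test on the slope β1 is the FAT and the intercept β0 is the PET estimate of the true effect. This is an illustrative simulation, not the authors' code:

```python
import numpy as np

# Simulate 200 studies of a true effect of 0.5 with no publication bias,
# so the funnel should be symmetric (beta1 near 0) and PET near 0.5.
rng = np.random.default_rng(42)
n = 200
se = rng.uniform(0.02, 1.0, n)             # studies' standard errors
b = 0.5 + se * rng.standard_normal(n)      # reported estimates

# FAT-PET meta-regression b_i = beta0 + beta1 * SE_i + e_i, estimated by
# weighted least squares with weights 1/SE^2, as is standard for FAT-PET.
X = np.column_stack([np.ones(n), se])
Xw, yw = X / se[:, None], b / se
(beta0, beta1), *_ = np.linalg.lstsq(Xw, yw, rcond=None)

print(f"PET estimate {beta0:.2f}, FAT slope {beta1:.2f}")
```

With publication bias, small-SE studies would cluster near the true effect while large-SE studies drift upward, producing a significant FAT slope.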

  19. Tools for advanced simulations to nuclear propulsion systems in rockets

    Energy Technology Data Exchange (ETDEWEB)

    Torres Sepulveda, A.; Perez Vara, R.

    2004-07-01

    While chemical propulsion rockets have dominated space exploration, other forms of rocket propulsion based on nuclear power, electrostatic and magnetic drive, and other principles besides chemical reactions have been considered from the earliest days of the field. The goal of most of these advanced rocket propulsion schemes is improved efficiency through higher exhaust velocities, in order to reduce the amount of fuel the rocket vehicle needs to carry, though generally at the expense of high thrust. Nuclear propulsion seems to be the most promising short-term technology for planning realistic interplanetary missions. The development of a nuclear electric propulsion spacecraft shall require the development of models to analyse the mission and to understand the interaction between the related subsystems (nuclear reactor, electrical converter, power management and distribution, and electric propulsion) during the different phases of the mission. This paper explores the modelling of a nuclear electric propulsion (NEP) spacecraft type using EcosimPro simulation software. This software is a multi-disciplinary simulation tool with a powerful object-oriented simulation language and state-of-the-art solvers. EcosimPro is the recommended ESA simulation tool for Environmental Control and Life Support Systems (ECLSS) and has been used successfully within the framework of the European activities of the International Space Station programme. Furthermore, propulsion libraries for chemical and electrical propulsion are currently being developed under ESA contracts to establish this tool as standard usage in the propulsion community. At present, there is no workable NEP spacecraft, but a standardized-modular, multi-purpose interplanetary spacecraft for post-2000 missions, called ISC-2000, has been proposed in reference. The simulation model presented in this paper is based on the preliminary designs for this spacecraft. (Author)

  20. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    Science.gov (United States)

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. The tools currently used to assess the risk of pressure ulcer development are rarely evaluated for predictive accuracy, especially in older adults. The current study aimed at providing a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. The overall predictive validities of pressure ulcer risk, in terms of pooled sensitivity and specificity, indicated a similar range with a moderate accuracy level for all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, representing the primary cause of heterogeneity. Results indicate that commonly used screening tools for pressure ulcer risk have limitations regarding validity and accuracy for use with older adults due to heterogeneity among studies.
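The pooled quantities referred to in this abstract come from each study's 2×2 table of test outcomes. A naive fixed-pooling sketch with hypothetical counts follows; meta-analyses of diagnostic accuracy typically use bivariate random-effects models rather than simple count pooling, precisely because of the heterogeneity the abstract describes:

```python
# Each study reports a 2x2 table (TP, FN, FP, TN) at its chosen risk cut-off.
# Hypothetical counts for three studies; naive pooling sums the cells.
studies = [
    (40, 10, 20, 80),
    (30,  5, 15, 60),
    (25, 10, 10, 55),
]

tp = sum(s[0] for s in studies)
fn = sum(s[1] for s in studies)
fp = sum(s[2] for s in studies)
tn = sum(s[3] for s in studies)

sensitivity = tp / (tp + fn)  # proportion of ulcer cases flagged as at-risk
specificity = tn / (tn + fp)  # proportion of non-cases correctly cleared

print(round(sensitivity, 3), round(specificity, 3))  # 0.792 0.812
```

Differing cut-off points across studies, as reported for the Braden Scale, shift each study's trade-off between these two quantities, which is one reason pooled estimates show high heterogeneity.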