WorldWideScience

Sample records for program analysis tool

  1. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language...... is an obstacle for the development of these. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor....... In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type...
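
    For readers unfamiliar with the technique, a minimal sketch of what a static pointer analysis computes follows. It is an Andersen-style, flow-insensitive fixed point over four toy statement forms; it is not the dissertation's algorithm, and the input encoding is invented for illustration.

        from collections import defaultdict

        def points_to(stmts):
            """Flow-insensitive points-to sets via fixed-point iteration."""
            pts = defaultdict(set)    # variable -> allocation sites it may point to
            heap = defaultdict(set)   # (site, field) -> allocation sites
            changed = True
            while changed:
                changed = False

                def flow(target, new):      # add facts; remember if anything grew
                    nonlocal changed
                    if not new <= target:
                        target |= new
                        changed = True

                for op, *args in stmts:
                    if op == "new":         # x = {}      args: x, allocation site
                        flow(pts[args[0]], {args[1]})
                    elif op == "copy":      # x = y       args: x, y
                        flow(pts[args[0]], pts[args[1]])
                    elif op == "load":      # x = y.f     args: x, y, f
                        x, y, f = args
                        flow(pts[x], set().union(*[heap[(o, f)] for o in pts[y]]))
                    elif op == "store":     # x.f = y     args: x, f, y
                        x, f, y = args
                        for o in list(pts[x]):
                            flow(heap[(o, f)], pts[y])
            return pts

        # a = {}; b = a; b.f = a; c = b.f  =>  c may point to the object from site "A1"
        stmts = [("new", "a", "A1"), ("copy", "b", "a"),
                 ("store", "b", "f", "a"), ("load", "c", "b", "f")]
        print(dict(points_to(stmts)))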

  2. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  3. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  4. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools...

  5. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with two in-house and eight commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
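
    As background, a minimal sketch of the gamma computation the paper evaluates follows: the standard global 1-D gamma combining the DD and DTA criteria. This is a textbook brute-force version, not any of the evaluated packages, and the profile data are invented.

        import numpy as np

        def gamma_1d(ref_dose, eval_dose, positions, dd_pct=3.0, dta_mm=3.0):
            """Global 1-D gamma index, brute-force search over reference points."""
            norm = dd_pct / 100.0 * ref_dose.max()   # global dose normalization
            gammas = []
            for xe, de in zip(positions, eval_dose):
                # squared gamma against every reference point; keep the minimum
                g2 = ((de - ref_dose) / norm) ** 2 + ((xe - positions) / dta_mm) ** 2
                gammas.append(np.sqrt(g2.min()))
            return np.array(gammas)

        # Toy example: a 1 mm shifted Gaussian profile; pass rate = fraction gamma <= 1
        x = np.arange(0.0, 50.0, 1.0)                    # positions in mm
        ref = np.exp(-((x - 25.0) / 8.0) ** 2) * 100.0   # reference profile
        ev = np.exp(-((x - 26.0) / 8.0) ** 2) * 100.0    # evaluated profile
        g = gamma_1d(ref, ev, x)
        print(f"pass rate at 3%/3mm: {100.0 * (g <= 1.0).mean():.1f}%")

    Note that with coarse sampling the brute-force minimum overestimates gamma, which is consistent with the resolution effect the study investigates.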

  6. Program Management Tool

    Science.gov (United States)

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil

    2007-01-01

    The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity

  7. A System Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
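
    Once dependence edges are extracted, the question the abstract poses reduces to graph reachability. A minimal sketch follows; it is not the Sandia tool itself, and the multi-domain dependence edges are invented.

        from collections import defaultdict, deque

        def forward_slice(deps, seed):
            """deps: edges (u, v) meaning 'a change to u may change v'.
            Returns every entity reachable from seed plus one witness path each."""
            graph = defaultdict(list)
            for u, v in deps:
                graph[u].append(v)
            parent, queue = {seed: None}, deque([seed])
            while queue:                      # breadth-first search from the seed
                u = queue.popleft()
                for v in graph[u]:
                    if v not in parent:
                        parent[v] = u
                        queue.append(v)

            def path(v):                      # reconstruct the witness path
                out = []
                while v is not None:
                    out.append(v)
                    v = parent[v]
                return list(reversed(out))

            return {v: path(v) for v in parent if v != seed}

        # Toy multi-domain dependence edges: software variables plus a
        # non-program entity ("sensor") wired into the same graph.
        deps = [("sensor", "raw"), ("raw", "calib"), ("calib", "alarm"), ("raw", "log")]
        for var, p in forward_slice(deps, "sensor").items():
            print(var, "via", " -> ".join(p))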

  8. Tools for Ensuring Program Integrity.

    Science.gov (United States)

    Office of Student Financial Assistance (ED), Washington, DC.

    This training document for financial assistance professionals discusses ensuring program integrity in student financial aid and describes some tools for ensuring internal and external program integrity. The training focuses on these tools and resources: (1) the Federal Student Aid (FSA) Schools Portal; (2) the Information for Financial Aid…

  9. The Environment-Power System Analysis Tool development program. [for spacecraft power supplies

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  10. A national analytical quality assurance program: Developing guidelines and analytical tools for the forest inventory and analysis program

    Science.gov (United States)

    Phyllis C. Adams; Glenn A. Christensen

    2012-01-01

    A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State’s data to the national FIA...

  11. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    Science.gov (United States)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result, there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
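
    As an illustration of the quantitative core of FMEA, the sketch below ranks failure modes by the conventional risk priority number, RPN = severity x occurrence x detectability. The radiotherapy failure modes and the 1-10 scale values are invented.

        # Hypothetical failure modes for a radiotherapy process step; the
        # severity (S), occurrence (O), and detectability (D) values are invented.
        failure_modes = [
            {"mode": "wrong patient selected",      "S": 9, "O": 2, "D": 3},
            {"mode": "plan exported to wrong unit", "S": 8, "O": 3, "D": 4},
            {"mode": "couch shift mis-entered",     "S": 7, "O": 4, "D": 2},
        ]

        for fm in failure_modes:
            fm["RPN"] = fm["S"] * fm["O"] * fm["D"]   # risk priority number

        # Rank highest-risk modes first to prioritize mitigation effort.
        for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
            print(f"RPN {fm['RPN']:>3}  {fm['mode']}")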

  12. Draper Station Analysis Tool

    Science.gov (United States)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  13. A Ship Collision Analysis Program Based on Upper Bound Solutions and Coupled with a Large Rotational Ship Movement Analysis Tool

    Directory of Open Access Journals (Sweden)

    Hervé Le Sourne

    2012-01-01

    Full Text Available This paper presents a user-friendly tool for rapid prediction of damage to struck and striking vessels in a ship collision event. To do this, the so-called upper bound theorem is applied to calculate internal forces and energies of any substructure involved in the ship crushing process. At each increment of indentation, the total crushing force is transmitted to the external dynamics MCOL program, which calculates the global ship motion correction by solving the hydrodynamic force equilibrium equations. As a first step, the paper gives a brief description of the upper bound method, originally developed for perpendicular collisions and recently enhanced for oblique ones. Then, the theory developed in the MCOL program for large rotational ship movements is detailed. By comparing results obtained with and without MCOL, the importance of hydrodynamic effects is highlighted. Some simulation results are compared with results provided by classical nonlinear finite element calculations. Finally, by using the developed analytical tool, which mixes internal and external dynamics, different crushing scenarios including oblique collisions are investigated and the influence of some collision parameters like longitudinal and vertical impact location, impact angle, and struck ship velocity is studied.

  14. Investment in selective social programs: a proposed methodological tool for the analysis of programs’ sustainability

    Directory of Open Access Journals (Sweden)

    Manuel Antonio Barahona Montero

    2014-08-01

    Full Text Available This paper proposes a methodology to evaluate the sustainability of Selective Social Programs (SSP), based on the relationship between economic growth and human development proposed by the United Nations Development Program (UNDP). For such purposes, the Circle of Sustainability is developed, which comprises 12 pillars. Each pillar is evaluated based on its current status and impact. Combining both results makes it possible to assess the sustainability of these programs and identify areas of focus. This methodology therefore helps to better channel available efforts and resources.

  15. The Capability Portfolio Analysis Tool (CPAT): A Mixed Integer Linear Programming Formulation for Fleet Modernization Analysis (Version 2.0.2).

    Energy Technology Data Exchange (ETDEWEB)

    Waddell, Lucas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Muldoon, Frank [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Henry, Stephen Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Matthew John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zwerneman, April Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Backlund, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melander, Darryl J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lawton, Craig R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rice, Roy Eugene [Teledyne Brown Engineering, Huntsville, AL (United States)

    2017-09-01

    In order to effectively plan the management and modernization of their large and diverse fleets of vehicles, Program Executive Office Ground Combat Systems (PEO GCS) and Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet - respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This paper contains a thorough documentation of the terminology, parameters, variables, and constraints that comprise the fleet management mixed integer linear programming (MILP) mathematical formulation. This paper, which is an update to the original CPAT formulation document published in 2015 (SAND2015-3487), covers the formulation of important new CPAT features.
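
    A toy stand-in can make the MILP framing concrete. The sketch below, written with the PuLP modeling library, schedules at most one modernization per vehicle group under a yearly budget; it is not the CPAT formulation, and every group, cost, and performance figure is invented.

        import pulp

        # Vehicle group -> (modernization cost in $M, performance gain); invented data.
        groups = {"IFV-A": (40, 120.0), "Tank-B": (65, 200.0), "Truck-C": (25, 60.0)}
        years, budget = [1, 2, 3], 70   # planning horizon and per-year budget in $M

        prob = pulp.LpProblem("fleet_modernization", pulp.LpMaximize)
        x = {(g, t): pulp.LpVariable(f"x_{g}_{t}", cat="Binary")
             for g in groups for t in years}

        # Objective: total performance gain over the planning horizon.
        prob += pulp.lpSum(groups[g][1] * x[g, t] for g in groups for t in years)

        # Business rules: modernize each group at most once; respect the yearly budget.
        for g in groups:
            prob += pulp.lpSum(x[g, t] for t in years) <= 1
        for t in years:
            prob += pulp.lpSum(groups[g][0] * x[g, t] for g in groups) <= budget

        prob.solve()
        for (g, t), var in x.items():
            if var.value() == 1:
                print(f"modernize {g} in year {t}")

    The real formulation adds many more rule families (industrial base, research and testing, fleet structure), but they enter the model the same way: as linear constraints over binary scheduling variables.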

  16. CWDPRNP: A tool for cervid prion sequence analysis in program R

    Science.gov (United States)

    Miller, William L.; Walter, William D.

    2017-01-01

    Chronic wasting disease is a fatal, neurological disease caused by an infectious prion protein, which affects economically and ecologically important members of the family Cervidae. Single nucleotide polymorphisms within the prion protein gene have been linked to differential susceptibility to the disease in many species. Wildlife managers are seeking to determine the frequencies of disease-associated alleles and genotypes and delineate spatial genetic patterns. The CWDPRNP package, implemented in program R, provides a unified framework for analyzing prion protein gene variability and spatial structure.

  17. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  18. CWDPRNP: a tool for cervid prion sequence analysis in program R.

    Science.gov (United States)

    Miller, William L; Walter, W David

    2017-10-01

    Chronic wasting disease is a fatal, neurological disease caused by an infectious prion protein, which affects economically and ecologically important members of the family Cervidae. Single nucleotide polymorphisms within the prion protein gene have been linked to differential susceptibility to the disease in many species. Wildlife managers are seeking to determine the frequencies of disease-associated alleles and genotypes and delineate spatial genetic patterns. The CWDPRNP package, implemented in program R, provides a unified framework for analyzing prion protein gene variability and spatial structure. The CWDPRNP package, manual and example data files are available at http://ecosystems.psu.edu/research/labs/walter-lab/additional-labs/population-genetics-lab. This package is available for all commonly used platforms. wlm159psu@gmail.com.
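
    CWDPRNP itself is an R package; as an illustration of the kind of summary it automates, the Python sketch below computes genotype and allele frequencies from hypothetical PRNP codon-96 genotype calls (the data are invented).

        from collections import Counter

        # Hypothetical PRNP codon-96 genotypes for sampled deer; 'G' is the
        # common allele and 'S' a variant allele (invented sample).
        genotypes = ["GG", "GS", "GG", "SS", "GS", "GG", "GS", "GG"]

        geno_freq = {g: n / len(genotypes) for g, n in Counter(genotypes).items()}
        alleles = Counter(a for g in genotypes for a in g)
        total = sum(alleles.values())
        allele_freq = {a: n / total for a, n in alleles.items()}

        print("genotype frequencies:", geno_freq)
        print("allele frequencies:  ", allele_freq)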

  19. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  20. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; a Java RGA extractor, which can load in multiple SRS .ana files and extract pressure vs. time data; and a C++ contamination simulation code, a 3D particle tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine if molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  1. The Vicinity of Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    Program documentation plays a vital role in almost all programming processes.  Program documentation flows between separate tools of a modularized environment, and in between the components of an integrated development environment as well.  In this paper we discuss the flow of program documentation...... between program development tools.  In the central part of the paper we introduce a mapping of documentation flow between program development tools.  In addition we discuss a set of locally developed tools which is related to program documentation.  The use of test cases as examples in an interface...... documentation tool is a noteworthy and valuable contribution to the documentation flow.  As an additional contribution we identify several circular relationships which illustrate feedback of documentation to the program editor from other tools in the development environment....

  2. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  3. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
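
    A simplified sketch of the headline calculation follows, in the spirit of the BAL-003-1 frequency response measure rather than the FRAT implementation. The event numbers are invented, and by NERC convention the result is negative for response to a frequency decline.

        # Simplified frequency-response calculation (not the FRAT implementation):
        # response = change in net interchange divided by change in frequency,
        # scaled to the conventional unit of MW per 0.1 Hz.
        def frequency_response(nia_pre_mw, nia_post_mw, f_pre_hz, f_post_hz):
            delta_p = nia_post_mw - nia_pre_mw     # MW picked up by the balancing authority
            delta_f = f_post_hz - f_pre_hz         # Hz (negative for a generation-loss event)
            return (delta_p / delta_f) * 0.1       # MW / 0.1 Hz (negative by convention)

        # Hypothetical under-frequency event: a balancing authority picks up 90 MW
        # as frequency settles from 60.000 Hz to 59.950 Hz.
        print(f"{frequency_response(-1200.0, -1110.0, 60.000, 59.950):.1f} MW/0.1 Hz")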

  4. Defense Acquisition University Program Managers Tool Kit

    Science.gov (United States)

    2011-01-01

    Only table-of-contents fragments were captured for this record: Program Office Organization Structures (examples), pp. 55-56, and the Life Cycle Logistics Guidebook (to be posted on the LOG CoP).

  5. Basic Tools: Program Listing Processor.

    Science.gov (United States)

    Pizarro, Antonio

    1988-01-01

    Presents a program that provides a structured listing of BASIC computer programs in a text file. Indents for-next loops for better appearance and easier understanding. Lists program and provides for several versions of BASIC. (MVL)

  6. Graphical Multiprocessing Analysis Tool (GMAT)

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M.K.; Campbell, S.; Sikora, S.; Strout, R.; Zosel, M.

    1988-03-01

    The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development we have developed two methodologies for the graphical display of execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states, and timeline, which represents tasks as a flowing sequence of events. Information about the code can be displayed as the application runs (dynamic mode) or played back with time under user control (static mode). This document discusses the design and user interface issues involved in developing the parallel application display GMAT family. Also, we present an introductory user's guide for both tools. 4 figs.

  7. Stability analysis using SDSA tool

    Science.gov (United States)

    Goetzendorf-Grabowski, Tomasz; Mieszalski, Dawid; Marcinkiewicz, Ewa

    2011-11-01

    The SDSA (Simulation and Dynamic Stability Analysis) application is presented as a tool for analysing the dynamic characteristics of the aircraft as early as the conceptual design stage. SDSA is part of the CEASIOM (Computerized Environment for Aircraft Synthesis and Integrated Optimization Methods) software environment which was developed within the SimSAC (Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design) project, funded by the European Commission 6th Framework Program. SDSA can also be used as stand-alone software, and integrated with other design and optimisation systems using software wrappers. This paper focuses on the main functionalities of SDSA and presents both computational and free flight experimental results to compare and validate the presented software. Two aircraft are considered, the EADS Ranger 2000 and the Warsaw University designed PW-6 glider. For the two cases considered here the SDSA software is shown to be an excellent tool for predicting dynamic characteristics of an aircraft.

  8. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work, completed using these tools, toward a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  9. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software and data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. They are globally merged pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  10. Space program management methods and tools

    CERN Document Server

    Spagnulo, Marcello; Balduccini, Mauro; Nasini, Federico

    2013-01-01

    Beginning with the basic elements that differentiate space programs from other management challenges, Space Program Management explains, through theory and examples of real programs from around the world, the philosophical and technical tools needed to successfully manage large, technically complex space programs in both the government and commercial environments. Chapters address both systems and configuration management, the management of risk, estimation, measurement and control of both funding and the program schedule, and the structure of the aerospace industry worldwide.

  11. Program risk analysis handbook

    Science.gov (United States)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
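
    One of the simpler techniques such a handbook surveys can be sketched directly: a Monte Carlo roll-up of subjective three-point cost estimates. The WBS elements and triangular parameters below are invented, and this is not the handbook's specific method.

        import random

        # Each WBS element carries a subjective (low, most likely, high) cost
        # estimate in $M, elicited from the relevant area expert (invented values).
        elements = {
            "propulsion": (40, 55, 90),
            "avionics":   (25, 30, 50),
            "structure":  (18, 20, 28),
        }

        def total_cost():
            # random.triangular takes (low, high, mode); our tuples are (lo, mode, hi)
            return sum(random.triangular(lo, hi, mode)
                       for lo, mode, hi in elements.values())

        random.seed(1)
        draws = sorted(total_cost() for _ in range(10_000))
        print(f"P50 = {draws[5000]:.1f} $M, P80 = {draws[8000]:.1f} $M")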

  12. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  13. BEAP: The BLAST Extension and Alignment Program - a tool for contig construction and analysis of preliminary genome sequence

    Directory of Open Access Journals (Sweden)

    Fritz Eric

    2009-01-01

    Full Text Available Background: Fine-mapping projects require a high density of SNP markers and positional candidate gene sequences. In species with incomplete genomic sequence, the DNA sequences needed to generate markers for fine-mapping within a linkage analysis confidence interval may be available but may not have been assembled. To manually piece these sequences together is laborious and costly. Moreover, annotation and assembly of short, incomplete DNA sequences is time consuming and not always straightforward. Findings: We have created a tool called BEAP that combines BLAST and CAP3 to retrieve sequences and construct contigs for localized genomic regions in species with unfinished sequence drafts. The rationale is that a completed genome can be used as a template to query target genomic sequence for closing gaps or extending contig sequence length in a species whose genome is incomplete, on the basis that good homology exists. Each user must define what template sequence is appropriate based on comparative mapping data such as radiation hybrid (RH) maps or other evidence linking the gene sequence of the template species to the target species. Conclusion: The BEAP software creates contigs suitable for discovery of orthologous genes for positional cloning. The resulting sequence alignments can be viewed graphically with a Java graphical user interface (GUI), allowing users to evaluate contig sequence quality and predict SNPs. We demonstrate the successful use of BEAP to generate genomic template sequence for positional cloning of the Angus dwarfism mutation. The software is available for free online for use on UNIX systems at http://www.animalgenome.org/bioinfo/tools/beap/.

  14. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culverts

    Science.gov (United States)

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output.

  15. Building program understanding tools using visitor combinators

    NARCIS (Netherlands)

    A. van Deursen (Arie); J.M.W. Visser (Joost)

    2002-01-01

    Program understanding tools manipulate program representations, such as abstract syntax trees, control-flow graphs, or data-flow graphs. This paper deals with the use of visitor combinators to conduct such manipulations. Visitor combinators are an extension of the well-known

  16. Co-authorship Network Analysis: A Powerful Tool for Strategic Planning of Research, Development and Capacity Building Programs on Neglected Diseases

    Science.gov (United States)

    Morel, Carlos Medicis; Serruya, Suzanne Jacob; Penna, Gerson Oliveira; Guimarães, Reinaldo

    2009-01-01

    Background: New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases with strong focus on the North, Northeast and Center-West regions of the country where these diseases are prevalent. Methodology/Principal Findings: Based on demographic, epidemiological and burden of disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize authors' and institutions' names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program, enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points and readily detect authors or institutions participating in large international scientific collaborating networks. Conclusions/Significance: Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded a more proactive role of the funding agencies in relation to public health and equity goals, to scientific capacity building objectives and a more
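
    A minimal sketch of this style of analysis follows, using the networkx library: build a co-authorship graph from author lists, then look for hubs via betweenness centrality and critical cut-points via articulation points. The papers and author names are invented.

        import itertools
        import networkx as nx

        # Hypothetical papers, each given as its author list (invented names).
        papers = [
            ["Silva", "Souza", "Lima"],
            ["Silva", "Costa"],
            ["Costa", "Pereira"],
            ["Lima", "Souza"],
            ["Pereira", "Oliveira"],
        ]

        G = nx.Graph()
        for authors in papers:
            for a, b in itertools.combinations(authors, 2):
                w = G.get_edge_data(a, b, {"weight": 0})["weight"]
                G.add_edge(a, b, weight=w + 1)   # weight = number of joint papers

        centrality = nx.betweenness_centrality(G)
        hubs = sorted(centrality, key=centrality.get, reverse=True)[:2]
        print("central hubs:", hubs)
        print("cut-point authors:", list(nx.articulation_points(G)))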

  17. Co-authorship network analysis: a powerful tool for strategic planning of research, development and capacity building programs on neglected diseases.

    Science.gov (United States)

    Morel, Carlos Medicis; Serruya, Suzanne Jacob; Penna, Gerson Oliveira; Guimarães, Reinaldo

    2009-08-18

    New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases with strong focus on the North, Northeast and Center-West regions of the country where these diseases are prevalent. Based on demographic, epidemiological and burden of disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize authors' and institutions' names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program, enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points and readily detect authors or institutions participating in large international scientific collaborating networks. Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded a more proactive role of the funding agencies in relation to public health and equity goals, to scientific capacity building objectives and a more consistent engagement of institutions and authors from endemic regions

  18. Co-authorship network analysis: a powerful tool for strategic planning of research, development and capacity building programs on neglected diseases.

    Directory of Open Access Journals (Sweden)

    Carlos Medicis Morel

    Full Text Available BACKGROUND: New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases with strong focus on the North, Northeast and Center-West regions of the country where these diseases are prevalent. METHODOLOGY/PRINCIPAL FINDINGS: Based on demographic, epidemiological and burden of disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize authors' and institutions' names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program, enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points and readily detect authors or institutions participating in large international scientific collaborating networks. CONCLUSIONS/SIGNIFICANCE: Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded a more proactive role of the funding agencies in relation to public health and equity goals, to scientific capacity building

  19. Parametric programming of CNC machine tools

    Directory of Open Access Journals (Sweden)

    Gołębski Rafał

    2017-01-01

    Full Text Available The article presents the possibilities of parametric programming of CNC machine tools for the SINUMERIK 840D sl control system. The kinds and types of variable definitions for the control system under discussion are described. Parametric programming possibilities are shown using the example of the longitudinal cutting cycle. The program's code and its implementation in the control system are described in detail. The principle of parametric programming in a high-level language is also explained.
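
    A minimal sketch of the idea follows: the roughing passes of a longitudinal cutting cycle are computed from parameters instead of being written block by block. It emits generic G-code rather than SINUMERIK 840D sl cycle syntax, and the part dimensions and cutting parameters are invented.

        # Illustrative parametric turning-cycle generator (generic G-code words only).
        def longitudinal_cycle(d_start, d_final, z_end, depth_per_pass, feed):
            lines, d = ["G90 G95"], d_start            # absolute mode, feed per rev
            while d - d_final > 1e-9:
                d = max(d - 2 * depth_per_pass, d_final)   # diameter shrinks 2x depth
                lines += [f"G0 X{d:.3f} Z2.0",         # rapid to pass diameter
                          f"G1 Z{z_end:.3f} F{feed}",  # cut along Z
                          f"G0 X{d + 1.0:.3f}",        # retract clear of the work
                          "G0 Z2.0"]                   # back to the start plane
            return lines

        # Rough from 50 mm down to 44 mm diameter, 60 mm long, 1.5 mm depth per pass.
        for block in longitudinal_cycle(50.0, 44.0, -60.0, 1.5, 0.2):
            print(block)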

  20. Sandia Agile MEMS Prototyping, Layout Tools, Education and Services Program

    Energy Technology Data Exchange (ETDEWEB)

    Schriner, H.; Davies, B.; Sniegowski, J.; Rodgers, M.S.; Allen, J.; Shepard, C.

    1998-05-01

    Research and development in the design and manufacture of Microelectromechanical Systems (MEMS) is growing at an enormous rate. Advances in MEMS design tools and fabrication processes at Sandia National Laboratories' Microelectronics Development Laboratory (MDL) have broadened the scope of MEMS applications that can be designed and manufactured for both military and commercial use. As improvements in micromachining fabrication technologies continue to be made, MEMS designs can become more complex, thus opening the door to an even broader set of MEMS applications. In an effort to further research and development in MEMS design, fabrication, and application, Sandia National Laboratories has launched the Sandia Agile MEMS Prototyping, Layout Tools, Education and Services Program or SAMPLES program. The SAMPLES program offers potential partners interested in MEMS the opportunity to prototype an idea and produce hardware that can be used to sell a concept. The SAMPLES program provides education and training on Sandia's design tools, analysis tools and fabrication process. New designers can participate in the SAMPLES program and design MEMS devices using Sandia's design and analysis tools. As part of the SAMPLES program, participants' designs are fabricated using Sandia's 4 level polycrystalline silicon surface micromachine technology fabrication process known as SUMMiT (Sandia Ultra-planar, Multi-level MEMS Technology). Furthermore, SAMPLES participants can also opt to obtain state of the art, post-fabrication services provided at Sandia such as release, packaging, reliability characterization, and failure analysis. This paper discusses the components of the SAMPLES program.

  1. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale Data production using Grid flavors in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  2. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    Science.gov (United States)

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
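
    The model's final comparison step can be sketched in a few lines: simulate paired cohorts, then report the incremental cost-effectiveness ratio (ICER), the cost difference divided by the QALY difference. The per-patient distributions below are invented and are not outputs of the actual tool.

        import random

        random.seed(7)

        def simulate_pair():
            # Hypothetical per-patient (cost $, QALYs) draws for usual care versus
            # a disease-management program (invented means and spreads).
            usual = (random.gauss(42_000, 4_000), random.gauss(4.10, 0.25))
            program = (random.gauss(45_500, 4_000), random.gauss(4.35, 0.25))
            return usual, program

        pairs = [simulate_pair() for _ in range(10_000)]
        d_cost = sum(p[0] - u[0] for u, p in pairs) / len(pairs)
        d_qaly = sum(p[1] - u[1] for u, p in pairs) / len(pairs)
        print(f"ICER = ${d_cost / d_qaly:,.0f} per QALY gained")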

  3. LCA-ship. Design tool for energy efficient ships. A Life Cycle Analysis Program for Ships. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Jiven, Karl; Sjoebris, Anders [MariTerm AB, Goeteborg (Sweden); Nilsson, Maria [Lund Univ. (Sweden). Stiftelsen TEM; Ellis, Joanne; Traegaardh, Peter; Nordstroem, Malin [SSPA Sweden AB, Goeteborg (Sweden)

    2004-05-01

    In order to make it easier to include aspects during ship design that will improve environmental performance, general methods for life cycle calculations and a prototype tool for LCA calculations of ships and marine transportation have been developed. The base of the life cycle analyses is a comprehensive set of life cycle data that was collected for the materials and consumables used in ship construction and vessel operations. The computer tool developed makes it possible to quickly and simply specify (and calculate) the use of consumables over the vessel's life cycle. Special effort has been made to allow the tool to be used for different types of vessels and sea transport. The main result from the project is the computer tool LCA ship, which incorporates collected and developed life cycle data for some of the most important materials and consumables used in ships and their operation. The computer application also contains a module for propulsion power calculations and a module for defining and optimising the energy system onboard the vessel. The tool itself is described in more detail in the Computer application manual. The input to the application should, as much as possible, be the kind of information that is normally found in a shipping company concerning vessel data and vessel movements. It all starts with defining the ship to be analysed and continues with defining how the ship is used over the lifetime. The tool contains compiled and processed background information about specific materials and processes (LCA data) connected to shipping operations. The LCA data is included in the tool in a processed form. LCA data for steel will for example include the environmental load from the steel production, the process to build the steel structure of the ship, the scrapping and the recycling phase. To be able to calculate the environmental load from the use of steel the total amount of steel used over the life cycle of the ship is also needed. The

  4. VCAT: Visual Crosswalk Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  5. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting...... data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities....

  6. Do Scaffolding Tools Improve Reflective Writing in Professional Portfolios? A Content Analysis of Reflective Writing in an Advanced Preparation Program

    Science.gov (United States)

    Houston, Cynthia R.

    2016-01-01

    Reflective practice is an important skill that teachers must develop to be able to assess the effectiveness of their teaching and modify their instructional behavior. In many education programs reflective narratives, which are often part of teaching portfolios, are intended to assess students' abilities in these areas. Research on reflectivity in…

  7. Assessment of existing local houses condition as analysis tools for shore housing improvement program in Weriagar district, Bintuni Bay

    Science.gov (United States)

    Firmansyah, F.; Fernando, A.; Allo, I. P. R.

    2018-01-01

    The housing assessment is part of the pre-feasibility study in The Shore Housing Improvement Program in Weriagar District, West Papua. The housing assessment was conducted to identify the physical condition of existing houses. The assessment parameters were formulated from local references and practices as well as the national building regulation, covering each building system component, such as the building structure/frame, building floor, building cover, and building roof. This study aims to explain lessons from local practices and references, used as the formula to generate the assessment parameters, elaborated with the Indonesian building regulation. The results of the housing assessment were used as a basis to develop the house improvement strategy, design alternatives for housing improvement, and further planning recommendations. With local knowledge involved in the housing improvement program, it is expected that the local-based approach can respect the local building culture, respect the local environment, and, most importantly, offer the most suitable solutions for functional utility and livability.

  8. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools, with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well-organised tutorials were well attended, and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  9. Tools for occupant protection analysis

    NARCIS (Netherlands)

    Slaats, P.M.A.; Lee, W.; Babu, V.; Thomson, K.R.

    2001-01-01

    The design of occupant restraint systems in the automotive industry has shifted from an empirical approach to a computer-aided analysis approach for many years now. Various finite element software programs have been applied in crash safety analysis, and multi-body dynamics codes have been

  10. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  11. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following topics: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; and Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  12. The Program Sustainability Assessment Tool: a new instrument for public health programs.

    Science.gov (United States)

    Luke, Douglas A; Calhoun, Annaliese; Robichaux, Christopher B; Elliott, Michael B; Moreland-Russell, Sarah

    2014-01-23

    Public health programs can deliver benefits only if they are able to sustain programs, policies, and activities over time. Although numerous sustainability frameworks and models have been developed, there are almost no assessment tools that have demonstrated reliability or validity or have been widely disseminated. We present the Program Sustainability Assessment Tool (PSAT), a new and reliable instrument for assessing the capacity for program sustainability of various public health and other programs. A measurement development study was conducted to assess the reliability of the PSAT. Program managers and staff (n = 592) representing 252 public health programs used the PSAT to rate the sustainability of their program. State and community-level programs participated, representing 4 types of chronic disease programs: tobacco control, diabetes, obesity prevention, and oral health. The final version of the PSAT contains 40 items, spread across 8 sustainability domains, with 5 items per domain. Confirmatory factor analysis shows good fit of the data with the 8 sustainability domains. The subscales have excellent internal consistency; the average Cronbach's α is 0.88, ranging from 0.79 to 0.92. Preliminary validation analyses suggest that PSAT scores are related to important program and organizational characteristics. The PSAT is a new and reliable assessment instrument that can be used to measure a public health program's capacity for sustainability. The tool is designed to be used by researchers, evaluators, program managers, and staff for large and small public health programs.
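
    The subscale figures reported for the PSAT are Cronbach's alpha values, which follow the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). A minimal Python sketch of that computation on one hypothetical 5-item domain (not the authors' code or data):

        import numpy as np

        def cronbach_alpha(scores):
            """scores: respondents x items matrix of ratings."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]                         # items per domain (5 in the PSAT)
            item_vars = scores.var(axis=0, ddof=1)      # variance of each item
            total_var = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
            return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

        # Hypothetical ratings from 6 respondents on one 5-item domain:
        ratings = [[5, 6, 5, 4, 6],
                   [3, 4, 3, 3, 4],
                   [6, 6, 7, 6, 6],
                   [4, 5, 4, 4, 5],
                   [2, 3, 2, 3, 3],
                   [5, 5, 6, 5, 6]]
        print(round(cronbach_alpha(ratings), 2))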

  13. Failure environment analysis tool applications

    Science.gov (United States)

    Pack, Ginger L.; Wadsworth, David B.

    1993-02-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community is determined to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems, since it facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.
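
    The two query directions the abstract describes, what could have caused a failure and what a failure would affect, amount to reachability questions over a cause-effect model. The sketch below uses a plain directed graph with invented event names; it only illustrates the idea and is not FEAT's actual representation.

        # Toy cause -> effect model; node names are invented for illustration.
        failure_graph = {
            "valve_stuck": ["loss_of_coolant"],
            "pump_failure": ["loss_of_coolant"],
            "loss_of_coolant": ["engine_overheat"],
            "engine_overheat": ["engine_shutdown"],
        }

        def effects_of(event, graph):
            """All downstream failures reachable from the given event."""
            seen, stack = set(), [event]
            while stack:
                for nxt in graph.get(stack.pop(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            return seen

        def possible_causes(event, graph):
            """All upstream events that can lead to the given failure."""
            reverse = {}
            for cause, effects in graph.items():
                for e in effects:
                    reverse.setdefault(e, []).append(cause)
            return effects_of(event, reverse)

        print(effects_of("pump_failure", failure_graph))           # downstream effects
        print(possible_causes("engine_shutdown", failure_graph))   # candidate causes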

  14. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values of the user-selected parameter: the mean, standard deviation, median, minimum, and maximum. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.
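
    The statistical values the tool reports, the mean, standard deviation, median, minimum, and maximum of a user-selected parameter, are straightforward to reproduce; a minimal sketch with hypothetical sample values (the TCIS itself serves real instrument data):

        import statistics

        # Hypothetical samples of one storm parameter (e.g., sea surface temperature, deg C).
        values = [28.1, 28.4, 27.9, 29.0, 28.6, 28.2, 28.8]

        print({"mean": statistics.mean(values),
               "std": statistics.stdev(values),
               "median": statistics.median(values),
               "min": min(values),
               "max": max(values)})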

  15. ONTOLOGY-DRIVEN TOOL FOR UTILIZING PROGRAMMING STYLES

    Directory of Open Access Journals (Sweden)

    Nikolay Sidorov

    2017-07-01

    A programmer's work is more effective, and the resulting software more understandable, when programming styles (standards) that provide clarity of program texts are used during software development. Purpose: this research presents a tool that realizes a new ontology-based methodology, with automated reasoning techniques, for applying programming styles. In particular, we focus on representing programming styles as formal ontologies and study how a description logic reasoner can assist programmers in applying programming standards. Our research hypothesis is that an ontological representation of programming styles can provide benefits over existing approaches to applying programming standards. Our research goal is to develop a tool that supports ontology-based application of programming styles. Methods: ontological representation of programming styles; object-oriented programming; ontology-driven application of programming styles. Results: an architecture was designed and a tool was developed in Java that together support the ontology-driven method of applying programming styles. Using the naming rules of the Java coding standard as an example, the implementation and application of the tool are described. Discussion: the application of programming styles when coding programs; the lack of automated tool support for applying programming standards; a tool based on the new method of ontology-driven application of programming styles; an example implementation of the tool architecture for the naming rules of the Java language standard.

  16. Matlab programming for numerical analysis

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. Programming MATLAB for Numerical Analysis introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. You will first become

  17. Flow Analysis Tool White Paper

    Science.gov (United States)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.
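
    The core design, a set of database tables populated with flow records from multiple sources and queried for derived performance figures, can be sketched briefly. The table and column names below are assumptions for illustration, not the paper's actual schema.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""
            CREATE TABLE flows (
                source      TEXT,     -- which collector reported the flow
                src_host    TEXT,
                dst_host    TEXT,
                start_time  TEXT,
                bytes       INTEGER,
                duration_s  REAL
            )""")
        conn.execute("INSERT INTO flows VALUES "
                     "('netflow', 'hostA', 'hostB', '2012-01-01T00:00', 10000000, 12.5)")

        # Throughput in Mbit/s: the kind of derived figure an engineer would inspect.
        for row in conn.execute(
                "SELECT src_host, dst_host, bytes * 8.0 / duration_s / 1e6 FROM flows"):
            print(row)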

  18. Artwork Analysis Tools for VLSI Circuits.

    Science.gov (United States)

    1980-06-01

    Artwork Analysis Tools for VLSI Circuits... code of the program and in pre-generated bit tables. The design rules themselves are not input directly into the checker. The rules were interpreted... circuit simulation is switch-level simulation. In this type, transistors are modeled as switches that are either on or off. Fixed delays are associated

  19. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.

  20. Cmapanalysis: an extensible concept map analysis tool

    Directory of Open Access Journals (Sweden)

    Alberto J. Cañas

    2013-03-01

    Concept maps are used extensively as an assessment tool, and the literature is abundant with studies on the use of concept maps for assessment and on the assessment of concept maps. The assessment of concept maps can be an arduous process, in particular when assessing a large number of maps. CmapAnalysis is a software tool that facilitates performing various analysis measures on a collection of concept maps. A set of measures that consider size, quality and structure properties of the maps are included. The program is designed to be extensible, allowing users to add their own measures. The program is not intended to replace the individual evaluation of concept maps by teachers and instructors, as it is not capable of “understanding” the content of the maps. It is aimed at researchers who are looking for more general trends and measures across a large number of maps, and who can extend it with their own measures. The output of CmapAnalysis is an Excel spreadsheet that can be further analyzed.
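
    The extensibility described above, users plugging their own measures into the analysis, can be sketched as a measure registry. This Python sketch only illustrates the idea; the actual CmapAnalysis measures and its Excel output are not reproduced here.

        # A dict of concepts and links stands in for a parsed concept map.
        MEASURES = {}

        def measure(name):
            """Register a user-defined measure under a name."""
            def register(fn):
                MEASURES[name] = fn
                return fn
            return register

        @measure("concepts")
        def n_concepts(cmap):
            return len(cmap["concepts"])

        @measure("propositions")
        def n_propositions(cmap):
            return len(cmap["links"])

        cmap = {"concepts": ["plant", "water", "sun"],
                "links": [("plant", "needs", "water"), ("plant", "needs", "sun")]}
        print({name: fn(cmap) for name, fn in MEASURES.items()})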

  1. A Simulation Tool for tccp Programs

    Directory of Open Access Journals (Sweden)

    María-del-Mar Gallardo

    2017-01-01

    The Timed Concurrent Constraint Language tccp is a declarative synchronous concurrent language, particularly suitable for modelling reactive systems. In tccp, agents communicate and synchronise through a global constraint store. The language supports a notion of discrete time that allows all non-blocked agents to proceed with their execution simultaneously. In this paper, we present a modular architecture for the simulation of tccp programs. The tool comprises three main components: first, a set of basic abstract instructions able to model the tccp agent behaviour, together with the memory model needed to manage the active agents and the state of the store during execution; second, the agent interpreter, which executes the instructions of the current agent iteratively and computes the new agents to be executed at the next time instant; finally, the constraint solver components, which are the modules that deal with constraints. We describe the implementation of these components and present an example of a real system modelled in tccp.

  2. Biodiesel Emissions Analysis Program

    Science.gov (United States)

    Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.

  3. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides communities with a reference to existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  4. Program plan recognition for year 2000 tools

    NARCIS (Netherlands)

    A. van Deursen (Arie); S. Woods; A. Quilici

    1997-01-01

    There are many commercial tools that address various aspects of the Year 2000 problem. None of these tools, however, makes any documented use of plan-based techniques for automated concept recovery. This implies a general perception that plan-based techniques are not useful for this

  5. Program Design Report of the CNC Machine Tool(II)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Kiun; Youm, K. U.; Kim, K. S.; Lee, I. B.; Yoon, K. B.; Lee, C. K.; Youm, J. H

    2007-06-15

    The application of CNC machine tools is expanding widely, in line with the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving manpower shortages. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  6. Program Design Report of the CNC Machine Tool (I)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Kiun; Youm, K. U.; Kim, K. S. (and others)

    2006-08-15

    The application of CNC machine tools is expanding widely, in line with the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving manpower shortages. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  7. Program Design Report of the CNC Machine Tool(III)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Kiun; Youm, K. U.; Kim, K. S.; Lee, I. B.; Yoon, K. B.; Lee, C. K.; Youm, J. H

    2008-08-15

    The application of CNC machine tools is expanding widely, in line with the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving manpower shortages. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  8. Program Design Report of the CNC Machine Tool(IV)

    Energy Technology Data Exchange (ETDEWEB)

    Youm, Ki Un; Lee, I. B.; Youm, J. H

    2009-09-15

    The application of CNC machine tools is expanding widely, in line with the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving manpower shortages. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  9. General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Overview: GMAT is a feature-rich system containing high-fidelity space system models, optimization and targeting, built-in scripting and programming infrastructure, and...

  10. General Mission Analysis Tool (GMAT) Mathematical Specifications

    Science.gov (United States)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  11. Analytical Tools for Affordability Analysis

    Science.gov (United States)

    2015-04-30

    able to model uncertainty due to new programs arriving into the portfolio, the problem is even worse. Monte Carlo methods seem like the most...cost growth in a particular program. Future versions of APASS will include Monte Carlo modeling capabilities, to simulate the potential consequences...flunk this basic test from their inception. —Honorable Ashton B. Carter (2010), Under Secretary of Defense for Acquisition, Technology, and Logistics

  12. Multi-mission telecom analysis tool

    Science.gov (United States)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  13. Program Instrumentation and Trace Analysis

    Science.gov (United States)

    Havelund, Klaus; Goldberg, Allen; Filman, Robert; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2002-01-01

    Several attempts have been made recently to apply techniques such as model checking and theorem proving to the analysis of programs. This can be seen as part of a current trend towards analyzing real software systems instead of just their designs. It includes our own effort to develop a model checker for Java, the Java PathFinder 1, one of the very first of its kind, in 1998. However, model checking cannot handle very large programs without some kind of abstraction of the program. This paper describes a complementary, scalable technique for handling such large programs. Our interest is in the observation part of the equation: how much information can be extracted about a program from observing a single execution trace? Our intention is to develop a technology that can be applied automatically to large, full-size applications, with minimal modification to the code. We present a tool, Java PathExplorer (JPaX), for exploring execution traces of Java programs. The tool prioritizes scalability over completeness, and is directed towards detecting errors in programs, not towards proving correctness. One core element in JPaX is an instrumentation package that allows Java byte code files to be instrumented to log various events when executed. The instrumentation is driven by a user-provided script that specifies what information to log. Examples of instructions that such a script can contain are: 'report the name and arguments of all called methods defined in class C, together with a timestamp'; 'report all updates to all variables'; and 'report all acquisitions and releases of locks'. In more complex instructions one can specify that certain expressions should be evaluated, and even that certain code should be executed, under various conditions. The instrumentation package can hence be seen as implementing Aspect Oriented Programming for Java, in the sense that one can add functionality to a Java program without explicitly changing the code of the original program, but one rather writes an
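
    JPaX instruments Java byte code, but the underlying idea, logging selected events from a single execution trace driven by a small specification of what to report, is language-neutral. A minimal Python analogue using sys.settrace (an illustration, not JPaX itself; the watched function name is invented):

        import sys

        WATCHED = {"transfer"}   # the "script": report calls to these functions

        def tracer(frame, event, arg):
            if event == "call" and frame.f_code.co_name in WATCHED:
                # On a 'call' event, f_locals holds the call's arguments.
                print("call:", frame.f_code.co_name, dict(frame.f_locals))
            return tracer

        def transfer(src, dst, amount):
            return amount

        sys.settrace(tracer)
        transfer("a", "b", 100)
        sys.settrace(None)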

  14. E-Block: A Tangible Programming Tool with Graphical Blocks

    Directory of Open Access Journals (Sweden)

    Danli Wang

    2013-01-01

    This paper presents a tangible programming tool, E-Block, designed for children aged 5 to 9 to gain a preliminary understanding of programming by building blocks. With embedded artificial intelligence, the tool defines programming blocks with sensors as the input and enables children to write programs that complete tasks on the computer. The symbol on each programming block's surface helps children understand the block's function. The sequence information is transferred to the computer by microcomputers and then translated into semantic information. The system applies wireless and infrared technologies and provides users with feedback both on screen and on the programming blocks. Preliminary user studies using observation and user-interview methods are reported for E-Block's prototype. The test results show that E-Block is attractive to children and easy to learn and use. The project also highlights the potential advantages of using single chip microcomputer (SCM) technology to develop tangible programming tools for children.

  15. Mastering C pointers tools for programming power

    CERN Document Server

    Traister, Robert J

    2014-01-01

    If you don't fully understand C pointers and how they are used, you're not getting the most out of C programming. This book features complete coverage of using and controlling C language pointers to make C applications more powerful and expressive. This new edition is completely updated and revised to reflect the changes brought about by the full adoption of ANSI C. All discussions and program examples have been updated, and reading materials necessary for any modern ANSI C programmer have also been added. Includes one 3 1/2" disk containing all of the working programs and m

  16. Simplified building energy analysis tool for architects

    Science.gov (United States)

    Chaisuparasmikul, Pongsak

    Energy Modeler is an energy software program designed to study the relative change of energy uses (heating, cooling, and lighting loads) in different architectural design schemes. This research focuses on developing a tool to improve the energy efficiency of the built environment, and studied the impact of different architectural design responses in two distinct global climates: the temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects to perceive, quickly visualize, and compare the energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings are probably the best indicators of progress toward environmental sustainability, by reducing the depletion of the world's fossil fuels (oil, natural gas, coal, etc.). When architects set about designing an environmentally responsive building for an owner or the public, they often lack the energy-based information and design tools to tell them whether the building loads and energy consumption are responsive to the modifications they have made. Buildings are dynamic in nature and changeable over time, with many design variables involved, so architects need energy-based rules and tools to assist them in the design process. Energy-efficient design for sustainable solutions requires attention throughout the design process and is closely tied to the architectural solution. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative work is to make these tools

  17. Environmental Program Management Tools for Federal Facilities

    Science.gov (United States)

    2010-06-17

    Reporting • FEDRPTS reporting tool – Helping Federal Agencies manage their environmental inventories and comply with environmental reporting... management systems (EMS) at all appropriate organizational levels to ensure: • use of EMS as the primary management approach for addressing

  18. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    /steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled "Inclusion Analysis to Predict Casting Behavior" was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual-use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

  19. VALIDERING AV VERKTYGET "ENTERPRISE ARCHITECTURE ANALYSIS TOOL"

    OpenAIRE

    Österlind, Magnus

    2011-01-01

    The Enterprise Architecture Analysis Tool, EAAT, is a software tool developed by the department of Industrial Information- and Control systems, ICS, at the Royal Institute of Technology, Stockholm, Sweden. EAAT is a modeling tool that combines Enterprise Architecture (EA) modeling with probabilistic relational modeling. Therefore EAAT makes it possible to design, describe and analyze the organizational structure, business processes, information systems and infrastructure within an enterprise....

  20. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
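
    The quantity NESSUS computes, a probability of failure, can be illustrated with plain Monte Carlo sampling of a toy limit state. NESSUS itself uses far more efficient schemes (advanced mean value, adaptive importance sampling); the distributions below are invented.

        import random

        def prob_of_failure(n=100_000):
            failures = 0
            for _ in range(n):
                load = random.gauss(100.0, 15.0)      # applied load (hypothetical)
                strength = random.gauss(150.0, 10.0)  # material strength (hypothetical)
                if load > strength:                   # limit state g = strength - load < 0
                    failures += 1
            return failures / n

        print(prob_of_failure())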

  1. Multi-agent programming languages, tools and applications

    CERN Document Server

    Seghrouchni, Amal El Fallah; Dastani, Mehdi; Bordini, Rafael H

    2009-01-01

    Multi-Agent Systems are a promising technology to develop the next generation open distributed complex software systems. This title presents a number of mature and influential multi-agent programming languages, platforms, development tools and methodologies, and realistic applications.

  2. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  3. 2010 Solar Market Transformation Analysis and Tools

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  4. Software tools to aid Pascal and Ada program design

    Energy Technology Data Exchange (ETDEWEB)

    Jankowitz, H.T.

    1987-01-01

    This thesis describes a software tool which analyses the style and structure of Pascal and Ada programs by ensuring that some minimum design requirements are fulfilled. The tool is used in much the same way as a compiler is used to teach students the syntax of a language, only in this case issues related to the design and structure of the program are of paramount importance. The tool operates by analyzing the design and structure of a syntactically correct program and automatically generating a report detailing changes that need to be made to ensure that the program is structurally sound. The author discusses how the model gradually evolved from a plagiarism detection system, which extracted several measurable characteristics of a program, into a model that analyzed the style of Pascal programs. In order to incorporate more sophisticated concepts like data abstraction, information hiding and data protection, this model was then extended to analyze the composition of Ada programs. The Ada model takes full advantage of facilities offered in the language, and by using this tool the standard and quality of written programs is raised whilst the fundamental principles of program design are grasped through a process of self-tuition.

  5. Chemical exchange program analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Waffelaert, Pascale

    2007-09-01

    As part of its EMS, Sandia performs an annual environmental aspects/impacts analysis. The purpose of this analysis is to identify the environmental aspects associated with Sandia's activities, products, and services and the potential environmental impacts associated with those aspects. Division and environmental programs established objectives and targets based on the environmental aspects associated with their operations. In 2007 the most significant aspect identified was Hazardous Materials (Use and Storage). The objective for Hazardous Materials (Use and Storage) was to improve chemical handling, storage, and on-site movement of hazardous materials. One of the targets supporting this objective was to develop an effective chemical exchange program, making a business case for it in FY07, and fully implementing a comprehensive chemical exchange program in FY08. A Chemical Exchange Program (CEP) team was formed to implement this target. The team consists of representatives from the Chemical Information System (CIS), Pollution Prevention (P2), the HWMF, Procurement, and the Environmental Management System (EMS). The CEP team performed benchmarking and conducted a life-cycle analysis of the current management of chemicals at SNL/NM and compared it to chemical exchange alternatives. Those alternatives are as follows: (1) revive the "Virtual" Chemical Exchange Program; (2) re-implement a "Physical" Chemical Exchange Program using a Chemical Information System; and (3) transition to a Chemical Management Services system. The analysis and benchmarking study shows that the present management of chemicals at SNL/NM is significantly disjointed and that a life-cycle or "cradle-to-grave" approach to chemical management is needed. This approach must consider the purchasing and maintenance costs as well as the cost of ultimate disposal of the chemicals and materials. A chemical exchange is needed as a mechanism to re-apply chemicals on site. This

  6. Towards harnessing theories through tool support for hard real-time Java programming

    DEFF Research Database (Denmark)

    Bøgholm, Thomas; Frost, Christian; Hansen, Rene Rydhof

    2013-01-01

    We present a rationale for a selection of tools that assist developers of hard real-time applications to verify that programs conform to a Java real-time profile and that platform-specific resource constraints are satisfied. These tools are specialised instances of more generic static analysis an...

  7. Towards harnessing theories through tool support for hard real-time Java programming

    DEFF Research Database (Denmark)

    Søndergaard, Hans; Bøgholm, Thomas; Frost, Christian

    2012-01-01

    We present a rationale for a selection of tools that assist developers of hard real-time applications to verify that programs conform to a Java real-time profile and that platform-specific resource constraints are satisfied. These tools are specialised instances of more generic static analysis an...

  8. CMS AS A WEB PROGRAMMING LEARNING TOOL

    Directory of Open Access Journals (Sweden)

    Zoran T Lovreković

    2015-10-01

    This paper discusses how some Knowledge Management postulates can be applied in training students in Web programming. A CMS is the most vital part of every web application. Because of that, it is suggested that the professor give students detailed lessons on a CMS that is as simple as possible, and then allow them to change, upgrade, and improve this CMS in several steps through their own work and through consultation and discussion with the teacher and other students. Some possible requirements for the students' work are given in this paper as well.

  9. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis allowing the analysis of different corpora.

  10. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work...... is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  11. Software Construction and Analysis Tools for Future Space Missions

    Science.gov (United States)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  12. SURE reliability analysis: Program and mathematics

    Science.gov (United States)

    Butler, Ricky W.; White, Allan L.

    1988-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
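
    The sensitivity-analysis feature described above, sweeping one model parameter over a range and recomputing the failure probability, can be illustrated with a toy single-fault model, P(fail by T) = 1 - exp(-lambda * T). This stands in for, and is much simpler than, SURE's semi-Markov bounds.

        import math

        T = 10.0                          # mission time, hours (hypothetical)
        for lam in [1e-5, 1e-4, 1e-3]:    # failure rate per hour, swept as a variable
            print(lam, 1.0 - math.exp(-lam * T))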

  13. Tool for Validation Software Projects in Programming Labs

    Directory of Open Access Journals (Sweden)

    Antonio J. Sierra

    2012-04-01

    This work presents a testing tool used in the Fundamentals of Programming II laboratory of the Telecommunication Technologies Engineering Degree at the University of Sevilla to check student projects. The tool allows students to test the proper operation of their projects autonomously. It is a flexible and useful tool for testing because it identifies whether the student has produced a project that meets the given specifications. This leads to a high rate of success when students deliver their projects.

  14. Assessment Tool Development for Extracurricular Smet Programs for Girls

    Science.gov (United States)

    House, Jody; Johnson, Molly; Borthwick, Geoffrey

    Many different programs have been designed to increase girls' interest in and exposure to science, mathematics, engineering, and technology (SMET). Two of these programs are discussed and contrasted in the dimensions of length, level of science content, pedagogical approach, degree of self- vs. parent-selected participants, and amount of community-building content. Two different evaluation tools were used. For one program, a modified version of the University of Pittsburgh's undergraduate engineering attitude assessment survey was used. Program participants' responses were compared to those from a fifth grade, mixed-sex science class. The only gender difference found was in the area of parental encouragement. The girls in the special class were more encouraged to participate in SMET areas. For the second program, a new age-appropriate tool developed specifically for these types of programs was used, and the tool itself was evaluated. The results indicate that the new tool has construct validity. On the basis of these preliminary results, a long-term plan for the continued development of the assessment tool is outlined.

  15. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, a laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and its associated software work well under certain conditions, but in order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes whose locations relative to each other are unknown, a procedure to merge the data from each cube is as follows: 1. Mark a reference point on cube 1, then mark points on cube 2 as unknowns; this locates cube 2 in cube 1's coordinate system. 2. Mark reference points on cube 2, then mark points on cube 1 as unknowns; this locates cube 1 in cube 2's coordinate system. 3. Continue this procedure for all combinations of cubes. 4. Merge all of the found coordinate systems into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used; when measuring the location of objects relative to a global coordinate system, the merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration for objects near a cube.
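
    Step 1 of the merging procedure amounts to estimating cube 2's pose in cube 1's frame as a rigid transform (R, t); once known, any point expressed in cube 2's coordinates maps into the global frame as R p + t. A minimal numpy sketch with invented values:

        import numpy as np

        R = np.array([[0.0, -1.0, 0.0],   # 90-degree rotation about z (hypothetical pose)
                      [1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0]])
        t = np.array([2.0, 0.5, 0.0])     # cube 2 origin in cube 1 coordinates

        def to_global(p_cube2):
            """Map a point from cube 2's frame into the merged (cube 1) frame."""
            return R @ np.asarray(p_cube2) + t

        print(to_global([1.0, 0.0, 0.0]))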

  16. Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool

    Science.gov (United States)

    Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.

    2011-01-01

    This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…

  17. Chemometric Tools in Environmental Data Analysis

    Science.gov (United States)

    Reczynski, Witold; Jakubowska, Malgorzata; Bas, Boguslaw; Niewiara, Ewa; Kubiak, Władysław W.

    2007-12-01

    Chemometric analysis of the Dobczyce Reservoir sediments is presented herein. The sediments have been sampled at 17 points, dried, digested and quantitatively analyzed for 12 elements, covering a three-year period (2004-2006). Substantial variations in composition due to the hydrological and environmental reasons were found. The use of mathematical and statistical tools enables objective and effective analysis of obtained data.

  18. SNAP - Program for Symbolic Analysis

    Directory of Open Access Journals (Sweden)

    Z. Kolka

    1999-04-01

    The paper deals with SNAP, a program for the symbolic analysis of linear circuits in the frequency domain. The program is suitable for the analysis of circuits with ideal network elements, for exploring the basic principles of their operation. Besides graphical presentation, the analysis results can be exported to popular mathematical programs for further processing. Currently, an algorithm for exact symbolic analysis is implemented; the program is therefore suitable for relatively small circuits.

  19. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  20. Planetary Protection Bioburden Analysis Program

    Science.gov (United States)

    Beaudet, Robert A.

    2013-01-01

    This program is a Microsoft Access program that performs statistical analysis of the colony counts from assays performed on the Mars Science Laboratory (MSL) spacecraft to determine the bioburden density, 3-sigma biodensity, and total bioburdens required for the MSL prelaunch reports. It also contains numerous tools that report the data in various ways to simplify the required reports. The program performs all the calculations directly in MS Access; prior to this development, the data was exported to large Excel files that had to be cut and pasted to produce the desired results. The program contains a main menu and a number of submenus. Analyses can be performed using either all the assays or only the accountable assays that will be used in the final analysis. There are three options on the first menu: calculate using (1) the old MER (Mars Exploration Rover) statistics, (2) the MSL statistics for all the assays, or

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such tools are unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions, and accuracy in predicting experimental impact test results. The software
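
    The two headline quantities in the bioburden abstract above, mean bioburden density and a 3-sigma value, can be sketched from per-assay colony counts. The counts and areas below are hypothetical, and the real program's corrections (e.g., recovery efficiency) are not shown.

        import statistics

        counts = [12, 7, 15, 9, 11]   # colonies per assay (hypothetical)
        area_m2 = 0.1                 # sampled area per assay, square meters

        densities = [c / area_m2 for c in counts]
        mean_d = statistics.mean(densities)
        print(mean_d, mean_d + 3 * statistics.stdev(densities))  # mean and 3-sigma value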

  1. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  2. Strengthening Chronic Disease Prevention Programming: the Toward Evidence-Informed Practice (TEIP) Program Assessment Tool

    Science.gov (United States)

    Albert, Dayna; Fortin, Rebecca; Lessio, Anne; Herrera, Christine; Hanning, Rhona; Rush, Brian

    2013-01-01

    Best practices identified solely on the strength of research evidence may not be entirely relevant or practical for use in community-based public health and the practice of chronic disease prevention. Aiming to bridge the gap between best practices literature and local knowledge and expertise, the Ontario Public Health Association, through the Toward Evidence-Informed Practice initiative, developed a set of resources to strengthen evidence-informed decision making in chronic disease prevention programs. A Program Assessment Tool, described in this article, emphasizes better processes by incorporating review criteria into the program planning and implementation process. In a companion paper, “Strengthening Chronic Disease Prevention Programming: The Toward Evidence-Informed Practice (TEIP) Program Evidence Tool,” we describe another tool, which emphasizes better evidence by providing guidelines and worksheets to identify, synthesize, and incorporate evidence from a range of sources (eg, peer-reviewed literature, gray literature, local expertise) to strengthen local programs. The Program Assessment Tool uses 19 criteria derived from literature on best and promising practices to assess and strengthen program planning and implementation. We describe the benefits, strengths, and challenges in implementing the tool in 22 community-based chronic disease prevention projects in Ontario, Canada. The Program Assessment Tool helps put best processes into operation to complement adoption and adaptation of evidence-informed practices for chronic disease prevention. PMID:23721789

  3. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  4. Using shell tools in Mesolithic and early Neolithic coastal sites from Northern Spain: experimental program for use wear analysis in malacological materials

    Directory of Open Access Journals (Sweden)

    Cuenca Solana, David

    2010-06-01

    Full Text Available One of the most common debates surrounding the Mesolithic and early Neolithic periods in northern Spain focuses on the scarcity of lithic and osseous technologies identified in large shell midden contexts. Currently, several hypotheses have been proposed that attribute this phenomenon to differences in site spatial organization, increases in perishable material use, or changes in subsistence strategies. However, recently shell tools have been identified in the early Neolithic levels at Santimamiñe cave located in the Basque Country of northern Spain. These artifacts are the first evidence of shell tools to be identified in Northern Spain in an early Neolithic shell midden context. This paper proposes the hypothesis that shell tools were being used in subsistence activities. To test this hypothesis, the authors developed an experimental programme using different types of mollusc shells to examine evidence of functional use on wood, dry/fresh animal skin and non-woody plants. The experimental results were then used to examine the patterns of use on the seven shell tools from Santimamiñe. The results of the comparisons indicate that the seven shell tools have similar use patterns as the experimental shells. This evidence supports the proposed hypothesis that shell tools may have been used frequently in shell midden contexts during the Mesolithic and early Neolithic for the working of wood, plants or animal skin.

    One of the most widespread debates in the historiography of the Mesolithic and early Neolithic of the Cantabrian region concerns the scarcity of "traditional" technologies in most of the known contexts, especially in those with large shell accumulations. Several of the hypotheses currently proposed attribute this phenomenon to differences in the spatial organization of the settlements, to an increase in the use of perishable materials, or to changes in subsistence strategies...

  5. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
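
    As a hedged illustration of how such services are typically called from a script: the endpoint paths and parameter names below follow the EBI REST documentation as best recalled, and should be checked against the current docs linked above before use.

```python
import time
import requests

# Assumed base URL for the NCBI BLAST REST service at EMBL-EBI; verify
# against http://www.ebi.ac.uk/Tools/webservices/ before relying on it.
BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

def blast(sequence: str, email: str) -> str:
    """Submit a protein BLAST job and return the plain-text result.
    Parameter names (program, stype, database, ...) are assumptions."""
    job = requests.post(f"{BASE}/run", data={
        "email": email, "program": "blastp", "stype": "protein",
        "database": "uniprotkb_swissprot", "sequence": sequence,
    })
    job.raise_for_status()
    job_id = job.text
    while requests.get(f"{BASE}/status/{job_id}").text == "RUNNING":
        time.sleep(5)                      # poll politely
    return requests.get(f"{BASE}/result/{job_id}/out").text

# print(blast("MKT...", "you@example.org"))  # supply a real sequence and email
```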

  6. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  7. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
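
    The multiple-branch event-tree architecture mentioned above lends itself to a compact sketch. The tree below is entirely invented (the EXPLORIS trees and their probabilities are not given in the abstract); it only illustrates how path probabilities multiply down the branches:

```python
# Hypothetical three-level eruption event tree; all numbers are invented.
tree = {
    "unrest": (0.3, {
        "magmatic": (0.4, {"eruption": (0.5, {}), "no eruption": (0.5, {})}),
        "hydrothermal": (0.6, {"eruption": (0.1, {}), "no eruption": (0.9, {})}),
    }),
    "no unrest": (0.7, {}),
}

def leaf_probabilities(node, p=1.0, path=()):
    """Walk the tree, multiplying branch probabilities down each path."""
    for name, (prob, children) in node.items():
        if children:
            yield from leaf_probabilities(children, p * prob, path + (name,))
        else:
            yield path + (name,), p * prob

for path, prob in leaf_probabilities(tree):
    print(" -> ".join(path), f"{prob:.3f}")
```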

  8. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    Science.gov (United States)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on the comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, ground-based remotely sensed air quality, the air quality/public health model, and the decision-making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite and ground-based remotely sensed data is needed to enhance the models and tools used by policy makers for the protection of national and global public health communities.

  9. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  10. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  11. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel...... and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data....

  12. Algorithmic Bricks: A Tangible Robot Programming Tool for Elementary School Students

    Science.gov (United States)

    Kwon, D.-Y.; Kim, H.-S.; Shim, J.-K.; Lee, W.-G.

    2012-01-01

    Tangible programming tools enable children to easily learn the programming process, previously considered to be difficult for them. While various tangible programming tools have been developed, there is still a lack of available tools to help students experience the general programming process. This study therefore developed a tool called…

  13. Drawing tool recognition by stroke ending analysis

    Science.gov (United States)

    Vill, Maria C.; Sablatnig, Robert

    2008-02-01

    The aim of our work is the development of image analysis tools and methods for the investigation of drawings and drawn drafts, in order to establish authorship, to identify copies, or, more generally, to allow for a comparison of different types of drawings. It was and is common for artists to draw their design as several drafts on paper. These drawings can show how some elements were adjusted until the artist was satisfied with the composition, and can therefore bring insights into the practice of artists and of painting and/or drawing schools. This information is useful for art historians because it can relate artists to each other. The goal of this paper is to describe a stroke classification algorithm which can recognize the drawing tool based on the shape of the endings of an open stroke. In this context, "open" means that both endings of a stroke are free-standing, uncovered, and do not pass into another stroke. These endings are prominent features whose shape carries information about the drawing tool, and they are therefore used as features to distinguish different drawing tools. Our results show that it is possible to use these endings as input to a drawing tool classifier.

  14. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    Science.gov (United States)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo- Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  15. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  16. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...... and the value of associated rewards in states of interest for a real-world example from a case company in the Danish baked goods industry. The developments are presented in a generalised fashion to make them relevant to the general problem of implementing quantitative probabilistic model checking of graph...

  17. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot; a tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics, or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters, and the model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. A smaller amount of material can be removed in controlled areas of a three-dimensional workpiece.
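
    The abstract reports a fitted removal model without giving its form. A Preston-type law (removal proportional to pressure, velocity, and dwell time) is the customary starting point for such models, so the sketch below assumes it; the coefficient value and all inputs are illustrative, not the paper's.

```python
def material_removed(k_p, pressure_pa, velocity_m_s, dwell_s):
    """Preston-type model: depth removed = k_p * p * v * t.
    k_p (the Preston coefficient) must be fitted per abrasive/material
    pair, as the abstract describes; the value below is made up."""
    return k_p * pressure_pa * velocity_m_s * dwell_s

depth_m = material_removed(k_p=1e-13, pressure_pa=2e4, velocity_m_s=1.5, dwell_s=10)
print(f"removed ~{depth_m * 1e6:.2f} um")
```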

  18. Integrating New Technologies and Existing Tools to Promote Programming Learning

    Directory of Open Access Journals (Sweden)

    Álvaro Santos

    2010-04-01

    Full Text Available In recent years, many tools have been proposed to reduce the programming learning difficulties felt by many students. Our group has contributed to this effort through the development of several tools, such as VIP, SICAS, OOP-Anim, SICAS-COL and H-SICAS. Even though we had some positive results, the utilization of these tools doesn't seem to significantly reduce weaker students' difficulties. These students need stronger support to motivate them to get engaged in learning activities, inside and outside the classroom. Nowadays, many technologies are available to create contexts that may help to accomplish this goal. We consider that a promising path goes through the integration of solutions. In this paper we analyze the features, strengths and weaknesses of the tools developed by our group. Based on these considerations we present a new environment, integrating different types of pedagogical approaches, resources, tools and technologies for programming learning support. With this environment, currently under development, it will be possible to review contents and lessons based on video and screen captures. The support for collaborative tasks is another key point, to improve and stimulate different models of teamwork. The platform will also allow the creation of various alternative models (learning objects) for the same subject, enabling personalized learning paths adapted to each student's knowledge level, needs and preferred learning styles. The learning sequences will work as a study organizer, following a suitable taxonomy, according to students' cognitive skills. Although the main goal of this environment is to support students with more difficulties, it will provide a set of resources supporting the learning of more advanced topics. Software engineering techniques and representations, object orientation and event programming are features that will be available in order to promote the learning progress of students.

  19. DYNAMIC PROGRAMMING – EFFICIENT TOOL FOR POWER SYSTEM EXPANSION PLANNING

    Directory of Open Access Journals (Sweden)

    SIMO A.

    2015-03-01

    Full Text Available The paper focuses on the use of dynamic programming for power system expansion planning (EP) – transmission network expansion planning (TNEP) and distribution network expansion planning (DNEP). The EP problem has been approached from both the retrospective and the prospective point of view. To achieve this goal, the authors have developed two software tools in the Matlab environment. Two techniques have been tackled: particle swarm optimization (PSO) and genetic algorithms (GA). The case study refers to the 25-bus test power system developed within the Power Systems Department.
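
    As a minimal, self-contained illustration of why dynamic programming fits staged expansion planning (all capacities, costs, and demands below are invented; the paper's Matlab tools and its 25-bus system are not reproduced here):

```python
from functools import lru_cache

# Toy data (not from the paper): demand per stage and candidate additions.
demand = [100, 130, 170, 220]                              # MW required at each stage
options = [(0, 0.0), (30, 25.0), (60, 45.0), (100, 70.0)]  # (MW added, cost)

@lru_cache(maxsize=None)
def best_cost(stage: int, capacity: int) -> float:
    """Minimum expansion cost from `stage` onward given installed capacity."""
    if stage == len(demand):
        return 0.0
    best = float("inf")
    for add, cost in options:
        new_cap = capacity + add
        if new_cap >= demand[stage]:               # feasibility at this stage
            best = min(best, cost + best_cost(stage + 1, new_cap))
    return best

print(best_cost(0, 80))   # cheapest plan starting from 80 MW installed
```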

  20. Designing a tool for curriculum leadership development in postgraduate programs

    Directory of Open Access Journals (Sweden)

    M Avizhgan

    2016-07-01

    Full Text Available Introduction: Leadership in the area of curriculum development is increasingly important as we look for ways to improve our programmes and practices. In curriculum studies, leadership has received little attention. Considering the lack of an evaluation tool with objective criteria for the curriculum leadership process in postgraduate programs, this study aimed to design such a tool and to determine its validity and reliability. Method: This is a methodological study. First, the domains and items of the tool were determined through expert interviews and a literature review. Then, using the Delphi technique, 54 criteria were developed. A panel of experts was used to confirm content and face validity. Reliability was assessed in a descriptive study of 30 faculty members from two universities in Isfahan and was estimated by internal consistency. The data were analyzed with SPSS software, using the Pearson correlation coefficient and reliability analysis. Results: Based on the definition of curriculum leadership, the domains and items of the tool were determined and a preliminary version of the tool was developed. Expert faculty members' views were used at the different stages of development and psychometric evaluation. The tool's internal consistency, measured by Cronbach's alpha, was 96.5%; this was also determined for each domain separately. Conclusion: Applying this instrument can improve the effectiveness of curriculum leadership. Identifying the characteristics of successful and effective leaders, and utilizing this knowledge in developing and implementing curricula, might help us respond better to the changing needs of our students, teachers and schools of tomorrow.
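
    For reference, internal consistency of a multi-item instrument is conventionally computed as Cronbach's alpha. A minimal sketch with fabricated ratings follows (the study's actual data are not available here; the 30 x 54 shape only mirrors the raters and criteria counts above):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents x items.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)                 # fabricated ratings
data = rng.integers(1, 6, size=(30, 54)).astype(float)
print(f"alpha = {cronbach_alpha(data):.3f}")
```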

  1. Strengthening Chronic Disease Prevention Programming: The Toward Evidence-Informed Practice (TEIP) Program Evidence Tool

    Science.gov (United States)

    Albert, Dayna; Fortin, Rebecca; Herrera, Christine; Hanning, Rhona; Lessio, Anne; Rush, Brian

    2013-01-01

    In public health and chronic disease prevention there is increasing priority for effective use of evidence in practice. In Ontario, Canada, despite various models being advanced, public health practitioners are seeking ways to identify and apply evidence in their work in practical and meaningful ways. In a companion article, “Strengthening Chronic Disease Prevention Programming: The Toward Evidence-Informed Practice (TEIP) Program Assessment Tool,” we describe use of a tool to assess and strengthen program planning and implementation processes using 19 criteria derived from best and promising practices literature. In this article, we describe use of a complementary Program Evidence Tool to identify, synthesize, and apply a range of evidence sources to strengthen the content of chronic disease prevention programming. The Program Evidence Tool adapts tools of evidence-based medicine to the unique contexts of community-based health promotion and chronic disease prevention. Knowledge management tools and a guided dialogue process known as an Evidence Forum enable community stakeholders to make appropriate use of evidence in diverse social, political, and structural contexts. Practical guidelines and worksheets direct users through 5 steps: 1) define an evidence question, 2) develop a search strategy, 3) collect and synthesize evidence, 4) interpret and adapt evidence, and 5) implement and evaluate. We describe the Program Evidence Tool’s benefits, strengths, challenges, and what was learned from its application in 4 Ontario public health departments. The Program Evidence Tool contributes to the development and understanding of the complex use of evidence in community-based chronic disease prevention. PMID:23721788

  2. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  3. Development of numerical modelling of analysis program for energy ...

    Indian Academy of Sciences (India)

    ... is used as a design tool by engineers for the structural design of buildings, bridges, factories, and industrial and public works, and ... the dynamic analysis program needs to be developed for energy design in accordance with ...

  4. TESMA: Requirements and Design of a Tool for Educational Programs

    Directory of Open Access Journals (Sweden)

    Nicolas Guelfi

    2017-03-01

    Full Text Available Defining and managing teaching programs at universities or other institutions is a complex task for which there is not much support in terms of methods and tools. This task becomes even more critical when the time comes to obtain certifications w.r.t. official standards. In this paper, we present an on-going project called TESMA, whose objective is to provide an open-source tool dedicated to the specification and management (including certification) of teaching programs. An in-depth market analysis of related tools and of the conceptual frameworks of the project is presented. This tool has been engineered using a development method called Messir for its requirements elicitation and introduces a domain-specific language dedicated to the teaching domain. This paper presents the current status of this project and the future activities planned.

  5. Owning the program technical baseline for future space systems acquisition: program technical baseline tracking tool

    Science.gov (United States)

    Nguyen, Tien M.; Guillen, Andy T.; Hant, James J.; Kizer, Justin R.; Min, Inki A.; Siedlak, Dennis J. L.; Yoh, James

    2017-05-01

    The U.S. Air Force (USAF) has recognized the need for owning the program and technical knowledge within the Air Force concerning the systems being acquired to ensure success. This paper extends the previous work done by the authors [1-2] on the "Resilient Program Technical Baseline Framework for Future Space Systems" and "Portfolio Decision Support Tool (PDST)" to the development and implementation of the Program and Technical Baseline (PTB) Tracking Tool (PTBTL) for the DOD acquisition life cycle. The paper describes the "simplified" PTB tracking model with a focus on the preaward phases and discusses how to implement this model in PDST.

  6. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  7. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
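
    A minimal sketch of one LCAT-style analysis, a least-squares trend fit on a local temperature series; the data below are synthetic, and LCAT's actual implementation is not described in this abstract:

```python
import numpy as np

def linear_trend(years, values):
    """Ordinary least-squares trend of the kind LCAT exposes.
    Returns (slope in units per year, intercept)."""
    slope, intercept = np.polyfit(years, values, deg=1)
    return slope, intercept

years = np.arange(1991, 2021)   # made-up 30-year station record
temps = 12.0 + 0.02 * (years - 1991) \
        + np.random.default_rng(1).normal(0, 0.3, years.size)
slope, _ = linear_trend(years, temps)
print(f"trend: {slope:+.3f} deg C / year")
```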

  8. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte Carlo simulations (MC). Several tools are considered: a ROOT file filter, a JavaScript multivariable cross-filter, a JavaScript ROOT browser, and JavaScript scatter-matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  9. Maternal influenza immunization in Malawi: Piloting a maternal influenza immunization program costing tool by examining a prospective program.

    Directory of Open Access Journals (Sweden)

    Clint Pecenka

    Full Text Available This costing study in Malawi is a first evaluation of a Maternal Influenza Immunization Program Costing Tool (Costing Tool) for maternal immunization. The tool was designed to help low- and middle-income countries plan for maternal influenza immunization programs, which differ from infant vaccination programs because of differences in the target population and potential differences in delivery strategy or venue. This analysis examines the incremental costs of a prospective seasonal maternal influenza immunization program that is added to a successful routine childhood immunization and antenatal care program. The Costing Tool estimates financial and economic costs for different vaccine delivery scenarios for each of the major components of the expanded immunization program. In our base scenario, which specifies a donated single-dose pre-filled vaccine formulation, the total financial cost of a program that would reach 2.3 million women is approximately $1.2 million over five years. The economic cost of the program, including the donated vaccine, is $10.4 million over the same period. The financial and economic costs per immunized pregnancy are $0.52 and $4.58, respectively. Other scenarios examine lower vaccine uptake, reaching 1.2 million women, and a vaccine purchased at $2.80 per dose with an alternative presentation. This study estimates the financial and economic costs associated with a prospective maternal influenza immunization program in a low-income country. In some scenarios, the incremental delivery cost of a maternal influenza immunization program may be as low as some estimates of childhood vaccination programs, assuming the routine childhood immunization and antenatal care systems are capable of serving as the platform for an additional vaccination program. However, purchasing influenza vaccines at the prices assumed in this analysis, instead of having them donated, is likely to be challenging for lower-income countries. This result...
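
    The headline unit costs follow from dividing the program totals by the number of women reached; the sketch below reproduces them approximately (the abstract quotes $4.58, so its exact denominator or rounding evidently differs slightly from this naive division):

```python
# Reproducing the unit costs from the abstract by simple division;
# the report's own cost breakdown and rounding will differ slightly.
financial_cost = 1.2e6      # USD over five years
economic_cost = 10.4e6      # USD over five years, including donated vaccine
women_reached = 2.3e6

print(f"financial cost per immunized pregnancy: ${financial_cost / women_reached:.2f}")
# 10.4 / 2.3 gives $4.52; the abstract reports $4.58.
print(f"economic cost per immunized pregnancy:  ${economic_cost / women_reached:.2f}")
```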

  10. Field Assessment of Energy Audit Tools for Retrofit Programs

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, J.; Bohac, D.; Nelson, C.; Smith, I.

    2013-07-01

    This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home's asset performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Home rating systems can help motivate homeowners in several ways. Ratings can clearly communicate a home's achievable energy efficiency potential, provide a quantitative assessment of energy savings after retrofits are completed, and show homeowners how they rate compared to their neighbors, thus creating an incentive to conform to a social standard. An important consideration is how rating tools for the retrofit market will integrate with existing home energy service programs. For residential programs that target energy savings only, home visits should be focused on key efficiency measures for that home. In order to gain wide adoption, a rating tool must be easily integrated into the field process, demonstrate consistency and reasonable accuracy to earn the trust of home energy technicians, and have a low monetary cost and time hurdle for homeowners. Along with the Home Energy Score, this project also evaluated the energy modeling performance of SIMPLE and REM/Rate.

  11. Energy Analysis Program 1990 annual report

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, "Energy Efficiency, Developing Countries, and Eastern Europe," part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting integrated utility resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  12. Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)

    Energy Technology Data Exchange (ETDEWEB)

    Hogden, John Edward [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Unal, Cetin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-06

    A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has been continued. In this paper we describe enhancements to the program as of 2014.

  13. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research that identified the problem: companies still struggle with long setup times, and many of them do nothing to decrease these times, because a long setup alone is not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyze changeovers in order to discover problems. The proposed methodology can genuinely encourage management to decide on SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern setup analysis in a chosen area of a company, such as a workstand that is a bottleneck with many setups; there, the goal is to convince management to begin actions concerning setup improvement. The last three steps relate to a specific setup, where the goal is to reduce the setup time and the risk of problems that can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis, FMEA and others were used.

  14. GAP Analysis Program (GAP)

    Data.gov (United States)

    Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...

  15. Medicare Part D Program Analysis

    Data.gov (United States)

    U.S. Department of Health & Human Services — This page contains information on Part D program analysis performed by CMS. These reports will also be used to better identify, evaluate and measure the effects of...

  16. Social network analysis for program implementation.

    Science.gov (United States)

    Valente, Thomas W; Palinkas, Lawrence A; Czaja, Sara; Chu, Kar-Hai; Brown, C Hendricks

    2015-01-01

    This paper introduces the use of social network analysis theory and tools for implementation research. The social network perspective is useful for understanding, monitoring, influencing, or evaluating the implementation process when programs, policies, practices, or principles are designed and scaled up or adapted to different settings. We briefly describe common barriers to implementation success and relate them to the social networks of implementation stakeholders. We introduce a few simple measures commonly used in social network analysis and discuss how these measures can be used in program implementation. Using the four stage model of program implementation (exploration, adoption, implementation, and sustainment) proposed by Aarons and colleagues [1] and our experience in developing multi-sector partnerships involving community leaders, organizations, practitioners, and researchers, we show how network measures can be used at each stage to monitor, intervene, and improve the implementation process. Examples are provided to illustrate these concepts. We conclude with expected benefits and challenges associated with this approach.
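
    A minimal sketch of the kind of simple network measures the paper introduces, computed with the third-party networkx library on an invented stakeholder network (the node names and ties are illustrative, not from the paper):

```python
import networkx as nx  # assumes networkx is installed

# Toy stakeholder network: who communicates with whom during implementation.
g = nx.Graph([("researcher", "director"), ("director", "clinician"),
              ("clinician", "community"), ("director", "community")])

# Two common measures: degree centrality flags well-connected stakeholders;
# betweenness centrality flags those who bridge otherwise separate groups.
print(nx.degree_centrality(g))
print(nx.betweenness_centrality(g))
```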

  17. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA); this is an area of constant intraplate seismicity and non-orogenic active tectonics, and it exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.
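
    A crude sketch of knickpoint detection as a slope-break search along a single longitudinal profile; the real Knickpoint Finder runs on ArcGIS with full DEM drainage extraction, so the function, threshold, and synthetic profile below are assumptions for illustration only:

```python
import numpy as np

def find_knickpoints(distance_m, elevation_m, slope_jump=0.01):
    """Flag profile points where the downstream gradient changes abruptly.
    A crude stand-in for the Hack (1973)-style analysis the paper automates."""
    slopes = np.diff(elevation_m) / np.diff(distance_m)
    jumps = np.abs(np.diff(slopes))
    return np.where(jumps > slope_jump)[0] + 1    # indices into the profile

dist = np.linspace(0, 10_000, 101)                 # synthetic 10 km profile
elev = np.where(dist < 4000, 500 - 0.01 * dist, 460 - 0.05 * (dist - 4000))
print(find_knickpoints(dist, elev))                # flags the break near 4 km
```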

  18. XML Graphs in Program Analysis

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael I.

    2011-01-01

    XML graphs have shown to be a simple and effective formalism for representing sets of XML documents in program analysis. It has evolved through a six year period with variants tailored for a range of applications. We present a unified definition, outline the key properties including validation...... of XML graphs against different XML schema languages, and provide a software package that enables others to make use of these ideas. We also survey the use of XML graphs for program analysis with four very different languages: XACT (XML in Java), Java Servlets (Web application programming), XSugar...... (transformations between XML and non-XML data), and XSLT (stylesheets for transforming XML documents)....

  19. Using Organizational Assessment as a Tool for Program Change

    Science.gov (United States)

    Courtney, Katherine Ortega; Joe, George W.; Rowan-Szal, Grace A.; Simpson, D. Dwayne

    2007-01-01

    Organizational functioning within substance abuse treatment organizations is important to the transfer of research innovations into practice. Programs should be performing well for new interventions to be implemented successfully. The present study examined characteristics of treatment programs that participated in an assessment and training workshop designed to improve organizational functioning. The workshop was attended by directors and clinical supervisors from 53 community-based treatment units in a single state in the Southwest. Logistic regression analysis was used to examine attributes related to program-level decisions to engage in a structured process for making organizational changes. Findings showed that programs with higher needs and pressures, and those with more limited institutional resources, and poorer ratings of staff attributes and organizational climate were most likely to engage in a change strategy. Furthermore, organizations with greater staff consensus (i.e., smaller standard deviations) on ratings of organizational climate also were more likely to engage in change. PMID:17433861

  1. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
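
    A minimal sketch of the attack-graph idea using networkx; the node names, edge weights, and the shortest-path criterion below are invented stand-ins for the patent's effort metrics and "epsilon optimal paths":

```python
import networkx as nx

# Toy attack graph: nodes are attacker states, edge weights are "effort";
# lower-total-effort paths correspond to higher-risk attack routes.
g = nx.DiGraph()
g.add_weighted_edges_from([
    ("outside", "dmz", 2.0), ("dmz", "web server", 1.0),
    ("outside", "vpn", 4.0), ("vpn", "file server", 1.0),
    ("web server", "file server", 3.0), ("file server", "domain admin", 2.0),
])

path = nx.shortest_path(g, "outside", "domain admin", weight="weight")
cost = nx.shortest_path_length(g, "outside", "domain admin", weight="weight")
print(" -> ".join(path), f"(total effort {cost})")   # the highest-risk path
```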

  2. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes grinding components accuracy and metrology shear stress in cutting cutting temperature and analysis chatter They also address non-traditional machining, such as: electrical discharge machining electrochemical machining laser and electron beam machining A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  3. Wavelet bicoherence: A new turbulence analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    van Milligen, B.P.; Sanchez, E.; Estrada, T.; Hidalgo, C.; Branas, B. [Asociacion EURATOM-CIEMAT, Madrid (Spain); Carreras, B. [Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831 (United States); Garcia, L. [Universidad Carlos III, Madrid (Spain)

    1995-08-01

    A recently introduced tool for the analysis of turbulence, wavelet bicoherence [van Milligen, Hidalgo, and Sanchez, Phys. Rev. Lett. 16, 395 (1995)], is investigated. It is capable of detecting phase coupling---nonlinear interactions of the lowest (quadratic) order---with time resolution. To demonstrate its potential, it is applied to numerical models of chaos and turbulence and to real measurements. It detected the coupling interaction between two coupled van der Pol oscillators. When applied to a model of drift wave turbulence relevant to plasma physics, it detected a highly localized coherent structure. Analyzing reflectometry measurements made in fusion plasmas, it detected temporal intermittency and a strong increase in nonlinear phase coupling coinciding with the L/H (low-to-high confinement mode) transition. © 1995 American Institute of Physics.
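
    For reference, the squared wavelet bicoherence central to this method is usually defined as below. This statement is taken from the general literature on the technique and has not been checked against the paper itself; W(a, tau) denotes the wavelet transform at scale a and time tau.

```latex
% Squared wavelet bicoherence; the scale sum rule mirrors the
% frequency matching condition f = f_1 + f_2.
\[
  b^2(a_1, a_2) =
    \frac{\left| \int W(a_1,\tau)\, W(a_2,\tau)\, W^{*}(a,\tau)\, d\tau \right|^{2}}
         {\int \left| W(a_1,\tau)\, W(a_2,\tau) \right|^{2} d\tau
          \;\int \left| W(a,\tau) \right|^{2} d\tau},
  \qquad \frac{1}{a} = \frac{1}{a_1} + \frac{1}{a_2}.
\]
```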

  4. Using the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT's application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk combines existing presentations and material: the system user guide and technical documentation; a GMAT basics and overview; and technical presentations from the TESS project on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open-source training material. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.

  5. Goal Programming: A New Tool for the Christmas Tree Industry

    Science.gov (United States)

    Bruce G. Hansen

    1977-01-01

    Goal programming (GP) can be useful for decision making in the natural Christmas tree industry. Its usefulness is demonstrated through an analysis of a hypothetical problem in which two potential growers decide how to use 10 acres for growing Christmas trees. Though the physical settings are identical, distinct differences between their goals significantly influence the...
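
    To make the GP mechanics concrete, here is a tiny two-goal formulation in the standard deviation-variable style, solved with scipy; the acreages, coefficients, and goal targets are invented for illustration and are not Hansen's numbers:

```python
from scipy.optimize import linprog

# Toy two-goal Christmas-tree example (all numbers invented).
# Variables: x1, x2 acres of two species; n/p pairs are under/over deviations.
#   profit goal:  5*x1 + 4*x2 + n1 - p1 = 40   (want >= 40, so penalize n1)
#   labor  goal:  2*x1 + 3*x2 + n2 - p2 = 12   (want <= 12, so penalize p2)
c = [0, 0, 1, 0, 0, 1]                    # minimize n1 + p2
A_eq = [[5, 4, 1, -1, 0, 0],
        [2, 3, 0, 0, 1, -1]]
b_eq = [40, 12]
A_ub = [[1, 1, 0, 0, 0, 0]]               # only 10 acres available
b_ub = [10]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
x1, x2, n1, p1, n2, p2 = res.x
print(f"plant {x1:.1f} + {x2:.1f} acres; "
      f"profit shortfall {n1:.1f}, labor overrun {p2:.1f}")
```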

  6. INSTRUMENTAL TOOLS FOR PROGRAM CODE DEVELOPMENT WRITTEN IN HIGH LEVEL PROGRAMMING LANGUAGE.

    Directory of Open Access Journals (Sweden)

    E.A. Alferov

    2010-11-01

    Full Text Available The paper presents an integrated environment for studying the course "Basics of Algorithmization and Programming" (http://weboap.ksu.ks.ua), which allows computational experiments to be carried out on the complexity and majorizability of sorting algorithms. We describe the design and development of a new version of the application. Much attention is paid to the development of the code editor component, which must meet the current requirements for program-writing tools.
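
    A minimal example of the kind of computational experiment such an environment supports, counting insertion-sort comparisons as the input size doubles; this is illustrative code, not the environment's own:

```python
import random

def insertion_sort_comparisons(a):
    """Count key comparisons made by insertion sort on a copy of `a`."""
    a, comparisons = list(a), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1
            if a[j - 1] <= a[j]:       # already in order: stop shifting
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return comparisons

random.seed(0)
for n in (100, 200, 400):   # doubling n roughly quadruples comparisons
    data = [random.random() for _ in range(n)]
    print(n, insertion_sort_comparisons(data))
```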

  7. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology-related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  8. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
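
    The essence of the methodology, fitting a single reduced-form regression to the output of many simulations, can be sketched as follows; the "CGE" data here are fabricated random draws, and the variable names are assumptions, not E-CAT's actual specification:

```python
import numpy as np

# Synthetic stand-in for CGE simulation output: three explanatory
# variables (threat size, duration, resilience) and a consequence measure.
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(500, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] - 1.2 * X[:, 2] + rng.normal(0, 0.05, 500)

# One least-squares fit turns the complex model's runs into a
# "reduced form" equation a non-specialist user can evaluate directly.
A = np.column_stack([np.ones(len(X)), X])          # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("reduced-form coefficients:", np.round(coef, 3))
```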

  9. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  10. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification capabilities. The current status of SBGAT is as follows. The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structures and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and the addition of new features; note that SBGAT Core can be utilized independently of SBGAT Gui. SBGAT is presently hosted in a public GitHub repository owned by SBGAT's main developer, accessible at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html; this documentation is constantly updated to reflect new functionality. SBGAT's user's manual is available at https://github.com/bbercovici/SBGAT/wiki and contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only analysis method currently implemented, so future work will consist of broadening SBGAT's capabilities with the spherical harmonics expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT: the software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  11. Program Analysis as Model Checking

    DEFF Research Database (Denmark)

    Olesen, Mads Chr.

    Software programs are proliferating throughout modern life, to a point where even the simplest appliances such as lightbulbs contain software, in addition to the software embedded in cars and airplanes. The correct functioning of these programs is therefore of the utmost importance, for the quality...... and sustenance of life. Due to the complexity inherent in the software it can be very difficult for the software developer to guarantee the absence of errors; automated support in the form of automated program analysis is therefore essential. Two methods have traditionally been proposed: model checking...... and abstract interpretation. Model checking views the program as a finite automaton and tries to prove logical properties over the automaton model, or present a counter-example if not possible — with a focus on precision. Abstract interpretation translates the program semantics into abstract semantics...
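
    The model-checking view can be made concrete with a minimal sketch (a toy example of our own, not the dissertation's framework): the program is abstracted to a finite automaton, and a safety property ("the error state is never reached") is checked by exhaustively exploring the states.

        from collections import deque

        # Finite automaton abstraction of a tiny program.
        transitions = {
            "init": ["locked"],
            "locked": ["critical", "init"],
            "critical": ["unlocked"],
            "unlocked": ["init"],
        }

        def reachable(start, bad):
            # Breadth-first exploration: the safety property holds
            # iff the bad state is never visited.
            seen, queue = {start}, deque([start])
            while queue:
                state = queue.popleft()
                if state == bad:
                    return True
                for nxt in transitions.get(state, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return False

        print(reachable("init", "error"))  # False: the property holds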

  12. The LTS timing analysis program :

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Darrell Jewell; Schwarz, Jens

    2013-08-01

    The LTS Timing Analysis program described in this report uses signals from the Tempest Lasers, Pulse Forming Lines, and Laser Spark Detectors to carry out calculations that quantify and monitor the performance of the Z-Accelerator's laser-triggered SF6 switches. The program analyzes Z-shots beginning with Z2457, when Laser Spark Detector data became available for all lines.

  13. VECT: an automatic visual Perl programming tool for nonprogrammers.

    Science.gov (United States)

    Chou, Hui-Hsien

    2005-04-01

    Modern high-throughput biological research produces enormous amounts of data that must be processed by computers, but many biologists dealing with these data are not professional programmers. Despite increased awareness of interdisciplinary training in bioinformatics, many biologists still find it difficult to create their own computational solutions. VECT, the Visual Extraction and Conversion Tool, has been developed to assist nonprogrammers in creating simple bioinformatics tools without having to master a programming language. VECT provides a unified graphical user interface for data extraction, data conversion, output composition, and Perl code generation. Programming using VECT is achieved by visually performing the desired data extraction, conversion, and output composition tasks using some sample user data. These tasks are then compiled by VECT into an executable Perl program, which can be saved for later use and can carry out the same computation independently of VECT. VECT is released under the GNU General Public License and is freely available for all major computing platforms including Macintosh OS X, Linux, and Microsoft Windows at www.complex.iastate.edu.
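
    The extract-convert-compose pattern that VECT compiles into Perl can be mimicked in miniature. The sketch below uses a hypothetical field layout, and Python rather than the Perl that VECT emits, to show the three stages on tab-separated records:

        # Miniature extract/convert/compose pipeline in the spirit of VECT.
        def process(lines):
            for line in lines:
                fields = line.rstrip("\n").split("\t")
                gene_id, raw_score = fields[0], fields[2]   # extraction
                score = float(raw_score) * 100              # conversion
                yield f"{gene_id}\tscore={score:.1f}"       # output composition

        sample = ["geneA\tchr1\t0.42\n", "geneB\tchr2\t0.07\n"]
        for record in process(sample):
            print(record)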

  14. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
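
    The bookkeeping at the heart of such a calculation, integrating a flux history over the mission to obtain a total fluence, can be sketched as follows (toy numbers, not the AE8/AP8 models or the continuous-slowing-down treatment):

        import numpy as np

        # Toy stand-in for an orbit-averaged omnidirectional flux history in
        # particles/cm^2/s; real values would come from AE8/AP8 via EWB.
        t = np.linspace(0.0, 5 * 365.25 * 86400, 1000)            # 5-year mission, s
        flux = 1e4 * (1 + 0.3 * np.sin(2 * np.pi * t / 86400))    # daily modulation

        # The total mission fluence is the time integral of the flux.
        fluence = np.trapz(flux, t)
        print(f"mission fluence ~ {fluence:.3e} particles/cm^2")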

  15. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run on UNIX-like operating systems. Graphical output is often not provided, or is supposed to be produced with other command line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as for regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data, enabling GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate the output of different WGA analysis programs, among them GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  16. R data analysis without programming

    CERN Document Server

    Gerbing, David W

    2013-01-01

    This book prepares readers to analyze data and interpret statistical results using R more quickly than other texts. R is a challenging program to learn because code must be created to get started. To alleviate that challenge, Professor Gerbing developed lessR. The lessR extensions remove the need to program. By introducing R through lessR, readers learn how to organize data for analysis, read the data into R, and produce output without performing numerous functions and programming exercises first. With lessR, readers can select the necessary procedure and change the relevant variables without pro

  17. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering becomes more informative when previous data are merged with data obtained from public databases, resulting in an accurate list of proteins which meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application intended to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers, regardless of programming experience, to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening by using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
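
    The annotation-based selection described, keeping only proteins whose accessions carry a given Gene Ontology term, reduces to a set lookup. A minimal sketch with hypothetical accessions and GO assignments (not msBiodat's actual query engine):

        # MS hits keyed by UniProt accession (accession -> abundance).
        ms_hits = {"P12345": 8.2, "Q67890": 3.1, "P55555": 12.7}

        # Hypothetical annotation table (accession -> set of GO terms).
        go_annotations = {
            "P12345": {"GO:0005739", "GO:0008152"},
            "Q67890": {"GO:0005634"},
            "P55555": {"GO:0005739"},
        }

        def filter_by_go(hits, annotations, term):
            # Keep only the hits annotated with the requested GO term.
            return {acc: v for acc, v in hits.items()
                    if term in annotations.get(acc, set())}

        print(filter_by_go(ms_hits, go_annotations, "GO:0005739"))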

  18. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  19. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab(R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  20. SPPTOOLS: Programming tools for the IRAF SPP language

    Science.gov (United States)

    Fitzpatrick, M.

    1992-01-01

    An IRAF package to assist in SPP code development and debugging is described. SPP is the machine-independent programming language used by virtually all IRAF tasks. Tools have been written to aid both novice and advanced SPP programmers with development and debugging by providing tasks to check the code for the number and type of arguments in all calls to IRAF VOS library procedures, list the calling sequences of IRAF tasks, create a database of identifiers for quick access, check for memory which is not freed, and a source code formatter. Debugging is simplified since the programmer is able to get a better understanding of the structure of his/her code, and IRAF library procedure calls (probably the most common source of errors) are automatically checked for correctness.
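
    Checking the number of arguments in library procedure calls, the first task listed, amounts to matching each call against a signature database. A minimal sketch in Python with toy signatures (not the actual IRAF VOS tables):

        import re

        # Toy signature database: procedure name -> expected argument count.
        signatures = {"strcpy": 2, "imgl2r": 2, "sfree": 1}

        call_pattern = re.compile(r"(\w+)\s*\(([^)]*)\)")

        def check_calls(source):
            for name, arglist in call_pattern.findall(source):
                if name in signatures:
                    nargs = len([a for a in arglist.split(",") if a.strip()])
                    if nargs != signatures[name]:
                        print(f"{name}: expected {signatures[name]} args, found {nargs}")

        check_calls("call strcpy(out, in)\ncall sfree(sp, extra)")  # flags sfree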

  1. Field Assessment of Energy Audit Tools for Retrofit Programs

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, J. [Univ. of Minnesota, St. Paul, MN (United States); Bohac, D. [Univ. of Minnesota, St. Paul, MN (United States); Nelson, C. [Univ. of Minnesota, St. Paul, MN (United States); Smith, I. [Univ. of Minnesota, St. Paul, MN (United States)

    2013-07-01

    This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home’s energy performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Rating systems based on energy performance models, the focus of this report, can establish a home’s achievable energy efficiency potential and provide a quantitative assessment of energy savings after retrofits are completed, although their accuracy needs to be verified by actual measurement or billing data. Ratings can also show homeowners where they stand compared to their neighbors, thus creating social pressure to conform to or surpass others. This project field-tested three different building performance models of varying complexity, in order to assess their value as rating systems in the context of a residential retrofit program: Home Energy Score, SIMPLE, and REM/Rate.

  2. PyRAT - python radiography analysis tool (u)

    Energy Technology Data Exchange (ETDEWEB)

    Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory; Armstrong, Jerawan C [Los Alamos National Laboratory

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on Linux and Windows platforms. It is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process, and it utilizes the NOMAD mixed-variable optimization tool to perform the optimization.
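
    A nonlinear inversion of this general kind can be sketched with a generic least-squares solver (a toy forward model of our own, not PyRAT's NOMAD-based optimization): recover a 1-D attenuation profile from noisy transmission measurements.

        import numpy as np
        from scipy.optimize import least_squares

        rng = np.random.default_rng(1)
        dx = 0.1
        mu_true = np.array([0.0, 0.5, 0.5, 1.2, 0.5, 0.0])   # 1-D attenuation profile

        def forward(mu):
            # Transmission through successive slabs: exp(-cumulative attenuation).
            return np.exp(-np.cumsum(mu) * dx)

        data = forward(mu_true) + rng.normal(0, 0.002, mu_true.size)

        fit = least_squares(lambda mu: forward(mu) - data,
                            x0=np.full(mu_true.size, 0.1), bounds=(0.0, np.inf))
        print(np.round(fit.x, 2))   # close to mu_true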

  3. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  4. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  5. Tools for integrated sequence-structure analysis with UCSF Chimera.

    Science.gov (United States)

    Meng, Elaine C; Pettersen, Eric F; Couch, Gregory S; Huang, Conrad C; Ferrin, Thomas E

    2006-07-12

    Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is available for Microsoft Windows, Apple Mac OS X

  6. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  7. Online learning tools in an M.Ed. in Earth Sciences program

    Science.gov (United States)

    Richardson, E.

    2011-12-01

    Penn State's Master of Education in Earth Sciences program is a fully online 30-credit degree program serving mid-career secondary science teachers. Teachers in the program have a diverse background in science and math, are usually many years removed from their most recent degree, and are often deficient in the same geoscience skills as are beginning undergraduates. For example, they habitually assign incorrect causal relationships to concepts that are taught at the same time (such as sea-floor spreading and magnetic field reversals), and they have trouble with both object and spatial visualization. Program faculty also observe anecdotally that many teachers enter the program lacking the ability to describe their mental model of a given Earth science process, making it difficult to identify teachers' knowledge gaps. We have implemented many technical strategies to enhance program content delivery while trying to minimize the inherent barriers to completing quantitative assignments online and at a distance. These barriers include competence with and access to sophisticated data analysis and plotting programs commonly used by scientists. Here, I demonstrate two technical tools I use frequently to strengthen online content delivery and assessment. The first, Jing, is commercially available, free, and platform-independent. Jing allows the user to make screencasts with narration and embed them into a web page as a Flash movie or as an external link. The second is a set of simple sketching tools I have created using the programming language Processing, which is a free, open source, platform-independent language built on Java. The integration of easy-to-use drawing tools into problem sets and other assessments has enabled faculty to appraise a learner's grasp of the material without the steep technical learning curve and expense inherent in most computer graphics packages. A serendipitous benefit of teaching with these tools is that they are easy to learn and freely

  8. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design phase of the product development process.

  9. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Full Text Available Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with graphical processing units, have broadly empowered parallelism. Compilers are being updated to address the emerging challenges of synchronization and threading. Appropriate program and algorithm classification can be of great advantage to software engineers seeking opportunities for effective parallelization. In the present work we investigate current approaches to the classification of algorithms into species; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen that matches the structure of different issues and performs a given task. We tested these algorithms using existing automatic species-extraction tools along with the Bones compiler. We added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants and mathematical functions. With this, we can retain significant data that are not captured by the original species of algorithms. We implemented these new capabilities in the tool, enabling automatic characterization of program code.

  10. IMPROVING EXPERIMENT DESIGN SKILLS: USING THE JOKO TINGKIR PROGRAM AS A LEARNING TOOL OF TSUNAMI TOPIC

    Directory of Open Access Journals (Sweden)

    Madlazim

    2014-07-01

    Full Text Available Students are rarely given an opportunity to think deeply about experimental design or asked to develop experimental skills on their own. Without participating in these endeavors, they are often unaware of the many decisions necessary to construct a precise methodology. This article describes the Joko Tingkir program, an early-warning tsunami tool, and how we have used it as a learning tool for physics teacher candidates to improve their experimental design skills. The Joko Tingkir computer program implements a Tsunami Faulting Model (TFM). The TFM uses the principle that tsunami potential is affected by the length and width of the earthquake rupture, both of which can be represented by the duration of rupture (Tdur), or the duration exceeding 50 seconds (T50Ex), together with the dominant period (Td). When students are given a simple method using the Joko Tingkir program - such as the tutorial, observation of the seismic station distribution, seismograms of the earthquake, equipment and software for the experiment, measurement of the P onset time, and determination of Tdur, Td and T50Ex - they can focus exclusively on improving experimental design skills, as indicated by significantly improved gain scores. Based on the gain analysis it can be inferred that experimental design skills can be improved by implementing the Joko Tingkir program as a learning tool for tsunami warning in the learning process.
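
    The decision logic behind such duration/period discriminants can be written in a few lines; the thresholds below are illustrative assumptions, not Joko Tingkir's calibrated values.

        def tsunami_warning(t_dur, t50ex, td):
            # Long rupture duration and a long dominant period suggest a large,
            # slow, shallow rupture, and hence tsunami potential (toy thresholds).
            return (t_dur > 65.0 or t50ex > 1.0) and td > 10.0

        print(tsunami_warning(t_dur=120.0, t50ex=1.4, td=12.5))  # True: issue warning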

  11. The DataTools Professional Development Program: Sustainability via a University Partnership

    Science.gov (United States)

    Haddad, N.; Ledley, T. S.; McAuliffe, C. A.; Reider, D.

    2009-12-01

    The DataTools professional development program (http://serc.carleton.edu/eet/msdatatools), offered by TERC, helps teachers integrate technology, scientific data, and inquiry into their middle and high school curricula. It leverages the resources and techniques of the Earth Exploration Toolbook (http://serc.carleton.edu/eet), an online collection of investigations that promotes the use of technology and scientific data in the context of studying the earth system. Over the course of the year-long program, teachers develop skills and a pedagogy of inquiry through a combination of on-line and face-to-face professional development and a significant amount of peer support. They learn to use information technologies that support the visualization and analysis of numerical, geospatial, and image data. DataTools was funded by NSF’s ITEST program to operate for three years. During year two we started to investigate the possibility of transforming the program into a graduate-level course at the University of Massachusetts, Dartmouth (UMD). The first step in that process was partnering with UMD to offer the third year of the NSF-funded program as a 3-credit graduate course on a 1-year trial basis. Our UMD partner participated in advertising the program to teachers in its network, provided classroom space at UMD for the face-to-face meetings and summer workshop, and offered three graduate credits to teachers who successfully completed the program. TERC staff continued to provide the professional development. The formation of the School for Education, Public Policy, and Civic Engagement at UMD, and the new STEM Department within that school appear to be favoring the transformation of this NSF-funded program into a sustainable graduate level course for in-service teachers. A key element to developing a sustainable course at a large university is to position it in a way that can service the largest number of students. In addition to the tremendous need of science professional

  12. HANSIS software tool for the automated analysis of HOLZ lines

    Energy Technology Data Exchange (ETDEWEB)

    Holec, D., E-mail: david.holec@unileoben.ac.at [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom); Sridhara Rao, D.V.; Humphreys, C.J. [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom)

    2009-06-15

    A software tool, named HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between HOLZ intersections can be measured and the data can be presented graphically with a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns.

  13. chemalot and chemalot_knime: Command line programs as workflow tools for drug discovery.

    Science.gov (United States)

    Lee, Man-Ling; Aliagas, Ignacio; Feng, Jianwen A; Gabriel, Thomas; O'Donnell, T J; Sellers, Benjamin D; Wiswedel, Bernd; Gobbi, Alberto

    2017-06-12

    Analyzing files containing chemical information is at the core of cheminformatics. Each analysis may require a unique workflow. This paper describes the chemalot and chemalot_knime open source packages. Chemalot is a set of command line programs with a wide range of functionalities for cheminformatics. The chemalot_knime package allows command line programs that read and write SD files from stdin and to stdout to be wrapped into KNIME nodes. The combination of chemalot and chemalot_knime not only facilitates the compilation and maintenance of sequences of command line programs but also allows KNIME workflows to take advantage of the compute power of a LINUX cluster. Use of the command line programs is demonstrated in three different workflow examples: (1) a workflow to create a data file with project-relevant data for structure-activity or property analysis and other types of investigation, (2) the creation of a quantitative structure-property-relationship model using the command line programs via KNIME nodes, and (3) the analysis of strain energy in small molecule ligand conformations from the Protein Data Bank database. The chemalot and chemalot_knime packages provide lightweight and powerful tools for many tasks in cheminformatics. They are easily integrated with other open source and commercial command line tools and can be combined to build new and even more powerful tools. The chemalot_knime package facilitates the generation and maintenance of user-defined command line workflows, taking advantage of the graphical design capabilities in KNIME. Graphical abstract: Example KNIME workflow with chemalot nodes and the corresponding command line pipe.
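
    The stdin/stdout contract that lets such programs chain in a LINUX pipe can be illustrated with a toy SD-file filter; this Python sketch with a hypothetical MW data tag is not one of the actual chemalot programs.

        import sys

        def filter_sdf(stream, max_mw, out):
            # Pass through only SD records whose MW tag is below the cutoff.
            record = []
            for line in stream:
                record.append(line)
                if line.strip() == "$$$$":                 # SD record terminator
                    text = "".join(record)
                    if "<MW>" in text:
                        mw = float(text.split("<MW>")[1].split("\n")[1])
                        if mw <= max_mw:
                            out.write(text)
                    record = []

        if __name__ == "__main__":
            # Usage in a pipe: cat in.sdf | python filter.py > out.sdf
            filter_sdf(sys.stdin, max_mw=500.0, out=sys.stdout)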

  14. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in the Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
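
    One of the simplest distortion measurements, the total harmonic distortion of a memoryless nonlinearity driven by a sine, can be computed along these lines (a generic illustration, not the toolkit's own algorithms):

        import numpy as np

        fs, f0, n = 48000, 1000, 48000
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(2 * x)                    # memoryless "tube-like" nonlinearity

        spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
        bins = [round(k * f0 * n / fs) for k in range(1, 6)]  # fundamental + 4 harmonics
        amps = [spectrum[b] for b in bins]
        thd = np.sqrt(sum(a ** 2 for a in amps[1:])) / amps[0]
        print(f"THD ~ {100 * thd:.1f}%")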

  15. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistle blowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, removing any contents that may have been hidden, and removing any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available for the general public. The results presented in this work can also be seen as a useful

  16. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  17. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
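
    The FDTD core in both records is a leapfrog update of interleaved field components. A minimal one-dimensional vacuum example in normalized units (Cartesian for brevity, whereas the paper derives spherical-coordinate equations from first principles):

        import numpy as np

        nz, nt = 200, 500
        ez = np.zeros(nz)         # electric field samples
        hy = np.zeros(nz - 1)     # magnetic field, staggered half a cell
        s = 0.5                   # Courant number (normalized units)

        for n in range(nt):
            hy += s * np.diff(ez)                           # update H from the curl of E
            ez[1:-1] += s * np.diff(hy)                     # update E from the curl of H
            ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source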

  18. Buffer$--An Economic Analysis Tool

    Science.gov (United States)

    Gary Bentrup

    2007-01-01

    Buffer$ is an economic spreadsheet tool that resource professionals can use to analyze the costs and benefits of conservation buffers. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...

  19. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  20. aRMSD: A Comprehensive Tool for Structural Analysis.

    Science.gov (United States)

    Wagner, Arne; Himmel, Hans-Jörg

    2017-03-27

    A new free tool for structural comparison is presented that combines existing and new features into a single software package. aRMSD incorporates the functions of establishing a pairwise correlation between the atoms of two molecular structures and the calculation of the optimal rotation matrix that minimizes the root-mean-square deviation (RMSD) between the molecules. The complexity of the Hungarian assignment problem is reduced by decomposing molecules into different subsets based on different atom or group types, allowing for an efficient and robust treatment of large molecules while tolerating different substituents. Various weighting functions can be used for the calculation of RMSD values and similarity descriptors, and the utilization of coordinate uncertainties allows for the calculation of standard deviations for all calculated properties through error propagation. A new three-dimensional (3D) graphical representation that combines multiple aspects of structural information is presented, which is useful in the analysis of structural similarity and diversity. The capabilities of aRMSD are demonstrated by selected examples that show how the program can be utilized in the analysis of structural changes and in the correlation of structure and activity in molecules. The source code of the program can be downloaded at https://github.com/armsd/aRMSD.
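
    The optimal rotation that minimizes the RMSD between two matched coordinate sets is classically obtained from a singular value decomposition (the Kabsch algorithm). A compact sketch of that core step, written as our own illustration rather than taken from aRMSD's source:

        import numpy as np

        def kabsch_rmsd(P, Q):
            # P, Q: (n, 3) arrays of matched atom coordinates.
            P = P - P.mean(axis=0)
            Q = Q - Q.mean(axis=0)
            U, _, Vt = np.linalg.svd(P.T @ Q)
            d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            return np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1)))

        # A rigid rotation of the same points gives an RMSD of ~0.
        P = np.eye(3)
        a = 0.3
        Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a), np.cos(a), 0.0],
                       [0.0, 0.0, 1.0]])
        print(kabsch_rmsd(P @ Rz.T, P))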

  1. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperatures and pressures in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes in any initial geometric out-of-straightness, thus the modeling of the as-laid state of the pipeline is an important part of the design process. Recently, dedicated finite element programs have been created, making modeling of the offshore environment more convenient than has been the case with general-purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general-purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)

  2. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  3. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
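
    At its simplest, failure propagation over a digraph is graph reachability. The sketch below uses toy node names and plain OR propagation, without the redundancy (AND) logic a real FEAT digraph encodes, to highlight every node affected by a chosen failure set:

        from collections import deque

        # Toy digraph: an edge a -> b means "failure of a propagates to b".
        digraph = {
            "power_bus": ["pump_A", "pump_B"],
            "pump_A": ["coolant_loop"],
            "pump_B": ["coolant_loop"],
            "coolant_loop": ["reactor_temp"],
        }

        def propagate(failures):
            # Breadth-first propagation of a set of failure events.
            affected, queue = set(failures), deque(failures)
            while queue:
                node = queue.popleft()
                for downstream in digraph.get(node, []):
                    if downstream not in affected:
                        affected.add(downstream)
                        queue.append(downstream)
            return affected

        print(sorted(propagate({"power_bus"})))  # everything downstream of the bus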

  4. PLAGIARISM DETECTION PROBLEMS AND ANALYSIS SOFTWARE TOOLS FOR ITS SOLVE

    Directory of Open Access Journals (Sweden)

    V. I. Shynkarenko

    2017-02-01

    Full Text Available Purpose. This study is aimed at: 1) defining plagiarism in texts in formal and natural languages and building a taxonomy of plagiarism; 2) identifying the major problems of plagiarism detection when using automated tools to solve them; 3) analyzing and systematizing the information obtained during the review, testing and analysis of existing detection systems. Methodology. To identify the requirements for plagiarism-detection software, methods of analysis of normative documentation (the legislative base) and of competitive tools are applied. To check the requirements, testing methods and reviews of GUI interfaces are used. Findings. The paper considers the concept of plagiarism and the issues of its proliferation and classification. A review of existing systems for identifying plagiarism is given, covering desktop applications and online resources. Their functional characteristics are highlighted, the formats of the input and output data and the constraints on them are determined, as are customization features and access. A detailed elaboration of system requirements is made. Originality. The authors propose schemes that complement the existing hierarchical taxonomy of plagiarism. The analysis of existing systems is done in terms of functionality and the possibilities for use with large amounts of data. Practical value. The practical significance is determined by the breadth of the plagiarism problem in various fields. In Ukraine, the legal framework for the fight against plagiarism is developing, which requires active work on the development, improvement and delivery of relevant software. This work contributes to the solution of these problems. The review of existing anti-plagiarism programs, together with the study of research experience in the field and the updated concept of plagiarism, allows one to articulate more fully the functional and performance requirements and the input and output of the developed software, and to identify the features of such software. The article focuses on the features of solving the

  5. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
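
    The flavor of such a likelihood ratio comparison can be conveyed with a toy sketch, using hypothetical similarity scores and simple normal models rather than the statistical machinery developed in the thesis:

        import numpy as np
        from scipy.stats import norm

        # Hypothetical similarity scores, e.g. correlations of profilometry traces.
        same_tool = np.array([0.91, 0.88, 0.93, 0.90, 0.89])   # lab mark vs lab mark
        diff_tool = np.array([0.42, 0.55, 0.38, 0.47, 0.51])   # marks from other tools
        questioned = 0.87                                      # field mark vs suspect tool

        # How much more likely is the questioned score under the "same tool"
        # model than under the "different tool" model?
        lr = (norm.pdf(questioned, same_tool.mean(), same_tool.std(ddof=1))
              / norm.pdf(questioned, diff_tool.mean(), diff_tool.std(ddof=1)))
        print(f"likelihood ratio ~ {lr:.2g}")   # values >> 1 support "same tool"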

  6. Source Code Review Using Static Analysis Tools

    OpenAIRE

    Moiras, Stavros; Lüders, Stefan; Tsouvelekakis, Aimilios

    2015-01-01

    Abstract Many teams at CERN develop their own software to solve their tasks. This software may be public or it may be used for internal purposes. It is of major importance for developers to know that their software is secure. Humans are able to detect bugs and vulnerabilities, but it is impossible to discover everything when they need to read hundreds of lines of code. As a result, computer scientists have developed tools which complete efficiently and within minut...

  7. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
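
    A one-dimensional transmission line model of the kind mentioned computes a channel's input impedance from its length and termination. A generic lossless-line sketch using the standard textbook formula (not the patented tool's implementation):

        import numpy as np

        def input_impedance(z_load, z0, k, length):
            # Lossless line: Zin = Z0 (ZL + j Z0 tan(kL)) / (Z0 + j ZL tan(kL))
            t = np.tan(k * length)
            return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

        f, c, rho = 1000.0, 343.0, 1.21       # frequency, speed of sound, air density
        k, z0 = 2 * np.pi * f / c, rho * c
        # A rigid-backed channel (very large ZL) behaves like Zin = -j Z0 / tan(kL).
        print(input_impedance(1e12, z0, k, length=0.05))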

  8. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  9. PerfAndPubTools - Tools for Software Performance Analysis and Publishing of Results

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2016-05-01

    Full Text Available PerfAndPubTools consists of a set of MATLAB/Octave functions for the post-processing and analysis of software performance benchmark data and producing associated publication quality materials.

  10. Genomic tools in cowpea breeding programs: status and perspectives

    Directory of Open Access Journals (Sweden)

    Ousmane Boukar

    2016-06-01

    Full Text Available Cowpea is one of the most important grain legumes in sub-Saharan Africa (SSA). It provides strong support to the livelihood of small-scale farmers through its contributions to their nutritional security, income generation and soil fertility enhancement. Worldwide, about 6.5 million metric tons of cowpea are produced annually on about 14.5 million hectares. The low productivity of cowpea is attributable to numerous abiotic and biotic constraints. The abiotic stress factors comprise drought, low soil fertility, and heat, while biotic constraints include insects, diseases, parasitic weeds and nematodes. Cowpea farmers also have limited access to quality seeds of improved varieties for planting. Some progress has been made through conventional breeding at international and national research institutions in the last three decades. Cowpea improvement could also benefit from modern breeding methods based on molecular genetic tools. A number of advances in cowpea genetic linkage maps, and quantitative trait loci associated with some desirable traits such as resistance to Striga, Macrophomina, Fusarium wilt, bacterial blight, root-knot nematodes, aphids and foliar thrips, have been reported. An improved consensus genetic linkage map has been developed and used to identify QTLs of additional traits. In order to take advantage of these developments, single nucleotide polymorphism (SNP) genotyping is being streamlined to establish an efficient workflow supported by genotyping support service (GSS)-client interactions. About 1100 SNPs mapped on the cowpea genome were converted by LGC Genomics to KASP assays. Several cowpea breeding programs have been exploiting these resources to implement molecular breeding, especially for MARS and MABC, to accelerate cowpea variety improvement. The combination of conventional breeding and molecular breeding strategies, with workflow managed through the CGIAR breeding management system (BMS), promises an increase in the number of

  11. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  12. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build/test process. In addition, we created approximately 3,000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper; here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  13. Plant databases and data analysis tools

    Science.gov (United States)

    It is anticipated that the coming years will see the generation of large datasets including diagnostic markers in several plant species with emphasis on crop plants. To use these datasets effectively in any plant breeding program, it is essential to have the information available via public database...

  14. The nitrogen footprint tool network: a multi-institution program ...

    Science.gov (United States)

    Anthropogenic sources of reactive nitrogen have local and global impacts on air and water quality and detrimental effects on human and ecosystem health. This paper uses the nitrogen footprint tool (NFT) to determine the amount of nitrogen (N) released as a result of institutional consumption. The sectors accounted for include food (consumption and the upstream production), energy, transportation, fertilizer, research animals, and agricultural research. The NFT is then used for scenario analysis to manage and track reductions to institution N footprints, which are driven by the consumption behaviors of both the institution itself and its constituent individuals. In this paper, the first seven institution N footprint results are presented. The institution NFT network aims to develop footprints for many institutions to encourage widespread upper-level management strategies that will create significant reductions in reactive N released to the environment. Energy use and food purchases are the two largest contributors to institution N footprints. Ongoing efforts by institutions to reduce greenhouse gas emissions also help to reduce the N footprint, but the impact of food production on N pollution has not been directly addressed by the higher-ed sustainability community. The NFT Network found that institutions could reduce their N footprints by optimizing food purchasing to reduce consumption of animal products and minimize food waste, as well as by reducing dependence on ...

  15. XML Graphs in Program Analysis

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael Ignatieff

    2007-01-01

    XML graphs have been shown to be a simple and effective formalism for representing sets of XML documents in program analysis. The formalism has evolved through a six-year period with variants tailored for a range of applications. We present a unified definition, outline the key properties including validation...... of XML graphs against different XML schema languages, and provide a software package that enables others to make use of these ideas. We also survey four very different applications: XML in Java, Java Servlets and JSP, transformations between XML and non-XML data, and XSLT....

  16. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and with regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. Our objective was to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  17. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2014-02-03

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggests that IPMP 2013 can be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology. Published by Elsevier B.V.
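
    As a rough illustration of the kind of curve fitting IPMP 2013 automates, the sketch below fits a three-parameter logistic growth model to synthetic data with SciPy. The parameterization is a common textbook form and is an assumption; IPMP 2013's exact model forms may differ.

        # Sketch: fit a three-parameter logistic growth model y(t) = c / (1 + exp(a - b*t))
        # to (time, log10 count) data. Parameterization is illustrative, not IPMP's exact form.
        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, a, b, c):
            return c / (1.0 + np.exp(a - b * t))

        rng = np.random.default_rng(0)
        t = np.linspace(0, 12, 13)                            # hours
        y = logistic(t, 2.0, 0.8, 9.0) + rng.normal(0, 0.1, t.size)  # synthetic log10 CFU/g

        popt, pcov = curve_fit(logistic, t, y, p0=[1.0, 0.5, 8.0])
        perr = np.sqrt(np.diag(pcov))                         # standard errors
        print("a, b, c =", popt, "+/-", perr)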

  18. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels, or minimal steps, to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K ...

  19. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  20. Genome-tools: a flexible package for genome sequence analysis.

    Science.gov (United States)

    Lee, William; Chen, Swaine L

    2002-12-01

    Genome-tools is a Perl module, a set of programs, and a user interface that facilitates access to genome sequence information. The package is flexible, extensible, and designed to be accessible and useful to both nonprogrammers and programmers. Any relatively well-annotated genome available with standard GenBank genome files may be used with genome-tools. A simple Web-based front end permits searching any available genome with an intuitive interface. Flexible design choices also make it simple to handle revised versions of genome annotation files as they change. In addition, programmers can develop cross-genomic tools and analyses with minimal additional overhead by combining genome-tools modules with newly written modules. Genome-tools runs on any computer platform for which Perl is available, including Unix, Microsoft Windows, and Mac OS. By simplifying the access to large amounts of genomic data, genome-tools may be especially useful for molecular biologists looking at newly sequenced genomes, for which few informatics tools are available. The genome-tools Web interface is accessible at http://genome-tools.sourceforge.net, and the source code is available at http://sourceforge.net/projects/genome-tools.
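
    Genome-tools itself is a Perl package; as a hedged illustration of the kind of GenBank lookup it provides, the sketch below uses Biopython (a separate, comparable library), with the file name and query gene as placeholders.

        # Sketch: scan a standard GenBank genome file for a gene of interest.
        # Uses Biopython's SeqIO; "genome.gb" and the gene name are placeholders.
        from Bio import SeqIO

        record = SeqIO.read("genome.gb", "genbank")     # one annotated genome record
        for feature in record.features:
            if feature.type == "gene":
                names = feature.qualifiers.get("gene", [])
                if "recA" in names:                      # hypothetical query gene
                    print(names, feature.location)
                    print(feature.extract(record.seq)[:60], "...")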

  1. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  2. Software Tool for Real-Time Power Quality Analysis

    OpenAIRE

    CZIKER, A. C.; CHINDRIS, M. D.; Miron, A

    2013-01-01

    A software tool dedicated to the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument, which uses innovative algorithms based on time- and frequency-domain analysis to process power signals. In order to detect the temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Considering that a signal recov...

  3. Learning SQL Programming with Interactive Tools: From Integration to Personalization

    Science.gov (United States)

    Brusilovsky, Pete; Sosnovsky, Sergey; Yudelson, Michael V.; Lee, Danielle H.; Zadorozhny, Vladimir; Zhou, Xin

    2010-01-01

    Rich, interactive eLearning tools receive a lot of attention nowadays from both practitioners and researchers. However, broader dissemination of these tools is hindered by the technical difficulties of their integration into existing platforms. This article explores the technical and conceptual problems of using several interactive educational…

  4. Data Analysis Tools for Visualization Study

    Science.gov (United States)

    2015-08-01

    This report describes data analysis tools built around a structured query language (SQL) relational database, with PostgreSQL as the SQL application, called from Python programs running on a Linux Red Hat system. The surviving excerpts describe a statistical routine that determines whether two means are statistically different by computing the tail probability of the t distribution (a lower probability meaning the means are more likely to differ) and reporting two-tailed 95% confidence intervals for the means.
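
    The fragments above point to a standard two-sample comparison; a minimal SciPy sketch of that computation, with placeholder data standing in for values pulled from the PostgreSQL database, might look as follows.

        # Sketch: two-tailed t-test and 95% confidence interval for a sample mean,
        # the statistics the report fragments describe. Data values are placeholders.
        import numpy as np
        from scipy import stats

        a = np.array([12.1, 11.8, 12.5, 12.0, 11.9])
        b = np.array([12.9, 13.1, 12.7, 13.3, 12.8])

        tcalc, prob = stats.ttest_ind(a, b)            # prob = two-tailed p-value
        print(f"t = {tcalc:.3f}, two-tailed p = {prob:.4f}")

        # 95% confidence interval for the mean of sample a
        ci = stats.t.interval(0.95, a.size - 1,
                              loc=a.mean(), scale=stats.sem(a))
        print("95% CI for mean:", ci)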

  5. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
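
    The ordering-and-sweep scheme described above resembles a topological sort over inlet dependencies followed by a time-stepping loop. The Python sketch below is illustrative only (PCTAP is C++, and the component names and the placeholder update rule are invented):

        # Sketch: build a "solution vector" of components ordered by inlet dependency,
        # then update each component's outlet state every time step.
        from graphlib import TopologicalSorter

        # component -> set of components whose outlets feed its inlet (hypothetical)
        inlet_deps = {"tank": set(), "tube1": {"tank"},
                      "coldplate": {"tube1"}, "tube2": {"coldplate"}}
        solution_vector = list(TopologicalSorter(inlet_deps).static_order())

        state = {name: 20.0 for name in solution_vector}   # e.g. fluid temperature, C
        for step in range(3):                              # time-step loop
            for name in solution_vector:                   # inlet-dependency order
                upstream = inlet_deps[name]
                inlet = state[next(iter(upstream))] if upstream else 20.0
                state[name] = inlet + 0.5                  # placeholder outlet update
            print(f"step {step}:", state)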

  6. Intergenerational Programming in Extension: Needs Assessment as Planning Tool.

    Science.gov (United States)

    Kaplan, Matthew; Liu, Shih-Tsen; Radhakrishna, Rama B.

    2003-01-01

    A needs assessment sent to 161 Extension educators in family and consumer science and 4-H/youth development received 28 responses indicating preferences regarding intergenerational program content and delivery format. Results were used to develop curriculum and program delivery strategies and to begin planning for a statewide intergenerational program.…

  7. Understanding Tools and Practices for Distributed Pair Programming

    NARCIS (Netherlands)

    Schümmer, T.; Lukosch, S.G.

    2009-01-01

    When considering the principles of eXtreme Programming, distributed eXtreme Programming, and especially distributed pair programming, appears to be a paradox destined to fail. However, global software development as well as the outsourcing of software development are integral parts of software projects.

  8. Dynamic Characteristics Analysis of a Micro Grinding Machine Tool

    OpenAIRE

    Li Wei; Li Beizhi; Yang Jianguo

    2017-01-01

    Developing miniaturized ultraprecision machine tools (MUMTs) is a rational approach to manufacturing micro/mesoscale mechanical components. The dynamic characteristics of MUMTs can differ from those of conventional machine tools because of their miniaturized structure, downsized components and more flexible assembly, so their dynamic behavior needs to be studied. In this paper, the dynamic characteristics of a vertical ultraprecision micro grinding machine tool were analyzed. An analysis model wa...

  9. Dynamic Characteristics Analysis of a Micro Grinding Machine Tool

    Directory of Open Access Journals (Sweden)

    Li Wei

    2017-01-01

    Full Text Available Developing miniaturized ultraprecision machine tools (MUMTs) is a rational approach to manufacturing micro/mesoscale mechanical components. The dynamic characteristics of MUMTs can differ from those of conventional machine tools because of their miniaturized structure, downsized components and more flexible assembly, so their dynamic behavior needs to be studied. In this paper, the dynamic characteristics of a vertical ultraprecision micro grinding machine tool were analyzed. An analysis model was established based on the finite element method (FEM). Modal analysis was then conducted to obtain the vibration mode characteristics of the machine tool, and harmonic response analysis was applied to the machine tool to verify its mechanical behavior. This work helps MUMT developers make a better product at the early design stage, with lower cost and development time.

  10. New Tools in Nonlinear System Analysis

    National Research Council Canada - National Science Library

    Megretski, Alexandre

    2003-01-01

    This project was aimed at developing novel theories for the analysis and design of systems exhibiting essentially nonlinear behavior, such as systems utilizing quantized decision making, periodic orbits, switching, etc...

  11. Immunoglobulin analysis tool: a novel tool for the analysis of human and mouse heavy and light chain transcripts.

    Science.gov (United States)

    Rogosch, Tobias; Kerzel, Sebastian; Hoi, Kam Hon; Zhang, Zhixin; Maier, Rolf F; Ippolito, Gregory C; Zemlin, Michael

    2012-01-01

    Sequence analysis of immunoglobulin (Ig) heavy and light chain transcripts can refine categorization of B cell subpopulations and can shed light on the selective forces that act during immune responses or immune dysregulation, such as autoimmunity, allergy, and B cell malignancy. High-throughput sequencing yields Ig transcript collections of unprecedented size. The authoritative web-based IMGT/HighV-QUEST program is capable of analyzing large collections of transcripts and provides annotated output files to describe many key properties of Ig transcripts. However, additional processing of these flat files is required to create figures, or to facilitate analysis of additional features and comparisons between sequence sets. We present an easy-to-use Microsoft® Excel®-based software, named Immunoglobulin Analysis Tool (IgAT), for the summary, interrogation, and further processing of IMGT/HighV-QUEST output files. IgAT generates descriptive statistics and high-quality figures for collections of murine or human Ig heavy or light chain transcripts ranging from 1 to 150,000 sequences. In addition to traditionally studied properties of Ig transcripts - such as the usage of germline gene segments, or the length and composition of the CDR-3 region - IgAT also uses published algorithms to calculate the probability of antigen selection based on somatic mutational patterns, the average hydrophobicity of the antigen-binding sites, and predictable structural properties of the CDR-H3 loop according to Shirai's H3-rules. These refined analyses provide in-depth information about the selective forces acting upon Ig repertoires and allow the statistical and graphical comparison of two or more sequence sets. IgAT is easy to use on any computer running Excel® 2003 or higher. Thus, IgAT is a useful tool to gain insights into the selective forces and functional properties of small to extremely large collections of Ig transcripts, thereby assisting a researcher to mine a data set ...
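
    A flavor of the post-processing IgAT performs can be sketched with pandas. IgAT itself is Excel-based; the file name and column label below follow common IMGT/HighV-QUEST summary-file conventions but should be treated as assumptions.

        # Sketch: tabulate V-gene segment usage from an IMGT/HighV-QUEST summary file.
        # File name and column label are assumptions about the flat-file layout.
        import pandas as pd

        df = pd.read_csv("1_Summary.txt", sep="\t")
        usage = (df["V-GENE and allele"]
                 .str.split(",").str[0]          # keep the top-scoring assignment
                 .str.split("*").str[0]          # drop the allele suffix
                 .value_counts())
        print(usage.head(10))                    # most frequently used V genes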

  12. Application of cleaner production tools and failure modes and effects analysis in pig slaughterhouses

    OpenAIRE

    Fonseca, J. M.; A. P. Peres

    2017-01-01

    Cleaner production programs (CP) and Failure Modes and Effects Analysis (FMEA) are tools used to improve the sustainability of industries, ensuring greater profitability, quality, reliability and safety of their products and services. The meat industry is among the most polluting industries because of the large amounts of organic waste produced during meat processing. The objective of this study was to combine the CP and FMEA tools and to apply them in a pig slaughterhouse in order to detect ...

  13. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Directory of Open Access Journals (Sweden)

    Boris Vassilev

    Full Text Available A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to presenting computer programs to human readers. The code is rearranged to follow the logic of the program and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  14. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to presenting computer programs to human readers. The code is rearranged to follow the logic of the program and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
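
    The tangle step at the heart of literate programming, extracting executable code from a human-ordered source, can be sketched briefly. This is a generic illustration with invented chunk delimiters, not Lir's actual file format.

        # Sketch: "tangle" step of literate programming -- pull code chunks out of a
        # literate source. The chunk delimiters used here are invented for illustration.
        import re

        literate_source = """
        First we load the data.
        <<code>>
        data = [1, 2, 3]
        <<end>>
        Then we report its sum.
        <<code>>
        print(sum(data))
        <<end>>
        """

        chunks = re.findall(r"<<code>>\n(.*?)<<end>>", literate_source, re.DOTALL)
        program = "\n".join(line.strip() for chunk in chunks for line in chunk.splitlines())
        exec(program)                  # runs the extracted code: prints 6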

  15. Program Design Report of the CNC Machine Tool(V-1)

    Energy Technology Data Exchange (ETDEWEB)

    Youm, Ki Un; Moon, J. S.; Lee, I. B.; Youn, J. H.

    2010-08-15

    The application of CNC machine tools is expanding widely with the growing variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. In order to carry out the project of manufacture and maintenance of laboratory equipment, production design and machining technology are continually developed; the application of CNC machine tools is especially important for improving productivity and quality and for addressing manpower shortages. We publish this technical report, which includes CNC machine tool programs and drawings, as a contribution to the systematic development of CNC program design and machining technology.

  16. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were car......, and automata techniques....

  17. Small machine tools for small workpieces final report of the DFG priority program 1476

    CERN Document Server

    Sanders, Adam

    2017-01-01

    This contributed volume presents the research results of the program “Small machine tools for small work pieces” (SPP 1476), funded by the German Research Society (DFG). The book contains the final report of the priority program, presenting novel approaches for size-adapted, reconfigurable micro machine tools. The target audience primarily comprises research experts and practitioners in the field of micro machine tools, but the book may also be beneficial for graduate students.

  18. Central Africa Regional Program for the Environment Information Management Tool

    Data.gov (United States)

    US Agency for International Development — The CARPE Information Management Tool (CARPE IMT), available in both French and English, organizes information and reports from its partners for the 12 CARPE/CBFP...

  19. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  20. RNAmute: RNA secondary structure mutation analysis tool

    Directory of Open Access Journals (Sweden)

    Barash Danny

    2006-04-01

    Full Text Available Background: RNAmute is an interactive Java application that calculates the secondary structure of all single point mutations, given an RNA sequence, and organizes them into categories according to their similarity with respect to the predicted wild-type structure. The secondary structure predictions are performed using the Vienna RNA package. Several alternatives are used for the categorization of single point mutations: Vienna's RNAdistance based on dot-bracket representation, as well as tree edit distance and the second eigenvalue of the Laplacian matrix based on Shapiro's coarse-grain tree graph representation. Results: Selecting a category in each one of the processed tables lists all single point mutations belonging to that category. Selecting a mutation displays a graphical drawing of the single point mutation and the wild type, and includes basic information such as associated energies, representations and distances. RNAmute can be used successfully with very little previous experience and without choosing any parameter value besides the initial RNA sequence. The package runs under the Linux operating system. Conclusion: RNAmute is a user-friendly tool that can be used to predict single point mutations leading to conformational rearrangements in the secondary structure of RNAs. In several cases of substantial interest, notably in virology, a point mutation may lead to a loss of important functionality such as RNA virus replication or translation initiation because of a conformational rearrangement in the secondary structure.
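
    The underlying computation, folding every single-point mutant and comparing it with the wild type, can be sketched with the Vienna RNA package's Python bindings. This is a sketch of the approach rather than RNAmute's Java implementation, and it assumes the ViennaRNA Python module is available.

        # Sketch: fold all single-point mutants of a sequence and rank them by
        # base-pair distance from the wild-type structure (RNAmute-style scan).
        import RNA  # ViennaRNA Python bindings (assumed installed)

        wild_type = "GGGAAACGCUCCCGUUUCCC"          # toy sequence
        wt_struct, wt_mfe = RNA.fold(wild_type)

        results = []
        for i, base in enumerate(wild_type):
            for mut in "ACGU":
                if mut == base:
                    continue
                seq = wild_type[:i] + mut + wild_type[i + 1:]
                struct, mfe = RNA.fold(seq)
                results.append((RNA.bp_distance(wt_struct, struct), f"{base}{i+1}{mut}"))

        results.sort(reverse=True)                   # largest rearrangements first
        print(results[:5])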

  1. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...... paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient...... model checking, model-checking tools, symbolic model checking, visual tools, and verification of critical systems....

  2. Spreadsheet as a tool of engineering analysis

    Energy Technology Data Exchange (ETDEWEB)

    Becker, M.

    1985-11-01

    In engineering analysis, problems tend to be categorized into those that can be done by hand and those that require the computer for solution. The advent of personal computers, and in particular, the advent of spreadsheet software, blurs this distinction, creating an intermediate category of problems appropriate for use with interactive personal computing.

  3. A Family of Tools for Supporting the Learning of Programming

    Directory of Open Access Journals (Sweden)

    Guido Rößling

    2010-04-01

    Full Text Available Both learning how to program and understanding algorithms or data structures are often difficult. This paper presents three complementary approaches that we employ to help our students in learning to program, especially during the first term of their study. We use a web-based programming task database as an easy and risk-free environment for taking the first steps in programming Java. The Animal algorithm visualization system is used to visualize the dynamic behavior of algorithms and data structures. We complement both approaches with tutorial videos on using the Eclipse IDE. We also report on the experiences with this combined approach.

  4. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  5. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios

    Science.gov (United States)

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  6. Analysis of Logic Programs Using Regular Tree Languages

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2012-01-01

    The field of finite tree automata provides fundamental notations and tools for reasoning about sets of terms called regular or recognizable tree languages. We consider two kinds of analysis using regular tree languages, applied to logic programs. The first approach is to try to discover automatically......

  7. Program Suite for Conceptual Designing of Parallel Mechanism-Based Robots and Machine Tools

    Directory of Open Access Journals (Sweden)

    Slobodan Tabaković

    2013-06-01

    Full Text Available In the development of robots and machine tools, in addition to conventional, serial structures, parallel mechanism-based kinematic structures have been used for some time. Aside from a number of advantages, the irregular shape and relatively small dimensions of the workspace formed by parallel mechanisms rank among the major weaknesses of their application. Accordingly, this fact has to be taken into consideration in the process of designing parallel mechanism-based robots or machine tools. This paper describes the categorization of criteria for the conceptual design of parallel mechanism-based robots or machine tools, resulting from workspace analysis, as well as the procedure for defining them. Furthermore, it also presents the design methodology that was implemented in the program for the creation of a robot or machine tool space model and the optimization of the resulting solution. For verification of the criteria and the program suite, three common (conceptually different) mechanisms with a similar mechanical structure and kinematic characteristics were used.

  8. Using the Program Sustainability Assessment Tool to Assess and Plan for Sustainability

    Science.gov (United States)

    Mainor, Avia; Moreland-Russell, Sarah; Maier, Ryan C.; Brossart, Laura; Luke, Douglas A.

    2014-01-01

    Implementing and growing a public health program that benefits society takes considerable time and effort. To ensure that positive outcomes are maintained over time, program managers and stakeholders should plan and implement activities to build sustainability capacity within their programs. We describe a 3-part sustainability planning process that programs can follow to build their sustainability capacity. First, program staff and stakeholders take the Program Sustainability Assessment Tool to measure their program’s sustainability across 8 domains. Next, managers and stakeholders use results from the assessment to inform and prioritize sustainability action planning. Lastly, staff members implement the plan and keep track of progress toward their sustainability goals. Through this process, staff can more holistically address the internal and external challenges and pressures associated with sustaining a program. We include a case example of a chronic disease program that completed the Program Sustainability Assessment Tool and engaged in program sustainability planning. PMID:24456644

  9. ROOT CAUSE ANALYSIS PROGRAM MANUAL

    Energy Technology Data Exchange (ETDEWEB)

    Gravois, Melanie C.

    2007-05-02

    Root Cause Analysis (RCA) identifies the cause of an adverse condition that, if corrected, will preclude recurrence or greatly reduce the probability of recurrence of the same or similar adverse conditions and thereby protect the health and safety of the public, the workers, and the environment. This procedure sets forth the requirements for management determination and the selection of RCA methods and implementation of RCAs that are a result of significant findings from Price-Anderson Amendments Act (PAAA) violations, occurrences/events, Significant Adverse Conditions, and external oversight Corrective Action Requests (CARs) generated by the Office of Enforcement (PAAA headquarters), the U.S. Environmental Protection Agency, and other oversight entities against Lawrence Berkeley National Laboratory (LBNL). Performance of an RCA may result in the identification of issues that should be reported in accordance with the Issues Management Program Manual.

  10. A measuring tool for tree-rings analysis

    Science.gov (United States)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gains, the possibility of archiving the results of the measurements at any stage of the processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including those that are difficult to process because of a complex wood structure (inhomogeneous growth in different directions; missing, light and false rings; etc.). This software can analyze pictures made with optical scanners and analog or digital cameras. The software was written in C++ and is compatible with modern Windows-family operating systems. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring widths as a function of the year is displayed on screen during the analysis, and it can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The complex is universal in application, which will allow its use for a variety of problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed on the basis of samples collected on the Kola Peninsula (northwestern Russia).
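
    Numerical cross-dating of the kind mentioned above typically amounts to sliding a sample's ring-width series along a master chronology and finding the offset with the highest correlation. A minimal NumPy sketch with synthetic data:

        # Sketch: numerical cross-dating -- find the offset at which a ring-width
        # sample correlates best with a master chronology. Data are synthetic.
        import numpy as np

        rng = np.random.default_rng(1)
        master = rng.normal(1.0, 0.3, 200)                   # 200-year master chronology
        sample = master[120:170] + rng.normal(0, 0.05, 50)   # sample from years 120-169

        best = max(
            (np.corrcoef(sample, master[k:k + sample.size])[0, 1], k)
            for k in range(master.size - sample.size + 1)
        )
        print(f"best r = {best[0]:.3f} at offset year {best[1]}")   # expect ~120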

  11. A new Bayesian Earthquake Analysis Tool (BEAT)

    Science.gov (United States)

    Vasyura-Bathke, Hannes; Dutta, Rishabh; Jónsson, Sigurjón; Mai, Martin

    2017-04-01

    Modern earthquake source estimation studies increasingly use non-linear optimization strategies to estimate kinematic rupture parameters, often considering geodetic and seismic data jointly. However, the optimization process is complex and consists of several steps that need to be followed in the earthquake parameter estimation procedure. These include pre-describing or modeling the fault geometry, calculating the Green's functions (often assuming a layered elastic half-space), and estimating the distributed final slip and possibly other kinematic source parameters. Recently, Bayesian inference has become popular for estimating posterior distributions of earthquake source model parameters given measured/estimated/assumed data and model uncertainties. For instance, some research groups consider uncertainties of the layered medium and propagate these to the source parameter uncertainties. Other groups make use of informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed that efficiently explore the often high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational demands of these methods are high and estimation codes are rarely distributed along with the published results. Even if codes are made available, it is often difficult to assemble them into a single optimization framework, as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in earthquake source estimations, we undertook the effort of producing BEAT, a Python package that comprises all the above-mentioned features in one ...
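
    At its core, the Bayesian estimation BEAT implements draws samples from a posterior over source parameters. The toy Metropolis sampler below shows the principle for a single parameter; it is illustrative only and does not reflect BEAT's samplers or API.

        # Sketch: toy Metropolis sampling of one "source parameter" given noisy data.
        # Purely illustrative of Bayesian posterior sampling, not BEAT's algorithms.
        import numpy as np

        rng = np.random.default_rng(0)
        data = rng.normal(3.0, 0.5, 20)                 # observations of a parameter ~3

        def log_post(theta):                            # flat prior + Gaussian likelihood
            return -0.5 * np.sum((data - theta) ** 2) / 0.5**2

        theta, samples = 0.0, []
        for _ in range(5000):
            prop = theta + rng.normal(0, 0.3)           # random-walk proposal
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            samples.append(theta)

        post = np.array(samples[1000:])                 # discard burn-in
        print(f"posterior mean {post.mean():.2f} +/- {post.std():.2f}")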

  12. New analysis tools for observational studies.

    Science.gov (United States)

    Landewé, R B M

    2015-03-01

    Observational studies, which are very common in rheumatology, usually follow a selected group of patients for a predetermined period of time, or indefinitely, with regard to a certain outcome. Such an outcome could be a "score" reflecting an important aspect of the disease (e.g., a disease activity score), or an "event" (e.g., myocardial infarction). Rather than investigating the efficacy of a particular treatment, observational studies serve to investigate clinical associations between different (outcome) variables. Confounding, which may spuriously inflate or reduce the magnitude of a particular association, is an inherent risk in observational studies. The modern analytical approach to an observational study depends on the study question, the study design, and on how the outcome of interest has been assessed. The current article discusses several aspects of the analytical approach and the requirements of the database. The focus is on longitudinal analysis, subgroup analysis, and adjustment for confounding. It is concluded that the appropriate analysis of an observational study should be a close collaboration between the clinical researcher with sufficient epidemiological knowledge and the expert statistician with sufficient interest in clinical questions.
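
    Adjustment for confounding in such a study is often a matter of adding the confounder to the regression model. A minimal statsmodels sketch with simulated data and hypothetical variable names:

        # Sketch: crude vs. confounder-adjusted association in an observational dataset.
        # Column names and effect sizes are hypothetical.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 500
        age = rng.normal(55, 10, n)                         # confounder
        exposure = 0.05 * age + rng.normal(0, 1, n)         # associated with age
        outcome = 0.1 * age + rng.normal(0, 1, n)           # true exposure effect = 0
        df = pd.DataFrame({"age": age, "exposure": exposure, "outcome": outcome})

        crude = smf.ols("outcome ~ exposure", data=df).fit()
        adjusted = smf.ols("outcome ~ exposure + age", data=df).fit()
        print("crude effect:   ", round(crude.params["exposure"], 3))     # inflated
        print("adjusted effect:", round(adjusted.params["exposure"], 3))  # near 0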

  13. Using Concept Mapping as a Tool for Program Theory Development

    Science.gov (United States)

    Orsi, Rebecca

    2011-01-01

    The purpose of this methodological study is to explore how well a process called "concept mapping" (Trochim, 1989) can articulate the theory which underlies a social program. Articulation of a program's theory is a key step in completing a sound theory-based evaluation (Weiss, 1997a). In this study, concept mapping is used to…

  14. Programming Embedded Systems With C and GNU Development Tools

    CERN Document Server

    Barr, Michael

    2009-01-01

    Whether you're writing your first embedded program, designing the latest generation of hand-held whatchamacallits, or managing the people who do, this book is for you. Programming Embedded Systems will help you develop the knowledge and skills you need to achieve proficiency with embedded software.

  15. [Emotional Intelligence Index: a tool for the routine assessment of mental health promotion programs in schools].

    Science.gov (United States)

    Veltro, Franco; Ialenti, Valentina; Morales García, Manuel Alejandro; Gigantesco, Antonella

    2016-01-01

    After a critical examination of several aspects of the evaluation of dimensions of emotional intelligence through self-assessment tools, we describe the procedure for the construction and validation of an Index for its measurement, conceived specifically for the routine, outcome-oriented assessment of mental health promotion programs in schools that include the improvement of emotional intelligence among their objectives. On the basis of the two most common international tools, 27 items plus 6 control items were drafted and discussed in two focus groups (FGs) of students (face validity). The scale obtained from the FGs was administered to 300 students, and the results were submitted to factor analysis (construct validity). Internal consistency was evaluated with Cronbach's alpha, and concurrent validity was studied against the Emotional Quotient Inventory, a perceived self-efficacy scale, and a stress rating test. From the FG analysis, all of the original items were modified, 4 were deleted, and the coding system was reduced from 6 to 4 Likert levels. From the 23 items included in the analysis, five factors emerged (intra-psychic dimension, interpersonal dimension, impulsivity, adaptive coping, sense of self-efficacy), for a total of 15 items. The results of the validation process were very satisfactory for internal consistency (0.72) and concurrent validity. The result is the shortest routine assessment tool currently available in Italy, a true Index whose compilation requires on average 3 minutes. We emphasize that it is an Index, and not a questionnaire or interview for clinical use, intended specifically for mental health promotion programs in schools.

  16. Discourse Analysis: A Tool for Helping Educators to Teach Science

    Directory of Open Access Journals (Sweden)

    Katerina Plakitsi

    2016-11-01

    Full Text Available This article refers to a part of a collaborative action research project in three elementary science classrooms. The project aims at the transformation of the nature and type of teachers' discursive practices into more collaborative inquiries. The basic strategy is to give the teachers the opportunity to analyze their discourse using a three-dimensional context of analysis. The teachers analyzed their discursive repertoires when teaching science. They studied the companion meaning, i.e., the different layers of explicit and tacit messages they communicate about Nature of Science (NoS, Nature of Teaching (NoT, and Nature of Language (NoL. The question investigated is the following: Could an action research program, which involves teachers in the analysis of their own discursive practices, lead to the transformation of discourse modes that take place in the science classrooms to better communicate aspects of NoS, NoT and NoL in a collaborative, inquiry-based context? Results indicate that the teachers' involvement in their discourse analysis led to a transformation in the discursive repertoires in their science classrooms. Gradually, the teachers' companion meanings that were created, implicitly/explicitly, from the dialogues taking place during science lessons were more appropriate for the establishment of a productive collaborative inquiry learning context. We argue that discourse analysis could be used for research purposes, as a training medium or as a reflective tool on how teachers communicate science. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs170168

  17. Expanded Capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Bush, Brian; Melaina, Marc; Penev, Michael

    2016-06-08

    This presentation describes how NREL expanded the capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST) in FY16. It was presented at the U.S. Department of Energy Hydrogen and Fuel Cells Program 2016 Annual Merit Review and Peer Evaluation Meeting on June 8, 2016, in Washington, D.C.

  18. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  19. Analysis of Online Quizzes as a Teaching and Assessment Tool

    Science.gov (United States)

    Salas-Morera, Lorenzo; Arauzo-Azofra, Antonio; García-Hernández, Laura

    2012-01-01

    This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the quizzes' effectiveness on student performance when used, not only as an…

  20. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, judo competition is an intriguing, complex nonlinear system with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed beforehand, and it is also the culmination from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team), like judo, for usefully supporting the difficult task of the coach, or better, of national or Olympic coaching staffs. This paper presents a short summary of the most important methodological achievements in judo match analysis. It also presents, in light of the latest technological improvements, the first systematization toward new fiel...

  1. The analysis of the functionality of modern systems, methods and scheduling tools

    Directory of Open Access Journals (Sweden)

    Abramov Ivan

    2016-01-01

    Full Text Available Calendar planning is a key tool for efficient management applied in many industries: power, oil & gas, metallurgy, and construction. As a result of the growing complexity of projects and the arising need to improve their efficiency, a large number of software tools for high-quality calendar planning have appeared. Construction companies are facing the challenge of the optimum selection of such tools (programs) for the distribution of limited resources over time. The article provides an analysis of the main software packages and their capabilities enabling improvement of project implementation efficiency.

  2. Study Abroad Programs as Tools of Internationalization: Which Factors Influence Hungarian Business Students to Participate?

    Science.gov (United States)

    Huják, Janka

    2015-01-01

    The internationalization of higher education has been on the agenda for decades now all over the world. Study abroad programs are undoubtedly tools of the internationalization endeavors. The ERASMUS Student Mobility Program is one of the flagships of the European Union's educational exchange programs implicitly aiming for the internationalization…

  3. Surface Grinder Operator. Instructor's Guide. Part of Single-Tool Skills Program. Machine Industries Occupations.

    Science.gov (United States)

    New York State Education Dept., Albany. Bureau of Secondary Curriculum Development.

    This course, the second one to be published in what is expected to be a series of instructor's guides in the Single-Tool Skills Program, is expected to help meet the need for trained operators in metalworking and is designed for use in the adult education programs of school districts, in Manpower Development and Training Programs, and in secondary…

  4. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
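
    At their simplest, the generally-accepted-accounting computations behind a tool like H2FAST reduce to discounted cash flow. The sketch below shows a plain net-present-value calculation with placeholder numbers, not H2FAST's actual models:

        # Sketch: net present value of a hydrogen-station-like cash flow stream.
        # All numbers are placeholders, not H2FAST model outputs.
        capital = 1_500_000.0                    # year-0 investment, $
        annual_net_revenue = 220_000.0           # $/year, years 1..20
        discount_rate = 0.08

        npv = -capital + sum(
            annual_net_revenue / (1 + discount_rate) ** year
            for year in range(1, 21)
        )
        print(f"NPV over 20 years at {discount_rate:.0%}: ${npv:,.0f}")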

  5. HANFORDS PUBLIC TOUR PROGRAM - AN EXCELLENT EDUCATIONAL TOOL

    Energy Technology Data Exchange (ETDEWEB)

    SINCLAIR KM

    2010-12-07

    Prior to 2001, the Department of Energy (DOE) sponsored limited tours of the Hanford Site for the public, but discontinued the program after the 9/11 terrorist attacks on the U.S. In 2003, DOE's Richland Operations Office (DOE-RL) requested the site's prime contractor to reinstate the public tour program starting in 2004 under strict controls and security requirements. The planning involved a collaborative effort among the security, safety and communications departments of DOE-RL and the site's contracting companies. This paper describes the evolution of, and enhancements to, Hanford's public tours, including the addition of a separate tour program for the B Reactor, the first full-scale nuclear reactor in the world. Topics included in the discussion include the history and growth of the tour program, associated costs, and visitor surveys and assessments.

  6. What linear programming contributes: world food programme experience with the "cost of the diet" tool.

    Science.gov (United States)

    Frega, Romeo; Lanfranco, Jose Guerra; De Greve, Sam; Bernardini, Sara; Geniez, Perrine; Grede, Nils; Bloem, Martin; de Pee, Saskia

    2012-09-01

    Linear programming has been used for analyzing children's complementary feeding diets, for optimizing nutrient adequacy of dietary recommendations for a population, and for estimating the economic value of fortified foods. To describe and apply a linear programming tool ("Cost of the Diet") with data from Mozambique to determine what could be cost-effective fortification strategies. Based on locally assessed average household dietary needs, seasonal market prices of available food products, and food composition data, the tool estimates the lowest-cost diet that meets almost all nutrient needs. The results were compared with expenditure data from Mozambique to establish the affordability of this diet by quintiles of the population. Three different applications were illustrated: identifying likely "limiting nutrients," comparing cost effectiveness of different fortification interventions at the household level, and assessing economic access to nutritious foods. The analysis identified iron, vitamin B2, and pantothenic acid as "limiting nutrients." Under the Mozambique conditions, vegetable oil was estimated as a more cost-efficient vehicle for vitamin A fortification than sugar; maize flour may also be an effective vehicle to provide other constraining micronutrients. Multiple micronutrient fortification of maize flour could reduce the cost of the "lowest-cost nutritious diet" by 18%, but even this diet can be afforded by only 20% of the Mozambican population. Within the context of fortification, linear programming can be a useful tool for identifying likely nutrient inadequacies, for comparing fortification options in terms of cost effectiveness, and for illustrating the potential benefit of fortification for improving household access to a nutritious diet.
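
    The underlying optimization is the classic diet problem. A minimal SciPy sketch of the "lowest-cost diet" computation follows, with invented foods, prices, and requirements; the real tool uses locally assessed household needs and market prices:

        # Sketch: lowest-cost diet meeting nutrient requirements (toy diet problem).
        # Foods, prices, and requirements are invented for illustration.
        from scipy.optimize import linprog

        # columns: maize flour, beans, vegetable oil (per 100 g)
        cost = [0.04, 0.10, 0.15]                       # $ per 100 g
        energy = [360, 340, 880]                        # kcal per 100 g
        iron = [2.5, 6.0, 0.0]                          # mg per 100 g

        # minimize cost s.t. energy >= 2100 kcal, iron >= 15 mg (A_ub x <= b_ub form)
        res = linprog(c=cost,
                      A_ub=[[-e for e in energy], [-i for i in iron]],
                      b_ub=[-2100, -15],
                      bounds=[(0, None)] * 3)
        print("100-g portions per day:", res.x.round(2), "cost: $%.2f" % res.fun)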

  7. Roles of Variables and Program Analysis

    OpenAIRE

    Bishop, Craig; Johnson, Colin G.

    2005-01-01

    The idea of roles of variables is to provide a vocabulary for describing the way in which variables are used by experienced programmers. This paper presents work on a system that is designed to automatically check students' role assignments in simple procedural programming. This is achieved by applying program analysis techniques, in particular program slicing and data flow analysis, to programs that students have written and annotated with role assignments.

  8. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the General Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad hoc constellations, and its cost model represents an aggregate model consisting of

  9. Universal Tool Grinder Operator Instructor's Guide. Part of Single-Tool Skills Program Machine Industries Occupations.

    Science.gov (United States)

    New York State Education Dept., Albany. Div. of Curriculum Development.

    The document is an instructor's guide for a course on universal tool grinder operation. The course is designed to train people in making complicated machine setups and precision in the grinding operations and, although intended primarily for adult learners, it can be adapted for high school use. The guide is divided into three parts: (1) the…

  10. Design on an enhanced interactive satellite communications system analysis program

    Science.gov (United States)

    Andersen, Kevin Robert

    1991-09-01

    This thesis describes the design of a user-friendly interactive satellite communications analysis program for use on a personal computer. The user inputs the various parameters of a satellite orbit, ground station location and communications equipment. The output generated allows a user to view the satellite ground trace and footprint, calculate satellite rise and set times, and analyze the performance of the communications link. The link analysis allows the user to input various signal losses and jamming interference. Care was taken to ensure that the program is simple to operate and that it provides on-line help for each segment. A principal goal of this thesis effort is to provide an educational tool that familiarizes the user with the communications segment of a space system. The initial success of the program based upon student response validates the use of object-oriented-like software tools that enhance user understanding of complex subjects.
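
    The link analysis described rests on a standard budget of gains and losses; a minimal sketch of such a calculation, using the common free-space path loss formula and illustrative parameter values, is:

        import math

        def fspl_db(distance_km: float, freq_mhz: float) -> float:
            # Free-space path loss in dB (distance in km, frequency in MHz).
            return (32.45 + 20 * math.log10(distance_km)
                          + 20 * math.log10(freq_mhz))

        def received_power_dbw(p_tx_dbw, g_tx_dbi, g_rx_dbi,
                               distance_km, freq_mhz, extra_losses_db=0.0):
            # Basic link budget: power plus antenna gains minus all losses.
            return (p_tx_dbw + g_tx_dbi + g_rx_dbi
                    - fspl_db(distance_km, freq_mhz) - extra_losses_db)

        # e.g. a GEO downlink: 36000 km, 12 GHz, 3 dB of miscellaneous losses
        print(received_power_dbw(20, 30, 40, 36000, 12000, 3))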

  11. A computer aided tolerancing tool, II: tolerance analysis

    NARCIS (Netherlands)

    Salomons, O.W.; Haalboom, F.J.; Jonge Poerink, H.J.; van Slooten, F.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1996-01-01

    A computer aided tolerance analysis tool is presented that assists the designer in evaluating worst case quality of assembly after tolerances have been specified. In tolerance analysis calculations, sets of equations are generated. The number of equations can be restricted by using a minimum number
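
    As background on the kind of calculation such a tool automates, the sketch below contrasts a worst-case stack-up with a statistical root-sum-square stack-up for a linearized assembly equation; the sensitivities and tolerances are illustrative, not taken from the paper.

        import math

        def stack_up(sensitivities, tolerances):
            # Worst case: every contribution at its extreme, same direction.
            worst_case = sum(abs(s) * t
                             for s, t in zip(sensitivities, tolerances))
            # Statistical (RSS): tolerances combined as independent variations.
            rss = math.sqrt(sum((s * t) ** 2
                                for s, t in zip(sensitivities, tolerances)))
            return worst_case, rss

        print(stack_up([1.0, -1.0, 0.5], [0.05, 0.02, 0.10]))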

  12. Gender analysis of use of participatory tools among extension ...

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396). Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...

  13. Designed tools for analysis of lithography patterns and nanostructures

    Science.gov (United States)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time consuming, requiring manual tuning, and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating access to knowledge and hence speed up implementation in product lines.

  14. UDAT: A multi-purpose data analysis tool

    Science.gov (United States)

    Shamir, Lior

    2017-04-01

    UDAT is a pattern recognition tool for mass analysis of various types of data, including image and audio. Based on its WND-CHARM (ascl:1312.002) prototype, UDAT computes a large set of numerical content descriptors from each file it analyzes, and selects the most informative features using statistical analysis. The tool can perform automatic classification of galaxy images by training with annotated galaxy images. It also has unsupervised learning capabilities, such as query-by-example of galaxies based on morphology. That is, given an input galaxy image of interest, the tool can search through a large database of images to retrieve the galaxies that are the most similar to the query image. The downside of the tool is its computational complexity, which in most cases will require a small or medium cluster.
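
    The query-by-example idea can be sketched in a few lines (z-scored feature vectors ranked by Euclidean distance); this is a minimal stand-in, not UDAT's actual implementation.

        import numpy as np

        def query_by_example(query_vec, database_vecs):
            # Standardize features over the database, then rank by distance.
            mu = database_vecs.mean(axis=0)
            sigma = database_vecs.std(axis=0) + 1e-12
            db = (database_vecs - mu) / sigma
            q = (query_vec - mu) / sigma
            dist = np.linalg.norm(db - q, axis=1)
            return np.argsort(dist)  # indices of the most similar first

        rng = np.random.default_rng(0)
        gallery = rng.normal(size=(100, 16))  # 100 objects, 16 descriptors
        print(query_by_example(gallery[7], gallery)[:5])  # object 7 ranks first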

  15. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  16. Utility green pricing programs: A statistical analysis of program effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    Wiser, Ryan; Olson, Scott; Bird, Lori; Swezey, Blair

    2004-02-01

    Utility green pricing programs allow customers to support the development of renewable energy. Such programs have grown in number in recent years. The design features and effectiveness of these programs vary considerably, however, leading a variety of stakeholders to suggest specific marketing and program design features that might improve customer response and renewable energy sales. This report analyzes actual utility green pricing program data to provide further insight into which program features might help maximize both customer participation in green pricing programs and the amount of renewable energy purchased by customers in those programs. Statistical analysis is performed on both the residential and non-residential customer segments. Data comes from information gathered through a questionnaire completed for 66 utility green pricing programs in early 2003. The questionnaire specifically gathered data on residential and non-residential participation, amount of renewable energy sold, program length, the type of renewable supply used, program price/cost premiums, types of consumer research and program evaluation performed, different sign-up options available, program marketing efforts, and ancillary benefits offered to participants.

  17. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work — not only with textual representations of Petri nets — but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area...

  18. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  20. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  1. Construction and validation of a surgical skills assessment tool for general surgery residency program

    Directory of Open Access Journals (Sweden)

    Elizabeth Gomes dos Santos

    Objective: To develop and validate an instrument for measuring the acquisition of technical skills in conducting operations of increasing difficulty, for use in General Surgery Residency (GSR) programs. Methods: We built a surgical skills assessment tool containing 11 operations in increasing levels of difficulty. For instrument validation we used the face validity method. Through an electronic survey tool (SurveyMonkey®) we sent a questionnaire to Full and Emeritus members of the Brazilian College of Surgeons (CBC), all bearers of the CBC Specialist Title. Results: Of the 307 questionnaires sent we received 100 responses. For the analysis of the data collected we used Cronbach's alpha test. We observed that, in general, the overall alpha presented values near or greater than 0.70, indicating good consistency for assessing the points of interest. Conclusion: The evaluation instrument built was validated and can be used as a method of assessment of technical skill acquisition in General Surgery Residency programs in Brazil.
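
    Cronbach's alpha, used above as the consistency measure, can be computed directly from a respondents-by-items score matrix; a minimal sketch with made-up scores:

        import numpy as np

        def cronbach_alpha(scores: np.ndarray) -> float:
            # scores: rows = respondents, columns = items
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        scores = np.array([[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]])
        print(cronbach_alpha(scores))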

  2. Validation of the Malnutrition Screening Tool for use in a Community Rehabilitation Program.

    Science.gov (United States)

    Leipold, Claire E; Bertino, Shaylyn B; L'Huillier, Heather M; Howell, Paula M; Rosenkotter, Michelina

    2018-02-01

    The aim of the present study was to determine if the Malnutrition Screening Tool (MST) is valid for use within the Community Rehabilitation Program (CRP) setting. Secondary outcome measures were to assess malnutrition prevalence in the CRP population and to determine trends between malnutrition and age, body mass index (BMI) and falls history. This study used a cross-sectional design. All clients admitted to a Melbourne metropolitan CRP during the study period had the MST completed at intake. A total of 160 participants were then selected at random and a Subjective Global Assessment (SGA) was completed by an experienced dietitian. Participants were classified as well nourished or malnourished, and this result was compared to their MST score. Data analysis was completed to determine the predictive value of the MST compared to SGA, which was expressed using sensitivity, specificity, and positive and negative predictive values. Of the 160 participants, 34.0% were identified as malnourished. The MST achieved a sensitivity of 72.2% and a specificity of 83.8%, with a positive predictive value of 69.6% and a negative predictive value of 85.4%, compared to the SGA. Participants in the malnourished group were older and had a lower BMI (P < 0.05). The MST is a valid screening tool for use in this population and has a relatively low burden to complete. Consequently, the MST could be included in the client initial needs identification to be completed when admitted to the program. © 2017 Dietitians Association of Australia.
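
    The reported predictive values follow from the standard confusion-matrix definitions; in the sketch below the counts are chosen to approximate the rates reported above (about 34% prevalence among 160 participants), not taken from the paper.

        def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
            # Binary screening test vs. a gold standard (here, MST vs. SGA).
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),  # positive predictive value
                "npv": tn / (tn + fn),  # negative predictive value
            }

        print(screening_metrics(tp=39, fp=17, fn=15, tn=89))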

  3. Social Networking Tools to Facilitate Cross-Program Collaboration

    Science.gov (United States)

    Wallace, Paul; Howard, Barbara

    2010-01-01

    Students working on a highly collaborative project used social networking technology for community building activities as well as basic project-related communication. Requiring students to work on cross-program projects gives them real-world experience working in diverse, geographically dispersed groups. An application used at Appalachian State…

  4. Development of a peer teaching-assessment program and a peer observation and evaluation tool.

    Science.gov (United States)

    Trujillo, Jennifer M; DiVall, Margarita V; Barr, Judith; Gonyeau, Michael; Van Amburgh, Jenny A; Matthews, S James; Qualters, Donna

    2008-12-15

    To develop a formalized, comprehensive, peer-driven teaching assessment program and a valid and reliable assessment tool. A volunteer taskforce was formed and a peer-assessment program was developed using a multistep, sequential approach and the Peer Observation and Evaluation Tool (POET). A pilot study was conducted to evaluate the efficiency and practicality of the process and to establish interrater reliability of the tool. Intra-class correlation coefficients (ICC) were calculated. ICCs for 8 separate lectures evaluated by 2-3 observers ranged from 0.66 to 0.97, indicating good interrater reliability of the tool. Our peer assessment program for large classroom teaching, which includes a valid and reliable evaluation tool, is comprehensive, feasible, and can be adopted by other schools of pharmacy.
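
    For reference, one common interrater measure, ICC(2,1) (two-way random effects, absolute agreement, single rater), can be computed from a targets-by-raters matrix as below; the abstract does not state which ICC variant was used, so this is a sketch only.

        import numpy as np

        def icc_2_1(x: np.ndarray) -> float:
            # x: rows = rated targets (lectures), columns = raters (observers)
            n, k = x.shape
            grand = x.mean()
            ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            resid = (x - x.mean(axis=1, keepdims=True)
                       - x.mean(axis=0, keepdims=True) + grand)
            ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return ((ms_r - ms_e)
                    / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n))

        ratings = np.array([[4.0, 4.5, 4.0], [3.0, 3.5, 3.0], [5.0, 4.5, 5.0]])
        print(icc_2_1(ratings))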

  5. A Rapid Assessment Tool for affirming good practice in midwifery education programming.

    Science.gov (United States)

    Fullerton, Judith T; Johnson, Peter; Lobe, Erika; Myint, Khine Haymar; Aung, Nan Nan; Moe, Thida; Linn, Nay Aung

    2016-03-01

    to design a criterion-referenced assessment tool that could be used globally in a rapid assessment of good practices and bottlenecks in midwifery education programs. A standard tool development process was followed to generate standards and reference criteria, followed by external review and field testing to document psychometric properties. Review of the standards and scoring criteria was conducted by stakeholders around the globe. Field testing of the tool was conducted in Myanmar; eleven of Myanmar's 22 midwifery education programs participated in the assessment. The clinimetric tool was demonstrated to have content validity and high inter-rater reliability in use. A globally validated tool, and an accompanying user guide and handbook, are now available for conducting rapid assessments of compliance with good practice criteria in midwifery education programming. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Using the Spec# Language, Methodology, and Tools to Write Bug-Free Programs

    Science.gov (United States)

    Leino, K. Rustan M.; Müller, Peter

    Spec# is a programming system for the development of correct programs. It consists of a programming language, a verification methodology, and tools. The Spec# language extends C# with contracts, which allow programmers to document their design decisions in the code. The verification methodology provides rules and guidelines for how to use the Spec# features to express and check properties of interesting implementations. Finally, the tool support consists of a compiler that emits run-time checks for many contracts and a static program verifier that attempts to prove automatically that an implementation satisfies its specification. These lecture notes teach the use of the Spec# system, focusing on specification and static verification.
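
    Spec# itself extends C#, but the flavor of its contracts can be approximated elsewhere; the rough Python analogue below implements run-time-checked pre- and postconditions only, i.e. the dynamic half of the Spec# story, with no static verification.

        from functools import wraps

        def contract(pre=None, post=None):
            # Attach optional pre/postcondition predicates to a function.
            def deco(fn):
                @wraps(fn)
                def wrapper(*args, **kwargs):
                    if pre is not None:
                        assert pre(*args, **kwargs), \
                            f"precondition of {fn.__name__} violated"
                    result = fn(*args, **kwargs)
                    if post is not None:
                        assert post(result), \
                            f"postcondition of {fn.__name__} violated"
                    return result
                return wrapper
            return deco

        @contract(pre=lambda xs: len(xs) > 0, post=lambda r: r >= 0)
        def count_positive(xs):
            return sum(1 for x in xs if x > 0)

        print(count_positive([1, -2, 3]))  # 2; count_positive([]) would fail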

  7. SDA-Based Diagnostic and Analysis Tools for Collider Run II

    CERN Document Server

    Papadimitriou, Vaia; Lebrun, Paul; Panacek, S; Slaughter, Anna Jean; Xiao, Aimin

    2005-01-01

    Operating and improving the understanding of the Fermilab Accelerator Complex for the colliding beam experiments requires advanced software methods and tools. The Shot Data Acquisition and Analysis (SDA) has been developed to fulfill this need. Data is stored in a relational database, and is served to programs and users via Web-based tools. Summary tables are systematically generated during and after a store. These tables, the Supertable, and the Recomputed Emittances and Recomputed Intensity tables are discussed here. This information is also accessible in JAS3 (Java Analysis Studio version 3).

  8. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis of the discovered feature-code relations through a number of analytical views.

  9. Requirements UML Tool (RUT) Expanded for Extreme Programming (CI02)

    Science.gov (United States)

    McCoy, James R.

    2003-01-01

    A procedure for capturing and managing system requirements that incorporates XP user stories. Because costs associated with identifying problems in requirements increase dramatically over the lifecycle of a project, a method for identifying sources of software risk in user stories is urgently needed. This initiative aims to determine a set of guidelines for user stories that will result in high-quality requirements. To further this initiative, a tool is needed to analyze user stories that can assess the quality of individual user stories, detect sources of software risk, produce software metrics, and identify areas in user stories that can be improved.

  10. Ethnomodeling as a Pedagogical Tool for the Ethnomathematics Program

    Directory of Open Access Journals (Sweden)

    Milton Rosa

    2010-08-01

    Mathematics used outside of the school may be considered as a process of ethnomodeling rather than a mere process of manipulation of numbers and procedures. The application of ethnomathematical techniques and the tools of modeling allow us to see a different reality and give us insight into mathematics done in a holistic way. In this perspective, the pedagogical approach that connects the cultural aspects of mathematics with its academic aspects is denominated ethnomodeling, which is a process of translation and elaboration of problems and questions taken from systems that are part of the students’ reality.

  11. MODELING OF ANIMATED SIMULATIONS BY MAXIMA PROGRAM TOOLS

    Directory of Open Access Journals (Sweden)

    Nataliya O. Bugayets

    2015-06-01

    The article deals with the methodical features of teaching computer simulation of systems and processes using animation. The importance of visibility of educational material, which combines the sensory and thinking sides of cognition, is noted. The concept of modeling and the process of building models are described. Attention is paid to the development of skills that are essential for effective learning of animated simulation with visual aids. The graphical environment tools of the computer mathematics system Maxima for animated simulation are described. Examples of the creation of animated visual aids and their use for the development of research skills are presented.

  12. MoDOT pavement preservation research program volume IV, pavement evaluation tools-data collection methods.

    Science.gov (United States)

    2015-10-01

    The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3 (Pavement Evaluation Tools - Data Collection Methods), was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to ...

  13. Family Advocacy Program Standards and Self-Assessment Tool

    Science.gov (United States)

    1992-08-01

    A facility designated for temporary, emergency housing for victims of abuse. Its use is normally limited to female victims of spouse abuse and her...sexual activity such as pornography or prostitution in which the offender does not have direct physical contact with the child. 3. Rape and Intercourse...Support Groups. 8. Parent and/or Teen Groups. Educationally-based programs are those whose intent is to convey information and awareness to the

  14. Using iPads as a Data Collection Tool in Extension Programming Evaluation

    Science.gov (United States)

    Rowntree, J. E.; Witman, R. R.; Lindquist, G. L.; Raven, M. R.

    2013-01-01

    Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested…

  15. Automated Program Analysis for Cybersecurity (APAC)

    Science.gov (United States)

    2016-07-14

    ...UCSB, and Utah) requested that they be allowed to pre-process the applications used in Experiment 3B. The automated analysis tools from these teams...virtual machine server where participating teams could upload their tools and start their automated analysis. Teams using pre-computation were instructed

  16. Constructing Social Networks From Secondary Storage With Bulk Analysis Tools

    Science.gov (United States)

    2016-06-01

    but we also have various amounts of file formats, encryption, and lack of training on programs that assist with analyzing data. Garfinkel’s paper...last issue addressed by Garfinkel is that there is not enough on-the-job training of digital forensic tools. This thesis has taken these issues into...wanted to provide an aesthetically pleasing and useful format for analysts. Furthermore, we wanted a method that was intuitive and consistent so analyst

  17. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the ways products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regard to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout the four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but the contributions are unbalanced and not holistic. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  18. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
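
    The core computation in such a program is the annual rate at which a ground-motion level is exceeded, summed over seismic sources; the heavily simplified sketch below uses a lognormal attenuation relation with made-up coefficients, purely for illustration.

        import math

        def exceedance_rate(a_target, sources):
            # sources: list of (annual event rate, magnitude, distance in km);
            # toy attenuation: ln A = c0 + c1*m - c2*ln(r), lognormal scatter.
            c0, c1, c2, sigma = -3.5, 1.0, 1.2, 0.6  # illustrative values
            total = 0.0
            for nu, m, r in sources:
                ln_med = c0 + c1 * m - c2 * math.log(r)
                z = (math.log(a_target) - ln_med) / sigma
                total += nu * 0.5 * math.erfc(z / math.sqrt(2))
            return total  # expected exceedances per year

        print(exceedance_rate(0.1, [(0.2, 6.5, 30.0), (0.05, 7.2, 80.0)]))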

  19. Software Tool for Real-Time Power Quality Analysis

    Directory of Open Access Journals (Sweden)

    CZIKER, A. C.

    2013-11-01

    A software tool dedicated to the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument, which uses innovative algorithms based on time- and frequency-domain analysis to process power signals. In order to detect temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Because a signal recovery algorithm is applied, the harmonic analysis can be performed even if voltage dips or swells appear. The virtual instrument accepts either recorded or online signals as input, the latter acquired through a data acquisition board. The virtual instrument was tested using both synthetic signals and real signals from measurements performed in distribution networks. The paper contains a numeric example based on a synthetic digital signal and an analysis made in real time.
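
    The dip/swell detection described can be sketched with a one-cycle sliding RMS and the common 90%-of-nominal threshold; the parameters below are illustrative, not the paper's algorithm.

        import numpy as np

        def detect_dips(v, fs, f0=50.0, threshold=0.9, vnom_rms=230.0):
            # One-cycle sliding RMS; a dip is flagged while RMS < 90% nominal.
            n = int(fs / f0)  # samples per fundamental cycle
            rms = np.sqrt(np.convolve(v ** 2, np.ones(n) / n, mode="valid"))
            return rms, rms < threshold * vnom_rms

        fs = 10_000
        t = np.arange(0, 0.2, 1 / fs)
        v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
        v[1000:1500] *= 0.7  # synthetic dip to 70% retained voltage
        rms, dip = detect_dips(v, fs)
        print(dip.any(), rms.min())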

  20. Proteomic Tools for the Analysis of Cytoskeleton Proteins.

    Science.gov (United States)

    Scarpati, Michael; Heavner, Mary Ellen; Wiech, Eliza; Singh, Shaneen

    2016-01-01

    Proteomic analyses have become an essential part of the toolkit of the molecular biologist, given the widespread availability of genomic data and open source or freely accessible bioinformatics software. Tools are available for detecting homologous sequences, recognizing functional domains, and modeling the three-dimensional structure for any given protein sequence. Although a wealth of structural and functional information is available for a large number of cytoskeletal proteins, with representatives spanning all of the major subfamilies, the majority of cytoskeletal proteins remain partially or totally uncharacterized. Moreover, bioinformatics tools provide a means for studying the effects of synthetic mutations or naturally occurring variants of these cytoskeletal proteins. This chapter discusses various freely available proteomic analysis tools, with a focus on in silico prediction of protein structure and function. The selected tools are notable for providing an easily accessible interface for the novice, while retaining advanced functionality for more experienced computational biologists.

  1. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. Such tools provide the possibility for the designer to work with the aesthetics as well as the technical aspects of architectural design. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered...

  2. Tools for T-RFLP data analysis using Excel.

    Science.gov (United States)

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
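
    As a minimal illustration of two of those steps, noise thresholding followed by total-sum normalization can be expressed with pandas; the column layout and threshold below are assumptions for the sketch, not the template's internals.

        import pandas as pd

        def normalize_profiles(df: pd.DataFrame,
                               noise: float = 50.0) -> pd.DataFrame:
            # df: rows = T-RF fragment lengths, columns = samples,
            # values = peak heights (or areas).
            df = df.where(df >= noise, 0.0)  # apply noise baseline threshold
            return df.div(df.sum(axis=0), axis=1) * 100.0  # % of total signal

        profiles = pd.DataFrame({"sample1": [120.0, 40.0, 900.0],
                                 "sample2": [300.0, 80.0, 620.0]},
                                index=[68, 112, 245])  # T-RF lengths, bp
        print(normalize_profiles(profiles))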

  3. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...
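
    The goal can be illustrated for a finite input domain by brute-force enumeration (the paper's method is transformation-based, so this is only a statement of the problem, not of their technique):

        from collections import defaultdict

        def output_distribution(program, input_dist):
            # Push an input probability distribution through a pure program.
            out = defaultdict(float)
            for value, p in input_dist.items():
                out[program(value)] += p
            return dict(out)

        # e.g. x uniform on {0,1,2,3}, program computes x % 2
        print(output_distribution(lambda x: x % 2,
                                  {i: 0.25 for i in range(4)}))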

  4. Static Program Analysis for Reliable, Trusted Apps

    Science.gov (United States)

    2017-02-01

    ...devices. App stores also provide a tempting vector for an attacker. An attacker can take advantage of bugdoors (software defects that permit...

  5. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    Directory of Open Access Journals (Sweden)

    Sean Cameron Booth

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large amount of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software is discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther reaching biological conclusions than ever before.

  7. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference

  8. Determination of Satisfaction Index as a tool in evaluation of CME Program

    Directory of Open Access Journals (Sweden)

    Kuldeep Singh

    2014-03-01

    Continuing Medical Education is an indispensable part of a physician's learning. A well-designed program based on andragogy principles can enhance learning by motivating the learner and providing a platform to encourage self-directed learning. The present study aimed to explore the impact of the program "NAMS-AIIMS Regional Symposium on Sleep Medicine" in changing the behavior and attitude of participants, using the "Satisfaction Index" and a descriptive analysis of responses as evaluation tools for program effectiveness. This descriptive cross-sectional study captured the responses of participants through a pre-tested and validated questionnaire administered at the end of the symposium. The results showed an almost equal sex distribution (M:F = 27:34), with the majority being UG students (86%). Reliability analysis of the data showed a Cronbach's alpha of 0.98, indicating high reliability. The Satisfaction Index (SI), calculated as per the WHO Educational Handbook for Health Personnel, showed the highest satisfaction for the conducive environment of the symposium (87.87%), followed by the provision of time to seek clarifications (87.21%), the provision of appropriate learning resource material (85.90%) and the handling of critical comments by the organizers (85.57%). Descriptive analysis showed that the majority of responses were highly positive, with suggestions for more such activities, the inclusion of clinical cases and other aspects of practical relevance. Key words: evaluation, program, satisfaction index, Kirkpatrick Model, student satisfaction, adult learning, Knowles Theory.
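
    One common way to express a satisfaction index is the mean rating as a percentage of the scale maximum; whether the WHO handbook formula matches this exactly is not restated in the abstract, so treat the sketch as illustrative only.

        def satisfaction_index(ratings, scale_max=5):
            # Mean rating expressed as a percentage of the scale maximum.
            return 100.0 * sum(ratings) / (len(ratings) * scale_max)

        print(satisfaction_index([5, 4, 5, 4, 4]))  # 88.0 for these ratings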

  9. Development of a Preventive Maintenance Program for Tooling Used in Powder Slush Molding

    Energy Technology Data Exchange (ETDEWEB)

    Lara-Curzio, Edgar [ORNL]; Rios, Orlando [ORNL]; Marquez Rossy, Andres E [ORNL]

    2016-07-19

    ORNL collaborated with Faurecia Interior Systems to investigate the feasibility of developing a thermomagnetic preventive maintenance program for nickel tooling used in powder slush molding. It was found that thermal treatments at temperatures greater than 500°C can anneal strain hardening in nickel tooling and a range of temperatures and times for effective thermal annealing were identified. It was also observed that magnetic fields applied during thermal annealing do not alter the kinetics of strain hardening annealing. The results obtained in this investigation provide a foundation for establishing a preventive maintenance program for nickel tooling.

  10. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for public data releases and outreach

    CERN Document Server

    Banda, Tea; CERN. Geneva. EP Department

    2016-01-01

    The project consists in the initial development of ROOT notebooks for a Z boson analysis in the C++ programming language that will allow students and researchers to perform fast and very useful data analysis, using ATLAS public data and Monte Carlo simulations. Several tools are considered: the ROOT Data Analysis Framework, Jupyter Notebook technology and the CERN ROOT computing service SWAN.
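
    A notebook cell for such an analysis might look like the PyROOT sketch below; the file name and branch names are placeholders, not necessarily the ATLAS open-data schema.

        import ROOT

        f = ROOT.TFile.Open("opendata_sample.root")  # hypothetical file
        tree = f.Get("mini")                         # hypothetical tree name
        h = ROOT.TH1F("mll", "Dilepton mass;m_{ll} [GeV];Events", 60, 60, 120)

        v1, v2 = ROOT.TLorentzVector(), ROOT.TLorentzVector()
        for ev in tree:
            if ev.lep_n < 2:
                continue
            v1.SetPtEtaPhiE(ev.lep_pt[0], ev.lep_eta[0],
                            ev.lep_phi[0], ev.lep_E[0])
            v2.SetPtEtaPhiE(ev.lep_pt[1], ev.lep_eta[1],
                            ev.lep_phi[1], ev.lep_E[1])
            h.Fill((v1 + v2).M() / 1000.0)  # MeV to GeV, if stored in MeV

        c = ROOT.TCanvas()
        h.Draw()
        c.SaveAs("mll.png")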

  11. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    ...analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components... of a platform and as a basis for efficient code generation. In the third application, the Service Relation Model and the concept of consistency are used to guide an automated procedure for designing systems composed of components.

  12. Bring Your Own Device (BYOD) Programs in the Classroom: Teacher Use, Equity, and Learning Tools

    Science.gov (United States)

    Fincher, Derrel

    2016-01-01

    This study explores teacher perceptions of Bring Your Own Device (BYOD) programs in the classroom, with a focus on teacher use, student equity of access, and student ability to use their devices as learning tools. While one-to-one laptop programs (students assigned identical school-owned laptop or tablet) has an extensive body of literature behind…

  13. Program Evaluation: The Board Game--An Interactive Learning Tool for Evaluators

    Science.gov (United States)

    Febey, Karen; Coyne, Molly

    2007-01-01

    The field of program evaluation lacks interactive teaching tools. To address this pedagogical issue, the authors developed a collaborative learning technique called Program Evaluation: The Board Game. The authors present the game and its development in this practitioner-oriented article. The evaluation board game is an adaptable teaching tool…

  14. The Assessment of Afterschool Program Practices Tool (APT): Findings from the APT Validation Study

    Science.gov (United States)

    Tracy, Allison; Surr, Wendy; Richer, Amanda

    2012-01-01

    The Assessment of Afterschool Program Practices Tool ("APT"), developed by the National Institute of Out-of-School Time (NIOST), is an observational instrument designed to measure the aspects of afterschool program quality that research suggests contribute to the 21st century skills, attitudes, and behaviors youth need to be successful…

  15. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  16. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessments tend to cover potential candidate software and subsequently their capabilities. The result of the assessments should make it possible to choose relevant software that will be used throughout the project and also in general use for mooring design of WECs. The report is part of Work Package 1 ("Task 1.2: Assessment of Available Numerical Tools for Dynamic Mooring Analysis" and "Milestone 1: Acquisition of Selected Numerical Tools") of the project and was produced by Aalborg University in cooperation with Chalmers University of Technology.

  17. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching and parameterised reward annotations. SBOAT allows the optimisation of these processes by specifying optimisation goals by means of probabilistic computation tree logic (PCTL). Optimisation is performed by means of an evolutionary algorithm where stochastic model checking, in the form of the PRISM model checker... industry and will illustrate the practical applicability of this tool by helping the company analyse and optimise selected workflows.

  18. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Current performance prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block, and analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, to use this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, implying experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
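
    For the communication part, the classic two-parameter model t(n) = L + n/B (latency plus size over bandwidth) can be fitted per "communication block" by least squares; the timings below are made up for illustration.

        import numpy as np

        sizes = np.array([1e3, 1e4, 1e5, 1e6])  # message sizes, bytes
        times = np.array([1.2e-5, 2.0e-5, 1.1e-4, 1.0e-3])  # seconds

        # Solve times ~= L + sizes * (1/B) in the least-squares sense.
        A = np.vstack([np.ones_like(sizes), sizes]).T
        (L, inv_B), *_ = np.linalg.lstsq(A, times, rcond=None)
        print(f"latency ~ {L:.2e} s, bandwidth ~ {1 / inv_B:.2e} B/s")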

  19. Program Understanding: A Reengineering Case for the Transformation Tool Contest

    Directory of Open Access Journals (Sweden)

    Tassilo Horn

    2011-11-01

    In Software Reengineering, one of the central artifacts is the source code of the legacy system in question. In fact, in most cases it is the only definitive artifact, because over time the code has diverged from the original architecture and design documents. The first task of any reengineering project is to gather an understanding of the system's architecture. Therefore, a common approach is to use parsers to translate the source code into a model conforming to the abstract syntax of the programming language the system is implemented in, which can then be subject to querying. Besides querying, transformations can be used to generate more abstract views on the system's architecture. This transformation case deals with the creation of a state machine model out of a Java syntax graph. It is derived from a task that originates from a real reengineering project.

  20. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of significant risk groups associated with logistics processes implementation; composition of integrated strategies of risk management; and composition of tools for risk analysis in logistics processes.

  1. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides the possibility to relate histopathological data with neuropsychological and clinical variables. The aid of this interactive visualization tool brings us the possibility to find unexpected conclusions beyond the insight provided by simple statistical analysis, as well as to improve neuroscientists’ productivity.

  2. Using Conversation Analysis to Improve an Augmented Communication Tool

    NARCIS (Netherlands)

    Koole, Tom; Mak, Pim

    2014-01-01

    An ALS (amyotrofic lateral sclerosis) patient could only move her eyes and interacted through a MyTobii AAC (Augmentative and Alternative Communication) tool with eye-tracking and speech synthesizing technology. Conversation analysis of 75 minutes of video-recorded talk-in-interaction with her

  3. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  4. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    Science.gov (United States)

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  5. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  6. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models which may be applied as tools of analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. The study then characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  7. Separation analysis, a tool for analyzing multigrid algorithms

    Science.gov (United States)

    Costiner, Sorin; Taasan, Shlomo

    1995-01-01

    The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier-type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.
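
    For orientation, the standard two-level error-propagation operator that such an analysis dissects is, in common notation (assumed here, not taken from the paper: S the smoother iteration matrix, P prolongation, R restriction, A and A_c the fine- and coarse-grid operators, nu_1 and nu_2 the pre- and post-smoothing counts):

        % LaTeX: standard two-grid error propagation operator
        E_{2L} = S^{\nu_2}\,\bigl(I - P\,A_c^{-1}\,R\,A\bigr)\,S^{\nu_1}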

  8. Next Generation Static Software Analysis Tools (Dagstuhl Seminar 14352)

    OpenAIRE

    Cousot, Patrick; Kroening, Daniel; Sinz, Daniel

    2014-01-01

    There has been tremendous progress in static software analysis over the last years with, for example, refined abstract interpretation methods, the advent of fast decision procedures like SAT and SMT solvers, new approaches like software (bounded) model checking or CEGAR, or new problem encodings. We are now close to integrating these techniques into every programmer's toolbox. The aim of the seminar was to bring together developers of software analysis tools and algorithms, including ...

  9. Inclusiveness program - a SWOT analysis

    Science.gov (United States)

    Dósa, M.; Szegő, K.

    2017-09-01

    The Inclusiveness Program was created with the aim to integrate currently under-represented countries into the mainstream of European planetary research. Main stages of the working plan include setting up a database containing all the research institutes and universities where astronomical or geophysical research is carried out. It is necessary to identify their problems and needs. Challenging part of the project is to find exact means that help their work in a sustainable way. Strengths, weaknesses, opportunities and threats of the program were identified based on feedback from the inclusiveness community. Our conclusions, further suggestions are presented.

  10. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT-which is accessible online and requires no purchase or download of software-is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  11. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
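
    A minimal sketch of this kind of variability analysis, with a trivial RC corner-frequency formula standing in for a real circuit simulator; the component values, tolerances, and spec limit are invented.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      R = rng.normal(1.0e3, 1.0e3 * 0.05 / 3, n)    # 1 kOhm, 5% (3-sigma)
      C = rng.normal(1.0e-7, 1.0e-7 * 0.10 / 3, n)  # 100 nF, 10% (3-sigma)

      f_c = 1.0 / (2.0 * np.pi * R * C)             # corner frequency per sample
      spec_min = 1.45e3                             # hypothetical requirement, Hz

      print(f"corner frequency {f_c.mean():.0f} +/- {f_c.std():.0f} Hz")
      print(f"fraction of units meeting spec: {(f_c > spec_min).mean():.4f}")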

  12. Dispersion analysis and linear error analysis capabilities of the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    Previous error analyses conducted by the Guidance and Dynamics Branch of NASA have used the Guidance Analysis Program (GAP) as the trajectory simulation tool. Plans are made to conduct all future error analyses using the Space Vehicle Dynamics Simulation (SVDS) program. A study was conducted to compare the inertial measurement unit (IMU) error simulations of the two programs. Results of the GAP/SVDS comparison are presented and problem areas encountered while attempting to simulate IMU errors, vehicle performance uncertainties and environmental uncertainties using SVDS are defined. An evaluation of the SVDS linear error analysis capability is also included.

  13. Multiple correspondence analysis as a tool for analysis of large ...

    African Journals Online (AJOL)

  14. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
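
    One performance bound such graph analysis yields is the critical-path length of the dataflow graph, which no processor count can beat; the sketch below computes it for a made-up task graph.

      from functools import lru_cache

      # task execution times and dataflow dependencies (illustrative)
      tasks = {"read": 2, "fft": 5, "filt": 3, "ifft": 5, "out": 1}
      deps = {"read": [], "fft": ["read"], "filt": ["fft"],
              "ifft": ["filt"], "out": ["ifft"]}

      @lru_cache(maxsize=None)
      def finish(node):
          # earliest finish = own time + latest finish among predecessors
          return tasks[node] + max((finish(p) for p in deps[node]), default=0)

      critical = max(finish(n) for n in tasks)   # 16 for this chain
      work = sum(tasks.values())
      print(f"schedule length >= {critical} regardless of processor count")
      print(f"with p processors, schedule length >= {work}/p time units")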

  15. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology that is concerned with students’ acquisition of culture encoded in symbols and the way students’ sign consciousness formed in the context of learning affects their world cognition and interpersonal communication. The article introduces a set of tools that would enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy and with the formation of younger generation’s language picture of the world.

  16. The Development of a Humanitarian Health Ethics Analysis Tool.

    Science.gov (United States)

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

    Introduction Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools increasingly have become prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  17. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
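
    A toy version of the n-factor combinatorial idea for n = 2: greedily keep only the cases needed to cover every pair of parameter values at least once. The parameters are invented, and the real tool combines this with Monte Carlo generation and model-derived tests.

      from itertools import combinations, product

      params = {                      # invented design parameters
          "mass":   [0.8, 1.0, 1.2],
          "thrust": ["low", "nominal", "high"],
          "wind":   [0, 10, 20],
      }
      names = list(params)
      uncovered = {((a, va), (b, vb))
                   for a, b in combinations(names, 2)
                   for va in params[a] for vb in params[b]}

      tests = []
      for values in product(*params.values()):
          case = dict(zip(names, values))
          pairs = {((a, case[a]), (b, case[b])) for a, b in combinations(names, 2)}
          if pairs & uncovered:       # keep the case only if it covers new pairs
              tests.append(case)
              uncovered -= pairs

      print(f"{len(tests)} of {3 ** 3} candidate cases cover every value pair")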

  18. Supporting secure programming in web applications through interactive static analysis.

    Science.gov (United States)

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2014-07-01

    Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into Integrated Development Environment (IDE) and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.

  19. Supporting secure programming in web applications through interactive static analysis

    Directory of Open Access Journals (Sweden)

    Jun Zhu

    2014-07-01

    Full Text Available Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into the Integrated Development Environment (IDE) and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.
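
    The prototype described above is a Java plug-in for Eclipse; purely to illustrate the general idea of flagging a risky construct while code is being written, here is a small Python ast sketch that warns about SQL built by string interpolation.

      import ast, textwrap

      SRC = textwrap.dedent("""
          def lookup(cur, user):
              cur.execute("SELECT * FROM users WHERE name = '%s'" % user)  # risky
              cur.execute("SELECT * FROM users WHERE name = ?", (user,))   # ok
      """)

      class ExecCheck(ast.NodeVisitor):
          def visit_Call(self, node):
              f = node.func
              if isinstance(f, ast.Attribute) and f.attr == "execute" and node.args:
                  arg = node.args[0]
                  # string built with '%' or an f-string: possible SQL injection
                  risky = (isinstance(arg, ast.BinOp) and isinstance(arg.op, ast.Mod)) \
                      or isinstance(arg, ast.JoinedStr)
                  if risky:
                      print(f"line {node.lineno}: possible SQL injection")
              self.generic_visit(node)

      ExecCheck().visit(ast.parse(SRC))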

  20. Visual DSD: a design and analysis tool for DNA strand displacement systems.

    Science.gov (United States)

    Lakin, Matthew R; Youssef, Simon; Polo, Filippo; Emmott, Stephen; Phillips, Andrew

    2011-11-15

    The Visual DSD (DNA Strand Displacement) tool allows rapid prototyping and analysis of computational devices implemented using DNA strand displacement, in a convenient web-based graphical interface. It is an implementation of the DSD programming language and compiler described by Lakin et al. (2011) with additional features such as support for polymers of unbounded length. It also supports stochastic and deterministic simulation, construction of continuous-time Markov chains and various export formats which allow models to be analysed using third-party tools.
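
    The continuous-time Markov chains mentioned can be simulated with Gillespie's algorithm; below is a minimal sketch for a single hypothetical displacement reaction Input + Gate -> Output + Waste, with invented counts and rate constant (not Visual DSD's own semantics engine).

      import random

      def gillespie(x, g, k=1e-4, t_end=500.0, seed=1):
          """Exact CTMC simulation of Input + Gate -> Output + Waste."""
          rng = random.Random(seed)
          t, output = 0.0, 0
          while x > 0 and g > 0:
              a = k * x * g                 # propensity of the only reaction
              t += rng.expovariate(a)       # exponential waiting time
              if t >= t_end:
                  break
              x, g, output = x - 1, g - 1, output + 1
          return output

      print("output strands produced:", gillespie(x=100, g=100))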

  1. An introduction to TR-X: a simplified tool for standardized analysis

    Energy Technology Data Exchange (ETDEWEB)

    Johns, Russell C [Los Alamos National Laboratory; Waters, Laurie S [Los Alamos National Laboratory; Fallgren, Andrew J [Los Alamos National Laboratory; Ghoreson, Gregory G [UNIV OF TEXAS

    2010-09-09

    TR-X is a multi-platform program that provides a graphical interface to Monte Carlo N-Particle transport (MCNP) and Monte Carlo N-Particle transport eXtended (MCNPX) codes. Included in this interface are tools to reduce the tedium of input file creation, provide standardization of model creation and analysis, and expedite the execution of the created models. TR-X provides tools to make the rapid testing of multiple permutations of these models easier, while also building in standardization that allows multiple solutions to be compared.

  2. Static Analysis of Mobile Programs

    Science.gov (United States)

    2017-02-01

    ... multiple layers of object-oriented abstractions. Our hypothesis has been that static analysis techniques have reached the point that sound, precise and ... important features of STAMP are: • An interface for reading and interpreting DEX bytecode, allowing analysis of libraries in compiled form, both statically ... describe a new bottom-up, subset-based, and context-sensitive pointer analysis for Java. The main novelty of our technique is the constraint-based handling ...

  3. Professional development programs in health promotion: tools and processes to favor new practices.

    Science.gov (United States)

    Torres, Sara; Richard, Lucie; Guichard, Anne; Chiocchio, François; Litvak, Eric; Beaudet, Nicole

    2017-06-01

    Developing innovative interventions that are in sync with a health promotion paradigm often represents a challenge for professionals working in local public health organizations. Thus, it is critical to have both professional development programs that favor new practices and tools to examine these practices. In this case study, we analyze the health promotion approach used in a pilot intervention addressing children's vulnerability that was developed and carried out by participants enrolled in a public health professional development program. More specifically, we use a modified version of Guichard and Ridde's (Une grille d'analyse des actions pour lutter contre les inégalités sociales de santé. In Potvin, L., Moquet, M.-J. and Jones, C. M. (eds), Réduire les Inégalités Sociales en Santé. INPES, Saint-Denis Cedex, pp. 297-312, 2010) analytical grid to assess deductively the program participants' use of health promotion practices in the analysis and planning, implementation, evaluation, sustainability and empowerment phases of the pilot intervention. We also seek evidence of practices involving (empowerment, participation, equity, holism, an ecological approach, intersectorality and sustainability) in the intervention. The results are mixed: our findings reveal evidence of the application of several dimensions of health promotion (equity, holism, an ecological approach, intersectorality and sustainability), but also a lack of integration of two key dimensions; that is, empowerment and participation, during various phases of the pilot intervention. These results show that the professional development program is associated with the adoption of a pilot intervention integrating multiple but not all dimensions of health promotion. We make recommendations to facilitate a more complete integration. This research also shows that the Guichard and Ridde grid proves to be a thorough instrument to document the practices of participants.

  4. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT by presenting the test results.

  5. Evaluating Effectiveness of Pair Programming as a Teaching Tool in Programming Courses

    Science.gov (United States)

    Faja, Silvana

    2014-01-01

    This study investigates the effectiveness of pair programming on student learning and satisfaction in introductory programming courses. Pair programming, used in the industry as a practice of an agile development method, can be adopted in classroom settings to encourage peer learning, increase students' social skills, and enhance student…

  6. MethSurv: a web tool to perform multivariable survival analysis using DNA methylation data.

    Science.gov (United States)

    Modhukur, Vijayachitra; Iljasenko, Tatjana; Metsalu, Tauno; Lokk, Kaie; Laisk-Podar, Triin; Vilo, Jaak

    2017-12-21

    To develop a web tool for survival analysis based on CpG methylation patterns. We utilized methylome data from 'The Cancer Genome Atlas' and used the Cox proportional-hazards model to develop an interactive web interface for survival analysis. MethSurv enables survival analysis for a CpG located in or around the proximity of a query gene. For further mining, cluster analysis for a query gene to associate methylation patterns with clinical characteristics and browsing of top biomarkers for each cancer type are provided. MethSurv includes 7358 methylomes from 25 different human cancers. The MethSurv tool is a valuable platform for the researchers without programming skills to perform the initial assessment of methylation-based cancer biomarkers.
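
    MethSurv itself is a web tool, but the underlying model is a Cox proportional-hazards fit; a sketch using the community lifelines package on an invented data frame (all column names and values are placeholders):

      import pandas as pd
      from lifelines import CoxPHFitter

      df = pd.DataFrame({
          "time":  [5, 12, 30, 45, 60, 72, 80, 95],           # follow-up, months
          "event": [1, 1, 0, 1, 0, 1, 0, 0],                  # 1 = death observed
          "beta":  [0.9, 0.3, 0.8, 0.7, 0.2, 0.2, 0.6, 0.1],  # CpG methylation level
          "age":   [61, 70, 55, 66, 48, 72, 50, 59],
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time", event_col="event")
      print(cph.summary[["coef", "exp(coef)", "p"]])   # hazard ratio per covariate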

  7. [Comparative analysis of early diagnostic tools for breast cancer].

    Science.gov (United States)

    Shen, Song-jie; Sun, Qiang; Xu, Ya-li; Zhou, Yi-dong; Guan, Jing-hong; Mao, Feng; Lin, Yan; Wang, Xue-jing; Han, Shao-mei

    2012-11-01

    Mammography is the principal imaging modality used for early diagnosis of breast cancer in Western countries. It has not been well-established whether this Western diagnostic modality is adoptable for Chinese women. The aim of this study was to evaluate the respective accuracy of the common diagnostic tools for breast cancer including history-taking, physical examination, ultrasound and mammography. Clinical presentation and investigations for consecutive patients undergoing history-taking, physical examination, ultrasound, mammography and pathological assessment at Peking Union Medical College Hospital were prospectively recorded between April 2010 and September 2011. Breast cancer high-risk factors acquired by history-taking were input into the risk assessment model established previously by Eleventh Five Year Key Programs for Science and Technology Development of China (Grant No. 2006BAI02A09) and classified into low-, medium-, high- and extremely high-risk groups. The low- and medium-risk groups were defined as test negative, while the high- and extremely high-risk groups were defined as test positive. Each mammogram and ultrasound was reported prospectively using a five-point reporting scale of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS). Clinical data were compared with pathological findings. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and accuracy of respective diagnostic methods were calculated and compared. The patients were divided into two groups, above and below 50 years of age for subgroup analysis. A total of 1468 patients (1475 breast lesions) constituted the study population. The median age was 44 (range 13 - 92) years. Five hundred and fifty-one patients were diagnosed as breast cancer. The median age at diagnosis was 51 years and breast cancer peaked in the age group of 40 - 60 years. The sensitivity of risk assessment model, physical examination
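
    For reference, the accuracy measures compared in the study follow directly from a 2x2 table of test result versus pathology; the counts below are placeholders, not the study's data.

      tp, fp, fn, tn = 420, 90, 131, 834    # placeholder 2x2 counts

      sensitivity = tp / (tp + fn)          # P(test positive | disease)
      specificity = tn / (tn + fp)          # P(test negative | no disease)
      ppv = tp / (tp + fp)                  # positive predictive value
      npv = tn / (tn + fn)                  # negative predictive value
      accuracy = (tp + tn) / (tp + fp + fn + tn)

      for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                          ("PPV", ppv), ("NPV", npv), ("accuracy", accuracy)]:
          print(f"{name}: {value:.3f}")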

  8. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
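
    The resonance condition described can be put in back-of-envelope form with the Strouhal relation f = St*U/D; the numbers below are illustrative and are not outputs of the NASA tool.

      St = 0.2          # Strouhal number, typical for a circular cylinder
      D = 5.0           # vehicle diameter, m
      f_mode = 0.9      # lightly damped structural mode frequency, Hz

      U_lockin = f_mode * D / St    # wind speed where shedding matches the mode
      print(f"critical ground wind speed ~ {U_lockin:.1f} m/s")

      U = 15.0                      # shedding frequency at a given wind speed
      print(f"shedding frequency at {U:.0f} m/s: {St * U / D:.2f} Hz")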

  9. Control system design and analysis using the INteractive Controls Analysis (INCA) program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user friendly efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools imbedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight proven example, how the package can perform complex design analyses with relative ease.

  10. Programming effort analysis of the ELLPACK language

    Science.gov (United States)

    Rice, J. R.

    1978-01-01

    ELLPACK is a problem statement language and system for elliptic partial differential equations which is implemented by a FORTRAN preprocessor. ELLPACK's principal purpose is as a tool for the performance evaluation of software. However, it is used here as an example with which to study the programming effort required for problem solving. It is obvious that problem statement languages can reduce programming effort tremendously; the goal is to quantify this somewhat. This is done by analyzing the lengths and effort (as measured by Halstead's software science technique) of various approaches to solving these problems.
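
    Halstead's software-science effort measure used in the study is simple to state: given counts of distinct and total operators and operands (invented here), volume, difficulty, and effort follow directly.

      import math

      n1, n2 = 15, 22        # distinct operators, distinct operands
      N1, N2 = 120, 95       # total operator and operand occurrences

      vocabulary = n1 + n2
      length = N1 + N2
      volume = length * math.log2(vocabulary)   # V = N * log2(eta)
      difficulty = (n1 / 2) * (N2 / n2)         # D = (eta1/2) * (N2/eta2)
      effort = difficulty * volume              # E = D * V

      print(f"volume {volume:.0f}, difficulty {difficulty:.1f}, effort {effort:.0f}")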

  11. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  12. AMIDE: a free software tool for multimodality medical image analysis.

    Science.gov (United States)

    Loening, Andreas Markus; Gambhir, Sanjiv Sam

    2003-07-01

    Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  13. SIMAT: GC-SIM-MS data analysis tool.

    Science.gov (United States)

    Ranjbar, Mohammad R Nezami; Di Poto, Cristina; Wang, Yue; Ressom, Habtom W

    2015-08-19

    Gas chromatography coupled with mass spectrometry (GC-MS) is one of the technologies widely used for qualitative and quantitative analysis of small molecules. In particular, GC coupled to single quadrupole MS can be utilized for targeted analysis by selected ion monitoring (SIM). However, to our knowledge, there are no software tools specifically designed for analysis of GC-SIM-MS data. In this paper, we introduce a new R/Bioconductor package called SIMAT for quantitative analysis of the levels of targeted analytes. SIMAT provides guidance in choosing fragments for a list of targets. This is accomplished through an optimization algorithm that has the capability to select the most appropriate fragments from overlapping chromatographic peaks based on a pre-specified library of background analytes. The tool also allows visualization of the total ion chromatograms (TIC) of runs and extracted ion chromatograms (EIC) of analytes of interest. Moreover, retention index (RI) calibration can be performed and raw GC-SIM-MS data can be imported in netCDF or NIST mass spectral library (MSL) formats. We evaluated the performance of SIMAT using two GC-SIM-MS datasets obtained by targeted analysis of: (1) plasma samples from 86 patients in a targeted metabolomic experiment; and (2) mixtures of internal standards spiked in plasma samples at varying concentrations in a method development study. Our results demonstrate that SIMAT offers alternative solutions to AMDIS and MetaboliteDetector to achieve accurate detection of targets and estimation of their relative intensities by analysis of GC-SIM-MS data. We introduce a new R package called SIMAT that allows the selection of the optimal set of fragments and retention time windows for target analytes in GC-SIM-MS based analysis. Also, various functions and algorithms are implemented in the tool to: (1) read and import raw data and spectral libraries; (2) perform GC-SIM-MS data preprocessing; and (3) plot and visualize EICs and TICs.
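
    To make the extracted ion chromatogram (EIC) concept concrete, the sketch below sums intensities near a target fragment m/z at each retention time; synthetic arrays stand in for parsed netCDF data, and SIMAT itself is an R/Bioconductor package.

      import numpy as np

      rt = np.linspace(300.0, 320.0, 200)            # retention time, s
      mz = np.array([73.0, 147.0, 204.1])            # monitored fragments
      intensity = np.ones((rt.size, mz.size))        # toy baseline signal
      intensity[:, 0] += 500 * np.exp(-((rt - 310.0) / 1.5) ** 2)  # peak on 73

      def eic(target, tol=0.5):
          cols = np.abs(mz - target) < tol           # fragments within tolerance
          return intensity[:, cols].sum(axis=1)

      trace = eic(73.0)
      print(f"EIC(73.0) apex at rt = {rt[trace.argmax()]:.1f} s")
      print(f"TIC maximum: {intensity.sum(axis=1).max():.0f}")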

  14. Operatory-logistic analysis method of computational tools

    OpenAIRE

    Alejandra Behar, Patricia; Bortolozo Pivoto, Deise; Santos da Silveira, Fabiana

    2012-01-01

    The present contribution reports results obtained by the group that investigates the Operative Analysis of Computational Environments. This group belongs to the Nucleus of Digital Technology Applied in Education (NUTED) of the Education School of the Federal University of Rio Grande do Sul. We use Piagetian theory, in particular the logical-operatory model, to construct a methodology for analysing computational tools. Therefore, we have to define the basic concepts in a fr...

  15. Keel A Data Mining Tool: Analysis With Genetic

    OpenAIRE

    Ms. Pooja Mittal; Manju Narwal

    2012-01-01

    This work is related to the KEEL (Knowledge Extraction based on Evolutionary Learning) tool, an open-source software that supports data management and provides a platform for the analysis of evolutionary learning for Data Mining problems of different kinds, including regression, classification and unsupervised learning. It includes a big collection of evolutionary learning algorithms based on different approaches: Pittsburgh, Michigan. It empowers the user to perform complete analysis of any genetic f...

  16. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3 sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) Hand calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data. (B) Comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
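
    The RSS combination and covariance construction can be sketched in a few lines; each row of D below is a hypothetical single-error-source deviation vector, not actual SVDS output.

      import numpy as np

      # rows: 3-sigma single-error-source runs; columns: state deviations at MECO
      D = np.array([[ 1.2, -0.4,  0.3],
                    [ 0.5,  0.9, -0.2],
                    [-0.7,  0.1,  0.6],
                    [ 0.2, -0.3,  0.8]])

      rss = np.sqrt((D ** 2).sum(axis=0))   # combined 3-sigma deviation per state
      cov = D.T @ D                         # covariance of the combined deviations
      print("RSS deviations:", rss)
      print("covariance matrix:\n", cov)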

  17. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit drives the most frequently used and important parts, such as the automatic tool changer. The vibration detection system includes the development of hardware and software, such as a vibration meter, signal acquisition card, data processing platform, and machine control program. Meanwhile, because of differences between the mechanical configuration and the desired characteristics, it is difficult for a vibration detection system to be built directly from commercially available kits. For this reason, it was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine characteristics and states. The development of the functional parts of the system was launched simultaneously. Finally, we entered the conditions and the parameters generated from both the states and the characteristics into the developed system to verify its feasibility.

  18. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
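
    The final estimation step can be sketched as a bootstrapped line fit: regress the per-specimen ratios Q on the lab fields H, report the field where the line crosses zero, and check the y intercept against the prescribed range. The data are invented and the ratio definitions are simplified.

      import numpy as np

      rng = np.random.default_rng(7)
      H = np.array([10., 20., 30., 40., 50., 60.])           # lab fields, uT
      Q = np.array([-0.72, -0.45, -0.18, 0.09, 0.35, 0.61])  # per-specimen ratios

      def fit(h, q):
          slope, intercept = np.polyfit(h, q, 1)
          return -intercept / slope, intercept    # field at Q = 0, y intercept

      estimate, intercept = fit(H, Q)
      boot = []
      for _ in range(2000):                       # resample specimens
          idx = rng.integers(0, H.size, H.size)
          if np.ptp(H[idx]) == 0:                 # skip degenerate resamples
              continue
          boot.append(fit(H[idx], Q[idx])[0])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"paleointensity ~ {estimate:.1f} uT (95% CI {lo:.1f}-{hi:.1f})")
      print(f"y intercept {intercept:.2f}, to be checked against the prescribed range")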

  19. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed
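
    The arithmetic at the heart of the photometry exercises is the instrumental magnitude m = -2.5*log10(counts), differenced between a variable star and a comparison star; the aperture sums below are made up.

      import math

      def instrumental_mag(counts):
          return -2.5 * math.log10(counts)

      variable =   [48210, 45180, 39950, 44870]   # aperture sums, four images
      comparison = [30500, 30440, 30610, 30520]   # constant comparison star

      for v, c in zip(variable, comparison):
          dm = instrumental_mag(v) - instrumental_mag(c)
          print(f"differential magnitude: {dm:+.3f}")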

  20. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  1. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    Science.gov (United States)

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them are specifically designed for genetic association studies and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analysis. The application is freely available at http://bioinfo.genyo.es/metagenyo/ .
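
    The pooling and heterogeneity steps such a tool automates reduce to inverse-variance weighting plus Cochran's Q and I2; a sketch with invented per-study log odds ratios:

      import numpy as np

      log_or = np.array([0.42, 0.18, 0.55, 0.30, -0.05])   # per-study log OR
      se = np.array([0.20, 0.15, 0.25, 0.18, 0.22])        # standard errors

      w = 1.0 / se**2                          # inverse-variance weights
      pooled = (w * log_or).sum() / w.sum()    # fixed-effect pooled log OR
      pooled_se = np.sqrt(1.0 / w.sum())

      Q = (w * (log_or - pooled) ** 2).sum()   # Cochran's Q
      I2 = max(0.0, (Q - (log_or.size - 1)) / Q) * 100.0   # heterogeneity, %

      ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
      print(f"pooled OR {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
      print(f"Q = {Q:.2f}, I^2 = {I2:.0f}%")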

  2. An Object Oriented Programming Tool for Optimal Management of Water Systems under Uncertainty by use of Stochastic Dual Dynamic Programming

    Science.gov (United States)

    Raso, Luciano; Dorchies, David; Malaterre, Pierre-Olivier

    2015-04-01

    We developed an Object-Oriented Programming (OOP) tool for optimal management of complex water systems by use of Stochastic Dual Dynamic Programming (SDDP). OOP is a powerful programming paradigm. OOP minimizes code redundancies, making code modification and maintenance very effective. This is especially welcome in research, in which, often, code must be modified to meet new requirements that were not initially considered. SDDP is an advanced method for optimal operation of complex dynamic systems under uncertainty. SDDP can deal with large and complex systems, such as a multi-reservoir system. The objective of this tool is to make SDDP usable for Water Management Analysts. Thanks to this tool, the Analyst can bypass the SDDP programming complexity, and his/her task is simplified to the definition of system elements, topology and objectives, and experiment characteristics. In this tool, the main classes are: Experiment, System, Element, and Objective. Experiments are run on a system. A system is made of many elements interconnected among them. Class Element is made of the following sub-classes: (stochastic) hydrological scenario, (deterministic) water demand scenario, reservoir, river reach, off-take, and irrigation basin. Objectives are used in the optimization procedure to find the optimal operational rules, for a given system and experiment. OOP flexibility allows the Water Management Analyst to extend easily existing classes in order to answer his/her specific research questions. The tool is implemented in Python, and will be initially tested on two applications: the Senegal River water system, in West Africa, and the Seine River, in France.
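
    A skeleton mirroring the class structure the abstract names (Experiment, System, Element, Objective); the method names and the Reservoir attributes are our own guesses for illustration, not the authors' actual API.

      from dataclasses import dataclass, field

      class Element:
          """A node of the water system: reach, off-take, reservoir, ..."""

      @dataclass
      class Reservoir(Element):
          capacity: float          # illustrative attribute, e.g. hm^3
          storage: float = 0.0

      @dataclass
      class Objective:
          name: str

          def cost(self, state) -> float:   # e.g. flood damage, unmet demand
              raise NotImplementedError

      @dataclass
      class System:
          elements: list = field(default_factory=list)
          objectives: list = field(default_factory=list)

      @dataclass
      class Experiment:
          system: System
          horizon: int             # number of decision stages

          def run_sddp(self):
              """Placeholder for the SDDP backward/forward passes."""

      exp = Experiment(System([Reservoir(capacity=250.0)], [Objective("demand")]),
                       horizon=52)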

  3. TANGO: a generic tool for high-throughput 3D image analysis for studying nuclear organization.

    Science.gov (United States)

    Ollion, Jean; Cochennec, Julien; Loll, François; Escudé, Christophe; Boudier, Thomas

    2013-07-15

    The cell nucleus is a highly organized cellular organelle that contains the genetic material. The study of nuclear architecture has become an important field of cellular biology. Extracting quantitative data from 3D fluorescence imaging helps understand the functions of different nuclear compartments. However, such approaches are limited by the requirement for processing and analyzing large sets of images. Here, we describe Tools for Analysis of Nuclear Genome Organization (TANGO), an image analysis tool dedicated to the study of nuclear architecture. TANGO is a coherent framework allowing biologists to perform the complete analysis process of 3D fluorescence images by combining two environments: ImageJ (http://imagej.nih.gov/ij/) for image processing and quantitative analysis and R (http://cran.r-project.org) for statistical processing of measurement results. It includes an intuitive user interface providing the means to precisely build a segmentation procedure and set-up analyses, without possessing programming skills. TANGO is a versatile tool able to process large sets of images, allowing quantitative study of nuclear organization. TANGO is composed of two programs: (i) an ImageJ plug-in and (ii) a package (rtango) for R. They are both free and open source, available (http://biophysique.mnhn.fr/tango) for Linux, Microsoft Windows and Macintosh OSX. Distribution is under the GPL v.2 licence. thomas.boudier@snv.jussieu.fr Supplementary data are available at Bioinformatics online.

  4. A systematic writing program as a tool in the grief process: part 1

    OpenAIRE

    Furnes, Bodil; Dysvik, Elin

    2010-01-01

    Bodil Furnes, Elin Dysvik. University of Stavanger, Faculty of Social Sciences, Department of Health Studies, Stavanger, Norway. Objective: The basic aim of this paper is to suggest a flexible and individualized writing program as a tool for use during the grief process of bereaved adults. Methods: An open, qualitative approach following distinct steps was taken to gain a broad perspective on the grief and writing processes, as a platform for the writing program. Results: Following several systemat...

  5. Judo match analysis,a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

    In this second paper on match analysis, we analyze in depth the competition steps, showing the evolution of this tool at the National Federation level, on the basis of our first classification. Furthermore, it is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, helping in a useful way the difficult task of the coach, especially for national or Olympic coaching teams. In this paper a deeper study of judo competitions at high level is presented, from both the male and female points of view, explaining in the light of biomechanics not only the evolution of throws in time, with the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  6. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  7. Design and Application of the Exploration Maintainability Analysis Tool

    Science.gov (United States)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew

  8. Diffusion of Latent Semantic Analysis as a Research Tool: A Social Network Analysis Approach

    OpenAIRE

    Tonta, Yaşar; DARVISH, HAMID

    2010-01-01

    Latent semantic analysis (LSA) is a relatively new research tool with a wide range of applications in different fields ranging from discourse analysis to cognitive science, from information retrieval to machine learning and so on. In this paper, we chart the development and diffusion of LSA as a research tool using a social network analysis (SNA) approach that reveals the social structure of a discipline in terms of collaboration among scientists. Using Thomson Reuters’ Web of Science (WoS), ...

  9. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    published since the seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological...... inconsistencies, variations in reporting of results as well as types of measures used. Thus a consensus for future studies is called for to answer the question: Which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry......, as creativity training is a tool that can contribute to enhancement of organizational creativity and subsequently innovation. However, to answer the question, future studies of creativity training programs need to be carefully designed to contribute to a more transparent landscape. Thus this paper proposes...

  10. Constructing functional programs for grammar analysis problems

    NARCIS (Netherlands)

    Jeuring, J.T.; Swierstra, S.D.

    1995-01-01

    This paper discusses the derivation of functional programs for grammar analysis problems, such as the Empty problem and the Reachable problem. Grammar analysis problems can be divided into two classes: top-down problems such as Follow and Reachable, which are described in terms of the contexts of
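
    The Empty problem mentioned is a least-fixed-point computation: a nonterminal derives the empty string iff some production for it consists only of symbols that do. A sketch for a toy grammar (the paper derives functional programs; this illustration is imperative):

      grammar = {                      # lowercase strings are terminals
          "S": [["A", "B"], ["a"]],
          "A": [["a"], []],            # A -> a | epsilon
          "B": [["A"], ["b"]],
      }

      nullable = set()
      changed = True
      while changed:                   # iterate to the least fixed point
          changed = False
          for nt, productions in grammar.items():
              if nt not in nullable and any(
                      all(s in nullable for s in rhs) for rhs in productions):
                  nullable.add(nt)
                  changed = True

      print(sorted(nullable))          # ['A', 'B', 'S']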

  11. A survey of tools for variant analysis of next-generation genome sequencing data

    Science.gov (United States)

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  12. Energy analysis program, FY 1979

    Science.gov (United States)

    1980-04-01

    Energy analysis attempts to understand the volitional choices of energy use and supply available to human society, and the multi-faceted consequences of choosing any one of them. Topics deal with economic impacts; assessments of regional issues and impacts; air quality evaluation; institutional and political issues in California power plant siting; assessment of environmental standards; water issues; characterization of aquatic systems dissolved oxygen profiles; modeling; computer-generated interactive graphics; energy assessment in Hawaii; solar energy in communities; utilities solar financial data; population impacts of geothermal development; energy conservation in colleges and residential sectors; energy policy; decision making; building energy performance standards; standards for residential appliances; and impact of energy performance standards on demand for peak electrical energy.

  13. Algorithms and programming tools for image processing on the MPP, introduction. Thesis

    Science.gov (United States)

    1985-01-01

    The programming tools and parallel algorithms created for the Massively Parallel Processor (MPP) located at the NASA Goddard Space Center are discussed. A user-friendly environment for high level language parallel algorithm development was developed. The issues involved in implementing certain algorithms on the MPP were researched. The expected results were compared with the actual results.

  14. A New Internet Tool for Automatic Evaluation in Control Systems and Programming

    Science.gov (United States)

    Munoz de la Pena, D.; Gomez-Estern, F.; Dormido, S.

    2012-01-01

    In this paper we present a web-based innovative education tool designed for automating the collection, evaluation and error detection in practical exercises assigned to computer programming and control engineering students. By using a student/instructor code-fusion architecture, the conceptual limits of multiple-choice tests are overcome by far.…

  15. Basic Botany On-Line: A Training Tool for the Master Gardener Program.

    Science.gov (United States)

    VanDerZanden, Ann Marie; Rost, Bob; Eckel, Rick

    2002-01-01

    A noncredit, online training module on botany was offered to participants in the Oregon Master Gardener program. The 48 participants felt the module was a useful training tool. They also noted the convenience of completing the material at their own pace and at times that fit into their schedules. (SK)

  16. Methodologies and Tools for Tuning Parallel Programs: 80% Art, 20% Science, and 10% Luck

    Science.gov (United States)

    Yan, Jerry C.; Bailey, David (Technical Monitor)

    1996-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessors. However, without effective means to monitor (and analyze) program execution, tuning the performance of parallel programs becomes exponentially difficult as program complexity and machine size increase. In the past few years, the ubiquitous introduction of performance tuning tools from various supercomputer vendors (Intel's ParAide, TMC's PRISM, CRI's Apprentice, and Convex's CXtrace) seems to indicate the maturity of performance instrumentation/monitor/tuning technologies and vendors'/customers' recognition of their importance. However, a few important questions remain: What kind of performance bottlenecks can these tools detect (or correct)? How time consuming is the performance tuning process? What are some important technical issues that remain to be tackled in this area? This workshop reviews the fundamental concepts involved in analyzing and improving the performance of parallel and heterogeneous message-passing programs. Several alternative strategies will be contrasted, and for each we will describe how currently available tuning tools (e.g. AIMS, ParAide, PRISM, Apprentice, CXtrace, ATExpert, Pablo, IPS-2) can be used to facilitate the process. We will characterize the effectiveness of the tools and methodologies based on actual user experiences at NASA Ames Research Center. Finally, we will discuss their limitations and outline recent approaches taken by vendors and the research community to address them.

  17. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  18. TIME ANALYSIS ACCORDING TO PART PROGRAMS ON A CNC VERTICAL MACHINING CENTER

    Directory of Open Access Journals (Sweden)

    Ahmet Murat PİNAR

    2002-01-01

    Full Text Available In this study, a program has been developed that examines the CNC part programs in the control unit of the Dyna Myte 2900 Vertical Machining Center and calculates the machining time and rapid movement time of the cutting tools. The workpiece program to be examined is transferred to the CNC code editor by the user manually, from a file on diskette or hard disk, or through the machine tool. By examining all the movements of the cutting tools, detailed machining time, rapid movement time, or total time is reported to the user, providing an important part of the information needed for workpiece cost analysis.
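
    A minimal sketch of the calculation the study describes, assuming a drastically simplified G-code-like input in which G0 marks rapid moves (at an assumed fixed rapid rate) and G1 marks cutting moves at an F feed rate; the syntax and numbers are illustrative, not the Dyna Myte 2900 dialect:

      # Toy machining-time estimator over a simplified part program.
      # Rapid rate, default feed, and the two-axis G-code subset are
      # assumptions for illustration only.
      import math

      RAPID_RATE = 5000.0  # mm/min, hypothetical machine constant

      def machining_times(program_lines):
          x = y = 0.0
          cut_min = rapid_min = 0.0
          feed = 100.0  # mm/min, assumed default feed
          for line in program_lines:
              words = {w[0]: float(w[1:]) for w in line.split()[1:]}
              nx, ny = words.get("X", x), words.get("Y", y)
              dist = math.hypot(nx - x, ny - y)
              if line.startswith("G0"):
                  rapid_min += dist / RAPID_RATE
              elif line.startswith("G1"):
                  feed = words.get("F", feed)
                  cut_min += dist / feed
              x, y = nx, ny
          return cut_min, rapid_min

      cut, rapid = machining_times(["G0 X50 Y0", "G1 X50 Y40 F200", "G0 X0 Y0"])
      print(f"cutting: {cut:.3f} min, rapid: {rapid:.3f} min, total: {cut + rapid:.3f} min")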

  19. ISAC - A tool for aeroservoelastic modeling and analysis. [Interaction of Structures, Aerodynamics, and Control

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood T.

    1993-01-01

    This paper discusses the capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The resulting linear time-invariant state-space equations of motion are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle, illustrating some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.
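
    The rational-function options mentioned above can be illustrated with a small least-squares fit: the sketch below fits one tabulated aerodynamic term Q(ik) with a Roger-style form A0 + A1*(ik) + A2*(ik)^2 + A3*(ik)/(ik + b), where the fixed lag root b and the synthetic data are invented for illustration (ISAC's actual approximation options are richer):

      # Least-squares fit of a Roger-style rational function to one
      # tabulated unsteady aerodynamic term. Lag root and sample data
      # are assumptions; this is not ISAC's implementation.
      import numpy as np

      k = np.linspace(0.05, 1.0, 20)    # reduced frequencies
      s = 1j * k
      b = 0.3                           # assumed aerodynamic lag root
      q_tab = 1.0 + 0.5 * s + 0.1 * s**2 + 0.8 * s / (s + b)  # stand-in data

      # Fitting basis columns: [1, s, s^2, s/(s+b)]
      basis = np.column_stack([np.ones_like(s), s, s**2, s / (s + b)])

      # Solve the complex least-squares problem with real/imag parts stacked.
      A = np.vstack([basis.real, basis.imag])
      rhs = np.concatenate([q_tab.real, q_tab.imag])
      coeffs, *_ = np.linalg.lstsq(A, rhs, rcond=None)
      print("fitted A0..A3:", np.round(coeffs, 4))  # recovers [1.0, 0.5, 0.1, 0.8]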

  20. Security Transition Program Office (STPO), technology transfer of the STPO process, tools, and techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, J.T.; Forslund, C.R.J.; Underwood, J.A.

    1994-09-01

    In 1990, with the transition from a defense mission to environmental restoration, the U.S. Department of Energy's (DOE's) Hanford Site began a significant effort to diagnose, redesign, and implement new safeguards and security (SAS) processes. In 1992 the Security Transition Program Office (STPO) was formed to address the sweeping changes that were being identified. Comprised of SAS and other contractor staff with extensive experience and supported by staff experienced in organizational analysis and work process redesign, STPO undertook a series of tasks designed to make fundamental changes to SAS processes throughout the Hanford Site. The goal of STPO is to align the SAS work and organization with the new Site mission. This report describes the key strategy, tools, methods, and techniques used by STPO to change SAS processes at Hanford. A particular focus of this review is transferring STPO's experience to other DOE sites and federal agency efforts: that is, to extract, analyze, and provide a critical review of the approach, tools, and techniques used by STPO that will be useful to other DOE sites and national laboratories in transitioning from a defense production mode to environmental restoration and other missions. In particular, what lessons does STPO provide as a pilot study or model for implementing change in other transition activities throughout the DOE complex? More broadly, what theoretical and practical contributions do DOE transition efforts, such as STPO, provide to federal agency streamlining efforts and attempts to "reinvent" government enterprises in the public sector? The approach used by STPO should provide valuable information to those examining their own processes in light of new mission requirements.

  1. A systematic writing program as a tool in the grief process: part 1.

    Science.gov (United States)

    Furnes, Bodil; Dysvik, Elin

    2010-12-06

    The basic aim of this paper is to suggest a flexible and individualized writing program as a tool for use during the grief process of bereaved adults. An open, qualitative approach following distinct steps was taken to gain a broad perspective on the grief and writing processes, as a platform for the writing program. Following several systematic methodological steps, we arrived at suggestions for the initiation of a writing program and its structure and substance, with appropriate guidelines. We believe that open and expressive writing, including free writing and focused writing, may have beneficial effects on a person experiencing grief. These writing forms may be undertaken and systematized through a writing program, with participation in a grief writing group and with diary writing, to achieve optimal results. A structured writing program might be helpful in promoting thought activities and as a tool to increase the coherence and understanding of individuals in the grief process. Our suggested program may also be a valuable guide to future program development and research.

  2. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic statistical tools utilised in conducting various studies. It covers a brief outline of variables, an understanding of quantitative and qualitative variables, and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of the parametric and non-parametric tests used for data analysis.
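
    For concreteness, a small sketch of a few of the tools the article surveys, run on invented samples: measures of central tendency, then a parametric test (t-test) alongside its non-parametric counterpart (Mann-Whitney U):

      # Central tendency plus one parametric and one non-parametric test.
      # The two samples are made up purely for illustration.
      import statistics
      from scipy import stats

      a = [5.1, 4.8, 5.6, 5.0, 5.3, 4.9]
      b = [5.9, 6.1, 5.7, 6.4, 5.8, 6.0]

      print("mean:", statistics.mean(a), "median:", statistics.median(a))

      t, p_t = stats.ttest_ind(a, b)     # parametric: assumes normality
      u, p_u = stats.mannwhitneyu(a, b)  # distribution-free alternative
      print(f"t-test p={p_t:.4f}, Mann-Whitney p={p_u:.4f}")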

  3. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but they all boil down to the same idea: a process that extracts new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only answers the question of what is happening but at the same time shows why it is happening. In this paper we present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.
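
    A minimal sketch of the multidimensional analysis the paper discusses: a pandas pivot table standing in for an OLAP cube slice over a toy ledger, with all column names and figures invented:

      # Toy OLAP-style slice of accounting data: revenue by region x quarter.
      # Dimensions and values are assumptions for illustration.
      import pandas as pd

      ledger = pd.DataFrame({
          "region":  ["North", "North", "South", "South", "North", "South"],
          "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
          "revenue": [120.0, 135.0, 90.0, 110.0, 30.0, 25.0],
      })

      # Aggregate along two dimensions at once, as an OLAP cube would.
      cube = ledger.pivot_table(values="revenue", index="region",
                                columns="quarter", aggfunc="sum")
      print(cube)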

  4. HIDRA-MAT: A Material Analysis Tool for Fusion Devices

    Science.gov (United States)

    Andruczyk, Daniel; Rizkallah, Rabel; Bedoya, Felipe; Kapat, Aveek; Schamis, Hanna; Allain, Jean Paul

    2017-10-01

    The former WEGA stellarator, which now operates as HIDRA at the University of Illinois, will be used almost exclusively to study the intimate relationship between plasmas and the material surfaces they interact with. A Material Analysis Tool (HIDRA-MAT) is being designed and will be built based on the successful Material Analysis and Particle Probe (MAPP) currently used on NSTX-U at PPPL. This will be an in-situ material diagnostic probe, meaning that all analysis can be done without breaking vacuum, which allows surface changes to be studied in real time. HIDRA-MAT will consist of several in-situ diagnostics including Langmuir probes (LP), Thermal Desorption Spectroscopy (TDS), X-ray Photoelectron Spectroscopy (XPS) and Ion Scattering Spectroscopy (ISS). This presentation will outline the HIDRA-MAT diagnostic and initial design, as well as its integration into the HIDRA system.

  5. Semi-automatic tool to ease the creation and optimization of GPU programs

    DEFF Research Database (Denmark)

    Jepsen, Jacob

    2014-01-01

    We present a tool that reduces the development time of GPU-executable code. We implement a catalogue of common optimizations specific to the GPU architecture. Through the tool, the programmer can semi-automatically transform a computationally-intensive code section into GPU-executable form...... and apply optimizations thereto. Based on experiments, the code generated by the tool can be 3-256X faster than code generated by an OpenACC compiler, 4-37X faster than optimized CPU code, and attain up to 25% of peak performance of the GPU. We found that by using pattern-matching rules, many...... of the transformations can be performed automatically, which makes the tool usable for both novices and experts in GPU programming....
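
    The role of pattern-matching rules mentioned above can be illustrated with a toy source-to-source rewrite; the sketch below (not the tool's actual rule language or output form) recognizes one fixed C loop shape in text and prefixes it with an OpenACC directive:

      # Toy pattern-matching transformation: annotate simple element-wise
      # C loops for GPU offload. The regex, the recognized pattern, and
      # the choice of directive are illustrative assumptions.
      import re

      LOOP = re.compile(
          r"for\s*\(\s*int\s+(\w+)\s*=\s*0\s*;\s*\1\s*<\s*(\w+)\s*;\s*\1\+\+\s*\)")

      def annotate_parallel_loops(source):
          """Insert '#pragma acc parallel loop' before each matching loop."""
          return LOOP.sub(lambda m: "#pragma acc parallel loop\n" + m.group(0),
                          source)

      code = "for (int i = 0; i < n; i++) y[i] = a * x[i] + y[i];"
      print(annotate_parallel_loops(code))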

  6. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: A parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR) and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  7. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.
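
    At its core, the notification feature described above reduces to intersecting the entity sets in different analysts' notebooks; a minimal sketch with hypothetical analysts and entities:

      # Toy "entities of mutual interest" notification: each notebook is a
      # set of entity names. All names here are invented.
      from itertools import combinations

      notebooks = {
          "analyst_a": {"Acme Corp", "J. Doe", "Flight 214"},
          "analyst_b": {"J. Doe", "Harbor Facility", "Flight 214"},
          "analyst_c": {"Acme Corp", "Warehouse 9"},
      }

      for (n1, ents1), (n2, ents2) in combinations(notebooks.items(), 2):
          shared = ents1 & ents2
          if shared:
              print(f"notify {n1} and {n2}: mutual interest in {sorted(shared)}")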

  8. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  9. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.
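
    As a rough illustration of what extending a core schema looks like in practice, the sketch below creates one invented detection-style table keyed back to a core arrival table; the table and column names are hypothetical stand-ins, not the eight tables the document actually defines:

      # Hypothetical sketch of extending a core schema with an
      # infrasound-processing table; names and columns are invented.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE arrival (              -- stand-in for a core CSS3.0 table
          arid INTEGER PRIMARY KEY,
          sta  TEXT,
          time REAL
      );
      CREATE TABLE infra_detection (      -- invented extension table
          detid     INTEGER PRIMARY KEY,
          arid      INTEGER REFERENCES arrival(arid),
          back_az   REAL,                 -- back azimuth, degrees
          trace_vel REAL                  -- trace velocity, km/s
      );
      """)
      conn.execute("INSERT INTO arrival VALUES (1, 'I57US', 1492128000.0)")
      conn.execute("INSERT INTO infra_detection VALUES (10, 1, 214.5, 0.34)")
      print(conn.execute("""SELECT a.sta, d.back_az FROM infra_detection d
                            JOIN arrival a ON a.arid = d.arid""").fetchone())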

  10. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

    Full Text Available The simulation of a rubber shock absorber for a machine tool was studied. A simple material model of the rubber was obtained through the finite element analysis software ABAQUS. The compression speed and the hardness of the rubber material were considered in order to obtain the deformation law of the rubber shock absorber. The locations prone to fatigue were identified from the simulation results. The results show that the fatigue-prone positions are distributed in the corners of the shock absorber, that the degree of deformation increases with increasing compression speed, and that the hardness of the rubber material is proportional to the deformation.
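
    For a sense of what a simple rubber material model evaluates, the sketch below computes the uniaxial nominal stress of an incompressible neo-Hookean solid, P = mu*(lambda - lambda^-2); the shear modulus and stretch values are invented, and the study's actual ABAQUS model may differ:

      # Uniaxial nominal stress of an incompressible neo-Hookean rubber.
      # The shear modulus is a hypothetical value for a soft rubber.
      MU = 0.6  # MPa, assumed shear modulus

      def nominal_stress(stretch):
          return MU * (stretch - stretch**-2)

      for lam in (0.8, 1.0, 1.2, 1.5, 2.0):  # lam < 1 is compression
          print(f"stretch {lam:.1f}: stress {nominal_stress(lam):7.3f} MPa")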

  11. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. The integration of multi-omics data, covering the genome, transcriptome, proteome, and metabolome, with mathematical models is expected to expand our knowledge of complex plant metabolism.
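
    A minimal sketch of one step such integrated analyses often start from: thresholding a correlation matrix of omics profiles into a network and reading off the most connected nodes; the profiles and the cutoff below are invented for illustration:

      # Toy correlation network over made-up metabolite profiles.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      profiles = {f"metab_{i}": rng.normal(size=12) for i in range(6)}
      names = list(profiles)

      graph = nx.Graph()
      graph.add_nodes_from(names)
      for i, a in enumerate(names):
          for b in names[i + 1:]:
              r = np.corrcoef(profiles[a], profiles[b])[0, 1]
              if abs(r) > 0.6:          # illustrative cutoff
                  graph.add_edge(a, b, weight=r)

      hubs = sorted(graph.degree, key=lambda kv: kv[1], reverse=True)
      print("node degrees, most connected first:", hubs)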

  12. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate...... the quantitative phylogenetic and functional compositions of metagenomes, to compare compositions of multiple metagenomes and to produce intuitive visual representations of such analyses. AVAILABILITY: SmashCommunity is freely available at http://www.bork.embl.de/software/smash CONTACT: bork@embl.de....

  13. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force manufacturers to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.
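
    The life-cycle bookkeeping described above can be boiled down, for illustration, to a one-line degradation model; the sketch assumes simple exponential power loss with an invented annual degradation factor, not SAVANT's radiation and contamination physics:

      # Toy beginning-of-life vs. end-of-life power estimate for an array.
      # BOL power and degradation rate are invented; SAVANT models the
      # underlying environmental effects explicitly.
      BOL_POWER_W = 5000.0
      ANNUAL_DEGRADATION = 0.025  # 2.5% per year, hypothetical

      def power_after(years):
          return BOL_POWER_W * (1.0 - ANNUAL_DEGRADATION) ** years

      for yr in (0, 5, 10, 15):
          print(f"year {yr:2d}: {power_after(yr):7.1f} W")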

  14. Energy life-cycle analysis modeling and decision support tool

    Energy Technology Data Exchange (ETDEWEB)

    Hoza, M.; White, M.E.

    1993-06-01

    As one of DOE's five multi-program national laboratories, Pacific Northwest Laboratory (PNL) develops and deploys technology for national missions in energy and the environment. The Energy Information Systems Group, within the Laboratory's Computer Sciences Department, focuses on the development of the computational and data communications infrastructure and automated tools for the Transmission and Distribution energy sector and for advanced process engineering applications. The energy industry is being forced to operate in new ways and under new constraints. It is in a reactive mode, reacting to policies and politics, and to economics and environmental pressures. The transmission and distribution sectors are being forced to find new ways to maximize the use of their existing infrastructure, increase energy efficiency, and minimize environmental impacts, while continuing to meet the demands of an ever increasing population. The creation of a sustainable energy future will be a challenge for both the soft and hard sciences. It will require that we as creators of our future be bold in the way we think about our energy future and aggressive in its development. The development of tools to help bring about a sustainable future will not be simple either. The development of ELCAM, for example, represents a stretch for the computational sciences as well as for each of the domain sciences such as economics, which will have to be team members.

  15. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    Science.gov (United States)

    Samareh, Jamshid A.

    2009-01-01

    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language, a platform-independent, open-source language, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.
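
    A minimal sketch of the module-oriented, object-oriented structure the abstract describes: invented discipline modules share one interface so a framework can run them in sequence over shared state (SAPE's real module interfaces are more elaborate):

      # Toy discipline-module pipeline in the spirit of SAPE's design.
      # Module names, state keys, and the placeholder relations are invented.
      class AnalysisModule:
          name = "base"
          def run(self, state: dict) -> None:
              raise NotImplementedError

      class Trajectory(AnalysisModule):
          name = "trajectory"
          def run(self, state):
              # placeholder relation, not a real aerothermal correlation
              state["peak_heating"] = 0.5 * state["entry_velocity"] ** 2

      class ThermalProtection(AnalysisModule):
          name = "tps"
          def run(self, state):
              state["tps_thickness"] = 0.1 * state["peak_heating"]  # placeholder sizing

      def run_pipeline(modules, state):
          for module in modules:
              module.run(state)
              print(f"after {module.name}: {state}")

      run_pipeline([Trajectory(), ThermalProtection()], {"entry_velocity": 7.0})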

  16. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0206

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  17. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0223

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  18. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0204

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  19. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0227

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  20. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0255

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  1. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0226

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  2. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0215

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  3. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0207

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  4. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0214

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  5. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0209

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  6. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0212

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  7. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0231

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  8. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0222

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  9. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0201

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  10. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0215

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  11. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0253

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  12. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0208

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  13. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0217

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  14. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0210

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  15. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0219

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  16. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0221

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  17. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0209

    Science.gov (United States)

    Ocuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  18. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0220

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  19. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0217

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  20. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0224

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  1. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0216

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.; Lytle, John

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  2. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0205

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  3. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0228

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  4. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0213

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  5. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0221

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  6. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0213

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  7. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0212

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  8. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0225

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  9. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0216

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  10. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0211

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  11. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0210

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  12. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0230

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  13. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0214

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  14. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0202

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  15. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0218

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  16. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_IM_KP_0203

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  17. International Solar-Terrestrial Program Key Parameter Visualization Tool Data: USA_NASA_DDF_ISTP_KP_0229

    Science.gov (United States)

    Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.

    2001-01-01

    The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.

  18. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and that only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
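
    Since RDML files are zip archives containing an XML document, a minimal reader needs only the standard library. The sketch below lists target identifiers from an RDML file; the internal file name 'rdml_data.xml' and the element naming follow the published RDML schema and should be treated as assumptions, not a tested parser.

        # Hedged sketch: list qPCR target ids from an RDML archive.
        import zipfile
        import xml.etree.ElementTree as ET

        def list_targets(rdml_path):
            with zipfile.ZipFile(rdml_path) as zf:
                root = ET.fromstring(zf.read('rdml_data.xml'))
            # RDML elements carry a versioned namespace; compare local names only.
            return [el.attrib.get('id') for el in root.iter()
                    if el.tag.rsplit('}', 1)[-1] == 'target']

        print(list_targets('experiment.rdml'))  # 'experiment.rdml' is a placeholder path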

  19. Development of the Power Simulation Tool for Energy Balance Analysis of Nanosatellites

    Directory of Open Access Journals (Sweden)

    Eun-Jung Kim

    2017-09-01

    The energy balance in a satellite needs to be designed properly for the satellite to safely operate and carry out successive missions on orbit. In this study, an analysis program was developed using the MATLABⓇ graphical user interface (GUI) for nanosatellites. This program was used in a simulation to confirm the generated power, consumed power, and battery power of the satellites in orbit, and its performance was verified by applying different satellite operational modes and units. For data transmission, STKⓇ-MATLABⓇ connectivity was used to send the generated power from STKⓇ to MATLABⓇ automatically. Moreover, this program is general-purpose; therefore, it can be applied to nanosatellites that have missions or shapes that are different from those of the satellites in this study. This power simulation tool could be used not only to calculate a suitable power budget when developing the power systems, but also to analyze the remaining energy balance in the satellites.

  20. Development of the Power Simulation Tool for Energy Balance Analysis of Nanosatellites

    Science.gov (United States)

    Kim, Eun-Jung; Sim, Eun-Sup; Kim, Hae-Dong

    2017-09-01

    The energy balance in a satellite needs to be designed properly for the satellite to safely operate and carry out successive missions on orbit. In this study, an analysis program was developed using the MATLABⓇ graphical user interface (GUI) for nanosatellites. This program was used in a simulation to confirm the generated power, consumed power, and battery power of the satellites in orbit, and its performance was verified by applying different satellite operational modes and units. For data transmission, STKⓇ-MATLABⓇ connectivity was used to send the generated power from STKⓇ to MATLABⓇ automatically. Moreover, this program is general-purpose; therefore, it can be applied to nanosatellites that have missions or shapes that are different from those of the satellites in this study. This power simulation tool could be used not only to calculate a suitable power budget when developing the power systems, but also to analyze the remaining energy balance in the satellites.
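
    The energy-balance bookkeeping described in the abstract reduces to integrating generated minus consumed power into the battery state of charge, mode by mode. The Python sketch below shows that arithmetic only; the tool itself is a MATLAB GUI fed by STK power data, and all numbers here are made-up illustrative values.

        # Hedged energy-balance sketch; mode powers and capacity are invented.
        def battery_profile(gen_w, modes, capacity_wh, dt_s=60.0, eff=0.9, soc0=1.0):
            """Integrate (generated - consumed) power into battery state of charge."""
            loads_w = {'idle': 2.0, 'comm': 8.0, 'payload': 12.0}
            soc, profile = soc0, []
            for p_gen, mode in zip(gen_w, modes):
                net_wh = (p_gen * eff - loads_w[mode]) * dt_s / 3600.0
                soc = min(1.0, max(0.0, soc + net_wh / capacity_wh))
                profile.append(soc)
            return profile

        # One sunlit/eclipse-like cycle: generation on for half the samples.
        gen = [10.0] * 45 + [0.0] * 45
        modes = ['payload'] * 30 + ['idle'] * 60
        print(min(battery_profile(gen, modes, capacity_wh=20.0)))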

  1. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli and, on the other hand, the improvement of TFBS prediction in microbes. Finally, we use visualisation of multivariate techniques to highlight the interplay between position and sequence information for effective transcription regulation.
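
    One of the regularity analyses described above asks whether a set of co-regulated genes sits at roughly periodic positions along the chromosome. A hedged sketch of such a period scan (not the GREAT:SCAN implementation) scores how concentrated gene positions are modulo each candidate period:

        # Hedged period-scan sketch over gene start positions (toy data).
        import numpy as np

        def period_score(positions, period):
            """Resultant length of position phases modulo the period (0..1)."""
            phases = 2 * np.pi * (np.asarray(positions) % period) / period
            return np.hypot(np.cos(phases).mean(), np.sin(phases).mean())

        positions = [12_000, 112_500, 212_300, 312_800, 413_100]  # toy regulon
        periods = np.linspace(50_000, 150_000, 201)
        scores = [period_score(positions, p) for p in periods]
        print(f"best period ~ {periods[int(np.argmax(scores))]:.0f} bp")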

  2. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Background: Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results: We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
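
    The two core quantities in such an analysis are the log2 test/reference intensity ratio per clone and the breakpoints where the mean ratio shifts. The sketch below computes ratios and a single best breakpoint by brute force; CGHPRO itself uses Circular Binary Segmentation and Hidden Markov Models, so this illustrates the quantities, not its algorithms.

        # Hedged sketch: log2 ratios and one mean-shift breakpoint (toy data).
        import numpy as np

        def log2_ratios(test, ref):
            return np.log2(np.asarray(test, float) / np.asarray(ref, float))

        def best_breakpoint(r):
            """Index splitting the profile into two segments with maximal mean shift."""
            r = np.asarray(r)
            scores = [abs(r[:i].mean() - r[i:].mean()) for i in range(1, len(r))]
            return 1 + int(np.argmax(scores))

        ratios = log2_ratios([980, 1020, 990, 510, 495, 505], [1000] * 6)
        print(best_breakpoint(ratios))  # -> 3, where a single-copy loss begins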

  3. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) system of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain-specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
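
    The filtering-plus-triggering-rule scheme can be sketched generically: match patterns against a message stream and fire an action when a pattern repeats past a threshold. The rule structure below is hypothetical; the actual Message Analyzer uses its own domain-specific language and C++ plugins.

        # Hedged sketch of pattern matching with count-threshold triggering rules.
        import re
        from collections import Counter

        RULES = [
            # (compiled pattern, repeat threshold, action to take) - invented rules
            (re.compile(r'buffer overflow on (\w+)'), 3, 'pause run'),
            (re.compile(r'timeout from (\w+)'), 5, 'mark node suspect'),
        ]

        def analyze(messages):
            counts = Counter()
            for msg in messages:
                for pattern, threshold, action in RULES:
                    m = pattern.search(msg)
                    if m:
                        key = (pattern.pattern, m.group(1))
                        counts[key] += 1
                        if counts[key] == threshold:
                            yield f'{action}: {m.group(1)}'

        log = ['timeout from dcm07'] * 5 + ['buffer overflow on evb02'] * 3
        print(list(analyze(log)))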

  4. Net energy analysis: Powerful tool for selecting electric power options

    Science.gov (United States)

    Baron, S.

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for the energy planners considering their electric power options in the future.

  5. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  6. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for the energy planners considering their electric power options in the future.

  7. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain-specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  8. Factor analysis of the transcultural self-efficacy tool (TSET).

    Science.gov (United States)

    Jeffreys, Marianne R.; Dogan, Enis

    2010-01-01

    The factor structure of the Transcultural Self-Efficacy Tool (TSET) was analyzed using data from 272 culturally diverse undergraduate nursing students. The TSET is a questionnaire designed to measure students' confidence for performing general transcultural nursing skills among diverse client populations. Using the most recent imputation techniques for missing data, the researchers demonstrate how common exploratory factor analysis (CEFA)--as opposed to principal components analysis--can (and should) be used in examining the factorial composition of the tool. Standard errors for factor loadings were computed and utilized in deciding whether a given item loaded significantly on a factor and whether the differences between the factor loadings of two or more items on the same factor were statistically significant. The CEFA, comprising 69 of the 83 items, yielded four factors--"Knowledge and Understanding," "Interview," "Awareness, Acceptance, and Appreciation," and "Recognition"--with internal consistency ranging from .94 to .98. Reliability of the total instrument was .99. It was concluded that the present CEFA study continues to support the claim that the TSET assesses the multidimensional nature of transcultural self-efficacy while also differentiating between three types of learning: cognitive, practical, and affective. This support allows the researcher/educator to move beyond mere assessment to the design, implementation, and evaluation of diagnostic-prescriptive teaching strategies for cultural competence education.
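
    For readers unfamiliar with the distinction the authors draw, common-factor analysis differs from principal components in that only shared variance (communalities on the diagonal) is factored. Below is a minimal principal-axis sketch on random placeholder data; it is not the TSET data or the authors' exact procedure.

        # Hedged principal-axis factoring sketch; data are random placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        items = rng.normal(size=(272, 10))           # 272 respondents, 10 items
        corr = np.corrcoef(items, rowvar=False)

        # Replace the diagonal with communality estimates (squared multiple
        # correlations), then eigendecompose the reduced correlation matrix.
        smc = 1.0 - 1.0 / np.diag(np.linalg.inv(corr))
        reduced = corr.copy()
        np.fill_diagonal(reduced, smc)
        vals, vecs = np.linalg.eigh(reduced)
        order = np.argsort(vals)[::-1][:4]           # keep four factors
        loadings = vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
        print(np.round(loadings, 2))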

  9. A tool for finite element deflection analysis of wings

    Energy Technology Data Exchange (ETDEWEB)

    Carlen, Ingemar

    2005-03-01

    A first version (ver 0.1) of a new tool for finite element deflection analysis of wind turbine blades is presented. The software is called SOLDE (SOLid blaDE), and was developed as a Matlab shell around the free finite element codes CGX (GraphiX, the pre-processor) and CCX (CrunchiX, the solver). In the present report a brief description of SOLDE is given, followed by a basic user's guide. The main features of SOLDE are: - Deflection analysis of wind turbine blades, including 3D effects and warping. - Accurate prediction of eigenmodes and eigenfrequencies. - Derivation of 2-node slender elements for use in various aeroelastic analyses. The main differences between SOLDE and other similar tools can be summarised as: - SOLDE was developed without a graphical user interface or a traditional text file input deck. Instead, the input is organised as Matlab data structures that have to be formed by a user-provided pre-processor. - SOLDE uses a solid representation of the geometry instead of a thin-shell approximation. The benefit is that the bending-torsion couplings will automatically be correctly captured. However, a drawback with the current version is that the equivalent orthotropic shell idealisation violates the local bending characteristics, which makes the model useless for buckling analyses. - SOLDE includes the free finite element solver CCX, and thus no expensive commercial software (e.g. Ansys or Nastran) is required to produce results.
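
    The "2-node slender element" idea can be illustrated with the textbook Euler-Bernoulli beam element; the sketch below assembles a cantilever and recovers the analytic tip deflection PL^3/3EI. This is only the classical element, not SOLDE's derivation from a solid model, and all numbers are illustrative.

        # Hedged sketch: 2-node beam elements, cantilever tip-deflection check.
        import numpy as np

        def beam_k(EI, L):
            """4x4 stiffness of one element (DOFs: w1, theta1, w2, theta2)."""
            return EI / L**3 * np.array([
                [ 12,    6*L,    -12,    6*L   ],
                [ 6*L,   4*L**2, -6*L,   2*L**2],
                [-12,   -6*L,     12,   -6*L   ],
                [ 6*L,   2*L**2, -6*L,   4*L**2]])

        n, L, EI, P = 8, 2.0, 5.0e4, 100.0       # elements, length, stiffness, tip load
        le, ndof = L / n, 2 * (n + 1)
        K = np.zeros((ndof, ndof))
        for e in range(n):                       # overlap element matrices
            K[2*e:2*e+4, 2*e:2*e+4] += beam_k(EI, le)
        f = np.zeros(ndof)
        f[2*n] = P                               # transverse load at the free tip
        free = slice(2, None)                    # clamp w and theta at node 0
        w = np.zeros(ndof)
        w[free] = np.linalg.solve(K[free, free], f[free])
        print(w[2*n], P * L**3 / (3 * EI))       # FE tip deflection vs beam theory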

  10. metaSNV: A tool for metagenomic strain level analysis.

    Science.gov (United States)

    Costea, Paul Igor; Munch, Robin; Coelho, Luis Pedro; Paoli, Lucas; Sunagawa, Shinichi; Bork, Peer

    2017-01-01

    We present metaSNV, a tool for single nucleotide variant (SNV) analysis in metagenomic samples, capable of comparing populations of thousands of bacterial and archaeal species. The tool uses as input nucleotide sequence alignments to reference genomes in standard SAM/BAM format, performs SNV calling for individual samples and across the whole data set, and generates various statistics for individual species including allele frequencies and nucleotide diversity per sample as well as distances and fixation indices across samples. Using published data from 676 metagenomic samples of different sites in the oral cavity, we show that the results of metaSNV are comparable to those of MIDAS, an alternative implementation for metagenomic SNV analysis, while data processing is faster and has a smaller storage footprint. Moreover, we implement a set of distance measures that allow the comparison of genomic variation across metagenomic samples and delineate sample-specific variants to enable the tracking of specific strain populations over time. The implementation of metaSNV is available at: http://metasnv.embl.de/.
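
    Two of the per-species statistics named above, per-site allele frequency and nucleotide diversity, are simple functions of allele counts. The sketch shows only that arithmetic on toy counts; metaSNV derives the counts from SAM/BAM alignments.

        # Hedged sketch: major-allele frequency and nucleotide diversity (pi).
        def site_pi(counts):
            """Expected heterozygosity at one site from allele counts {base: n}."""
            n = sum(counts.values())
            if n < 2:
                return 0.0
            hom = sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
            return 1.0 - hom

        sites = [{'A': 18, 'G': 2}, {'C': 10, 'T': 10}, {'G': 20}]  # toy counts
        freqs = [max(c.values()) / sum(c.values()) for c in sites]
        pi = sum(site_pi(c) for c in sites) / len(sites)
        print(freqs, round(pi, 3))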

  11. PLATO A Program Library for the Analysis of 4D Nonlinear Transverse Motion

    CERN Document Server

    Giovannozzi, Massimo; Bazzani, A.; Bartolini, R.

    1998-01-01

    The PLATO (Perturbative Lattice Analysis and Tracking tOols) program, a program library for analyzing four-dimensional betatronic motion in circular particle accelerators, is presented. The routines included in this library provide both the resonant and the nonresonant perturbative series that approximate nonlinear motion (normal forms); standard numerical tools such as the Lyapunov exponent, frequency analysis and evaluation of the dynamic aperture are also available. To ensure the highest flexibility, the code is fully compatible with standard tracking programs commonly used in the accelerator physics community.
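
    The "frequency analysis" ingredient amounts to extracting the betatron tune from turn-by-turn positions. A hedged sketch using a windowed FFT peak follows; PLATO provides refined interpolated estimators on top of this basic idea.

        # Hedged sketch: estimate the tune of synthetic turn-by-turn data.
        import numpy as np

        turns = 1024
        true_tune = 0.31
        x = np.cos(2 * np.pi * true_tune * np.arange(turns))
        spectrum = np.abs(np.fft.rfft(x * np.hanning(turns)))
        tune = np.argmax(spectrum) / turns       # bin k maps to frequency k/turns
        print(f"estimated tune {tune:.4f} vs true {true_tune}")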

  12. Utility Green Pricing Programs: A Statistical Analysis of Program Effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    Wiser, R.; Olson, S.; Bird, L.; Swezey, B.

    2004-02-01

    This report analyzes actual utility green pricing program data to provide further insight into which program features might help maximize both customer participation in green pricing programs and the amount of renewable energy purchased by customers in those programs.

  13. Intrasystem Analysis Program (IAP) Model Improvement.

    Science.gov (United States)

    1982-02-01

    input impedance of ideal, lossless dipole antennas. Both the biconical and cylindrical dipoles, illustrated in Figure 2-2, were studied. Keywords: Systems EMC Analysis; Intrasystem Analysis Code; Frequency ... Antenna Models; Nonlinear Receptor Models; Waveform Sensitive Receptors; Spectral Models; Intrasystem EMC Analysis Program; Antenna Matching Factor; Transmission Line Loss.

  14. Counter Trafficking System Development "Analysis Training Program"

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Dennis C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-12-01

    This document details the training curriculum for the Counter-Trafficking System Development (CTSD) Analysis Modules; the Lesson Plans are derived from United States Military and Department of Energy doctrine and the Lawrence Livermore National Laboratory (LLNL) Global Security (GS) S Program.

  15. Children's Animated TV Programs: A Content Analysis

    Science.gov (United States)

    Lambert, E. Beverley; Clancy, Susan

    2004-01-01

    This study describes the use of content analysis to develop a framework for analysing children's animated television programs (in this case, "Bob the Builder") and as such represents the initial stage of a larger project. Results indicate this popular TV series for preschoolers presents contradictory social messages about the roles of…

  16. The enhanced forest inventory and analysis program

    Science.gov (United States)

    Ronald E. McRoberts

    2005-01-01

    The Agricultural Research, Extension, and Education Reform Act of 1998 (Public Law 105–185), also known as the 1998 Farm Bill, prescribed conceptual changes in approaches to forest inventories conducted by the Forest Inventory and Analysis (FIA) Program of the U.S. Department of Agriculture (USDA) Forest Service. Realization of these conceptual changes required...

  17. ELECTRA © Launch and Re-Entry Safety Analysis Tool

    Science.gov (United States)

    Lazare, B.; Arnal, M. H.; Aussilhou, C.; Blazquez, A.; Chemama, F.

    2010-09-01

    The French Space Operation Act sets the protection of people, property, public health and the environment as the prime objective of the national technical regulations. In this frame, an independent technical assessment of French space operations is delegated to CNES. To perform this task, and also for its own operations, CNES needs efficient state-of-the-art tools for evaluating risks. The development of the ELECTRA© tool, undertaken in 2007, meets the requirement for precise quantification of the risks involved in the launch and re-entry of spacecraft. The ELECTRA© project draws on the proven expertise of CNES technical centers in the fields of flight analysis and safety, spaceflight dynamics and spacecraft design. The ELECTRA© tool was specifically designed to evaluate the risks involved in the re-entry and return to Earth of all or part of a spacecraft. It will also be used for locating and visualizing nominal or accidental re-entry zones while comparing them with suitable geographic data such as population density, urban areas, and shipping lines, among others. The method chosen for ELECTRA© consists of two main steps: calculating the possible re-entry trajectories for each fragment after the spacecraft breaks up, and calculating the risks while taking into account the energy of the fragments, the population density and the protection afforded by buildings. For launch operations and active re-entry, the risk calculation is weighted by the probability of instantaneous failure of the spacecraft and integrated over the whole trajectory. ELECTRA©'s development is today at the end of the validation phase, the last step before delivery to users. The validation process has been performed in several ways: numerical checks of the risk formulation; benchmarking of the casualty areas, fragment entry energy levels and the protection levels of housing; comparison with best practices in the space transportation industry concerning dependability evaluation; and a benchmarking process for
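
    The risk bookkeeping sketched in the abstract combines fragment casualty areas, population density and the probability of failure into an expected-casualty figure. The Python sketch below shows that weighting with invented numbers; it is not ELECTRA©'s formulation.

        # Hedged sketch of expected-casualty weighting; all values invented.
        def expected_casualties(fragments, pop_density_km2, p_failure):
            """E[casualties] = Pfail * density * sum of effective casualty areas."""
            area_km2 = sum(lethality * area_m2 / 1e6
                           for area_m2, lethality in fragments)
            return p_failure * pop_density_km2 * area_km2

        frags = [(0.8, 1.0), (0.3, 0.5), (1.5, 1.0)]   # (m^2, lethality weight)
        print(expected_casualties(frags, pop_density_km2=60.0, p_failure=1e-3))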

  18. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and the modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advances in design studies and system-level assessment, including cluster analysis as an annualization standard for air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  19. A Software Tool to Visualize Verbal Protocols to Enhance Strategic and Metacognitive Abilities in Basic Programming

    Directory of Open Access Journals (Sweden)

    Carlos A. Arévalo

    2011-07-01

    Full Text Available Learning to program is difficult for many first-year undergraduate students. Instructional strategies in traditional programming courses tend to focus on syntactic issues, assigning practice exercises using the presentation-examples-practice formula and showing the teacher's verbal and visual explanation during the "step by step" process of writing a computer program. The cognitive literature on the mental processes involved in programming suggests that explicitly teaching certain aspects, such as mental models, strategic knowledge, and metacognitive abilities, is critical to learning how to write and assemble the pieces of a computer program. Verbal protocols are often used in software engineering as a technique to record the short-term cognitive processes of a user or expert in evaluation or problem-solving scenarios. We argue that verbal protocols can be used as a mechanism to explicitly show an instructor's strategic and metacognitive process when writing a program. In this paper we present an information system prototype developed to store and visualize worked examples derived from transcribed verbal protocols recorded during the writing of introductory-level programs. Empirical data comparing the grades obtained by two groups of novice programming students, analyzed using ANOVA, indicate a statistically significant difference in performance in favor of the group using the tool, although these results cannot yet be extrapolated to the general population, given the reported limitations of this study.
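
    For readers unfamiliar with the statistical test mentioned above, a one-way ANOVA comparison of two groups takes only a few lines of Python; the sketch below uses made-up grades, not the study's data:

```python
from scipy import stats

# Hypothetical final grades for a control group and a group using the tool.
control = [2.8, 3.1, 2.5, 3.0, 2.7, 3.3, 2.9]
tool_users = [3.4, 3.8, 3.1, 3.6, 3.9, 3.2, 3.5]

f_stat, p_value = stats.f_oneway(control, tool_users)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A p-value below the chosen significance level (e.g. 0.05) indicates a
# statistically significant difference in mean performance between groups.
```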

  20. Designing Abstractions for JavaScript Program Analysis

    DEFF Research Database (Denmark)

    Andreasen, Esben Sparre

    JavaScript is a widely used dynamic programming language. What started out as a client-side scripting language for browsers is now used for large applications in many different settings. As with other dynamic languages, JavaScript makes it easy to write programs quickly without being constrained...... by the language, and programmers exploit that power to write highly dynamic programs. Automated tools for helping programmers and optimizing programs are used successfully for many programming languages. Unfortunately, the automated tools for JavaScript are not as good as for other programming languages....... The program analyses that the automated tools are built upon are poorly suited to deal with the highly dynamic nature of JavaScript programs. The lack of language restrictions on the programmer is detrimental to the quality of program analyses for JavaScript. The aim of this dissertation is to address...

  1. Simulation for Prediction of Entry Article Demise (SPEAD): An Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    Science.gov (United States)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
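
    The coupling of trajectory propagation, aerothermal heating, and node-failure evaluation that such a tool must iterate can be caricatured in a few lines; the sketch below is a deliberately crude single-fragment, single-node model with invented constants, not SPEAD's methodology:

```python
import math

# Deliberately crude 1-D re-entry of a single fragment: exponential
# atmosphere, ballistic drag, a toy stagnation heating rate, and a single
# lumped thermal node that "fails" (demises) above a threshold temperature.
dt = 0.1                               # time step, s
h, v, t = 100e3, 7500.0, 0.0           # altitude (m), speed (m/s), time (s)
beta = 300.0                           # ballistic coefficient m/(Cd*A), kg/m^2
mass, cp = 10.0, 900.0                 # node mass (kg), specific heat (J/kg/K)
area, T, T_fail = 0.05, 300.0, 1800.0  # heated area (m^2), temperature (K)
sin_gamma = 0.1                        # shallow flight-path angle (~6 deg)

while h > 0 and T < T_fail:
    rho = 1.225 * math.exp(-h / 7200.0)      # exponential atmosphere
    v -= (0.5 * rho * v * v / beta) * dt     # drag deceleration (gravity ignored)
    q_dot = 1.0e-4 * math.sqrt(rho) * v**3   # heating rate, W/m^2 (toy constant)
    T += q_dot * area / (mass * cp) * dt     # lumped-node temperature rise
    h -= v * sin_gamma * dt                  # descend along the flight path
    t += dt

status = "node failed (demise)" if T >= T_fail else "survived to the ground"
print(f"t={t:.0f} s  h={h / 1000:.1f} km  v={v:.0f} m/s  T={T:.0f} K -> {status}")
```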

  2. The Cornell Cooperative Extension Statewide Data Collection System: An Online Data Collection Tool for Parent Education Programs

    Science.gov (United States)

    Kopko, Kimberly; Dunifon, Rachel

    2012-01-01

    The Statewide Data Collection System for Parent Education Programs is an online tool for collecting statewide data on Cornell Cooperative Extension (CCE) parenting education programs. The process of developing and using this data collection tool is provided as a guide to Extension systems. Results for data entered between March 2009 and…

  3. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the rationale for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results obtained show good agreement with previous literature on hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.
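
    Mineral characterization of hydrosilicates of the kind mentioned above typically rests on continuum-removed band-depth indices; the sketch below computes one such index with generic NumPy code (the wavelengths and reflectances are invented, and this is not the PlanetServer API):

```python
import numpy as np

# Generic continuum-removed band-depth calculation, a standard index used to
# characterize hydrated minerals in hyperspectral spectra (illustrative only).
wavelengths = np.array([2.10, 2.14, 2.21, 2.26, 2.30])   # micrometres
reflectance = np.array([0.31, 0.29, 0.24, 0.28, 0.30])   # hypothetical spectrum

def band_depth(wl, refl, left, center, right):
    """1 - R_center / R_continuum, with the continuum interpolated
    linearly between the left and right shoulder wavelengths."""
    r_left = np.interp(left, wl, refl)
    r_right = np.interp(right, wl, refl)
    r_center = np.interp(center, wl, refl)
    r_cont = np.interp(center, [left, right], [r_left, r_right])
    return 1.0 - r_center / r_cont

# Band depth near 2.21 um, e.g. for Al-OH-bearing phases such as kaolinite.
print(f"BD2210 = {band_depth(wavelengths, reflectance, 2.14, 2.21, 2.26):.3f}")
```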

  4. The FTS atomic spectrum tool (FAST) for rapid analysis of line spectra

    Science.gov (United States)

    Ruffoni, M. P.

    2013-07-01

    The FTS Atomic Spectrum Tool (FAST) is an interactive graphical program designed to simplify the analysis of atomic emission line spectra obtained from Fourier transform spectrometers. Calculated, predicted and/or known experimental line parameters are loaded alongside experimentally observed spectral line profiles for easy comparison between new experimental data and existing results. Many such line profiles, which could span numerous spectra, may be viewed simultaneously to help the user detect problems from line blending or self-absorption. Once the user has determined that their experimental line profile fits are good, a key feature of FAST is the ability to calculate atomic branching fractions, transition probabilities, and oscillator strengths, and their uncertainties, which is not provided by existing analysis packages.

    Program Summary
    Program title: FAST: The FTS Atomic Spectrum Tool
    Catalogue identifier: AEOW_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOW_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 293058
    No. of bytes in distributed program, including test data, etc.: 13809509
    Distribution format: tar.gz
    Programming language: C++
    Computer: Intel x86-based systems
    Operating system: Linux/Unix/Windows
    RAM: 8 MB minimum; about 50-200 MB for a typical analysis
    Classification: 2.2, 2.3, 21.2
    Nature of problem: Visualisation of atomic line spectra, including the comparison of theoretical line parameters with experimental atomic line profiles; accurate intensity calibration of experimental spectra; and the determination of observed relative line intensities that are needed for calculating atomic branching fractions and oscillator strengths.
    Solution method: FAST is centred around a graphical interface, where a user may view sets of experimental line profiles and compare

  5. Teaching Joint-Level Robot Programming with a New Robotics Software Tool

    Directory of Open Access Journals (Sweden)

    Fernando Gonzalez

    2017-12-01

    Full Text Available With the rising popularity of robotics in our modern world there is an increase in the number of engineering programs that offer the basic Introduction to Robotics course. This common introductory robotics course generally covers the fundamental theory of robotics including robot kinematics, dynamics, differential movements, trajectory planning and basic computer vision algorithms commonly used in the field of robotics. Joint programming, the task of writing a program that directly controls the robot’s joint motors, is an activity that involves robot kinematics, dynamics, and trajectory planning. In this paper, we introduce a new educational robotics tool developed for teaching joint programming. The tool allows the student to write a program in a modified C language that controls the movement of the arm by controlling the velocity of each joint motor. This is a very important activity in the robotics course and leads the student to gain knowledge of how to build a robotic arm controller. Sample assignments are presented for different levels of difficulty.
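
    To give a feel for joint-level velocity programming of this kind, the sketch below drives a single joint through a trapezoidal velocity profile; it is written in Python for illustration, whereas the tool described above uses a modified C language:

```python
# Trapezoidal velocity profile for a single robot joint: accelerate at a
# constant rate, cruise, then decelerate, integrating position each tick.

def trapezoid_velocity(t, t_total, v_max, t_ramp):
    """Commanded joint velocity (rad/s) at time t."""
    if t < t_ramp:                      # acceleration phase
        return v_max * t / t_ramp
    if t > t_total - t_ramp:            # deceleration phase
        return v_max * (t_total - t) / t_ramp
    return v_max                        # cruise phase

dt, t_total, v_max, t_ramp = 0.01, 2.0, 1.0, 0.5
angle, t = 0.0, 0.0
while t < t_total:
    v = trapezoid_velocity(t, t_total, v_max, t_ramp)
    angle += v * dt                     # integrate velocity into joint angle
    t += dt

print(f"Final joint angle: {angle:.3f} rad")  # ~ v_max * (t_total - t_ramp)
```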

  6. Implementation of a tool to modify behavior in a chronic disease management program.

    Science.gov (United States)

    Gillespie, Nicole D; Lenz, Thomas L

    2011-01-01

    Chronic diseases like diabetes, hypertension, and dyslipidemia continue to be a significant burden on the US health care system. As a result, many healthcare providers are implementing strategies to prevent the incidence of heart disease and other chronic conditions. Among these strategies are proper drug therapy and lifestyle modifications. Behavior change is often the rate-limiting step in the prevention and maintenance of lifestyle modifications. The purpose of this paper is to describe a tool used to guide the progression and assess the effectiveness of a cardiovascular risk reduction program. The tool uses the Transtheoretical Model of Behavior Change to determine the readiness and confidence to change specific lifestyle behaviors pertinent to cardiovascular health. The tool aids the practitioner in developing a patient-centered plan to implement and maintain lifestyle changes and can be tailored to use in any situation requiring a behavior change on the part of the patient.

  7. A Proposal for a Standard Interface Between Monte Carlo Tools And One-Loop Programs

    Energy Technology Data Exchange (ETDEWEB)

    Binoth, T.; /Edinburgh U.; Boudjema, F.; /Annecy, LAPP; Dissertori, G.; Lazopoulos, A.; /Zurich, ETH; Denner, A.; /PSI, Villigen; Dittmaier, S.; /Freiburg U.; Frederix, R.; Greiner, N.; Hoeche, Stefan; /Zurich U.; Giele, W.; Skands, P.; Winter, J.; /Fermilab; Gleisberg, T.; /SLAC; Archibald, J.; Heinrich, G.; Krauss, F.; Maitre, D.; /Durham U., IPPP; Huber, M.; /Munich, Max Planck Inst.; Huston, J.; /Michigan State U.; Kauer, N.; /Royal Holloway, U. of London; Maltoni, F.; /Louvain U., CP3 /Milan Bicocca U. /INFN, Turin /Turin U. /Granada U., Theor. Phys. Astrophys. /CERN /NIKHEF, Amsterdam /Heidelberg U. /Oxford U., Theor. Phys.

    2011-11-11

    Many highly developed Monte Carlo tools for the evaluation of cross sections based on tree matrix elements exist and are used by experimental collaborations in high energy physics. As the evaluation of one-loop matrix elements has recently been undergoing enormous progress, the combination of one-loop matrix elements with existing Monte Carlo tools is on the horizon. This would lead to phenomenological predictions at the next-to-leading order level. This note summarises the discussion of the next-to-leading order multi-leg (NLM) working group on this issue which has been taking place during the workshop on Physics at TeV Colliders at Les Houches, France, in June 2009. The result is a proposal for a standard interface between Monte Carlo tools and one-loop matrix element programs.

  8. Implementation of a Tool to Modify Behavior in a Chronic Disease Management Program

    Directory of Open Access Journals (Sweden)

    Nicole D. Gillespie

    2011-01-01

    Full Text Available Chronic diseases like diabetes, hypertension, and dyslipidemia continue to be a significant burden on the US health care system. As a result, many healthcare providers are implementing strategies to prevent the incidence of heart disease and other chronic conditions. Among these strategies are proper drug therapy and lifestyle modifications. Behavior change is often the rate-limiting step in the prevention and maintenance of lifestyle modifications. The purpose of this paper is to describe a tool used to guide the progression and assess the effectiveness of a cardiovascular risk reduction program. The tool uses the Transtheoretical Model of Behavior Change to determine the readiness and confidence to change specific lifestyle behaviors pertinent to cardiovascular health. The tool aids the practitioner in developing a patient-centered plan to implement and maintain lifestyle changes and can be tailored to use in any situation requiring a behavior change on the part of the patient.

  9. Gap Analysis of the Nutrition and Health Program at Posyandu in Bogor Regency

    Directory of Open Access Journals (Sweden)

    Ellis Endang Nikmawati

    2012-03-01

    Full Text Available Revitalization of the Integrated Service Post (Posyandu) is successful when it is focused on its main function as a community service institution. This study determined program gaps along the tangibles, reliability, responsiveness, assurance, and empathy dimensions. Exploratory and experimental designs were applied in this study, which was conducted in the Darmaga and Ciomas districts of Bogor Regency from March to August 2008. The data included primary and secondary data. The respondents in the experiment were 240 mothers of children under five years of age and 80 cadres. Gap analysis was used to compare the nutrition and health program services respondents expected with those actually delivered. In total, 96 mothers of under-fives, pregnant women and women of reproductive age, and 16 cadres were involved in this study. The average gap between realization and the standard was -0.75 for tools; the tangibles dimension scored -0.35; reliability -0.10; responsiveness -0.37; assurance -0.44; and empathy -0.47. This means that tool provision reached only 25% of the standard (poor), while the tangibles dimension reached 65% (fair), reliability 90% (good), responsiveness 63% (fair), assurance 56% (poor), and empathy 53% (poor). Key words: posyandu performance, nutrition education, gap analysis

  10. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío

    2014-10-01

    Full Text Available XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define which tool to embed in XJML to execute FSPV.

  11. The Profile Envision and Splicing Tool (PRESTO): Developing an Atmospheric Wind Analysis Tool for Space Launch Vehicles Using Python

    Science.gov (United States)

    Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.

    2017-01-01

    Launch vehicle programs require vertically complete atmospheric wind profiles. Many systems at the ER make the necessary measurements, but all differ in EVR, vertical coverage, and temporal coverage. The MSFC Natural Environments Branch developed a tool, in Python, to create a vertically complete profile from multiple inputs. Forward work: finish formal testing (acceptance testing and end-to-end testing) and formal release.

  12. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and user-programmable, the need for software modifications when configuring it for a particular spacecraft can be reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users: power subsystem engineers sizing power subsystem components; mission planners adjusting mission scenarios using power profiles generated by the model; system engineers performing system-level trade studies using the model's results during the early design phases of a spacecraft; and operations personnel needing high-fidelity modeling of the essential power aspect of the planning picture.
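
    The energy-balance bookkeeping at the heart of any such power simulator can be illustrated with a toy loop; the orbit, array, load, and battery numbers below are hypothetical, and the model is far simpler than MMPAT's:

```python
# Toy energy-balance loop of the kind a spacecraft power simulator iterates:
# array power in and out of eclipse, a constant load, and battery state of
# charge integrated over one day (all numbers are hypothetical).

dt_h = 0.1                        # time step, hours
battery_wh, capacity_wh = 400.0, 800.0
array_power_w, load_w = 300.0, 180.0
charge_eff = 0.92

for step in range(int(24 / dt_h)):          # one day
    t_h = step * dt_h
    in_eclipse = (t_h % 1.6) > 1.0          # crude 96-min orbit with eclipse
    generation = 0.0 if in_eclipse else array_power_w
    net_w = generation - load_w
    if net_w >= 0:                          # charging (with efficiency loss)
        battery_wh += net_w * charge_eff * dt_h
    else:                                   # discharging
        battery_wh += net_w * dt_h
    battery_wh = min(max(battery_wh, 0.0), capacity_wh)

print(f"Battery state of charge after 24 h: {100 * battery_wh / capacity_wh:.1f}%")
```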

  13. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-04-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  14. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute a vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
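
    A minimal sketch of the per-cell scoring idea described above: normalize each metric over the grid, then combine the metrics with weights. The metric names and weights are hypothetical, not Urban-CAT's actual model:

```python
import numpy as np

# Per-cell vulnerability score: min-max normalize each metric over the grid,
# then combine with weights (synthetic data stands in for GIS-derived metrics).
rng = np.random.default_rng(0)
n_cells = 10_000                       # grid cells covering the city
metrics = {                            # one value per cell
    "impervious_fraction": rng.uniform(0, 1, n_cells),
    "elevation_m": rng.uniform(0, 120, n_cells),
    "population_density": rng.lognormal(7, 1, n_cells),
}
weights = {"impervious_fraction": 0.4,
           "elevation_m": -0.2,        # higher ground -> lower flood risk
           "population_density": 0.4}

def normalize(x):
    return (x - x.min()) / (x.max() - x.min())

score = sum(w * normalize(metrics[name]) for name, w in weights.items())
print("Most vulnerable cell:", int(np.argmax(score)),
      "score:", round(float(score.max()), 3))
```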

  15. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  16. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank V [Los Alamos National Laboratory

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with geospatial tools can significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'digital virtual globes' (i.e., Google Earth, Virtual Earth, etc.), which are far better than the simple 2-D plan-view line drawings previously used for visualization of known and suspected facilities of interest, and which can be critical to: (1) site familiarization and true geospatial context awareness; (2) pre-inspection planning; (3) onsite orientation and navigation; (4) post-inspection reporting; (5) site monitoring over time for changes; (6) verification of states' site declarations and input to State Evaluation reports; and (7) a common basis for discussions among all interested parties (Member States). Additionally, as an 'open source,' such virtual globes can also provide a new, essentially free means to conduct broad-area searches for undeclared nuclear sites and activities - either alleged through open-source leads; identified on internet blogs and wiki layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), which can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant

  17. Probabilistic Resource Analysis by Program Transformation

    DEFF Research Database (Denmark)

    Kirkeby, Maja Hanne; Rosendahl, Mads

    2016-01-01

    The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates...... a probability distribution of the resource usage as a possibly uncomputable expression and then transforms it into a closed form expression using over-approximations. We present the technique, outline the implementation and show results from experiments with the system....
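
    The core idea, pushing an input probability distribution forward through a program's cost function, can be shown in a few lines; the program and its cost model below are hypothetical, and the analysis described above derives such distributions symbolically rather than by enumeration:

```python
from collections import defaultdict
from fractions import Fraction

# Pushforward of an input probability distribution through a program's cost
# function -- the core idea of probabilistic resource analysis.

def cost(n):
    return 3 * n + 1          # e.g. a loop executing 3 operations n times

input_dist = {2: Fraction(1, 4), 3: Fraction(1, 2), 5: Fraction(1, 4)}

usage_dist = defaultdict(Fraction)
for n, p in input_dist.items():
    usage_dist[cost(n)] += p   # sum probabilities of inputs with equal cost

for steps, p in sorted(usage_dist.items()):
    print(f"P(resource usage = {steps}) = {p}")
expected = sum(steps * p for steps, p in usage_dist.items())
print("Expected usage:", expected)   # equals 3 * E[n] + 1
```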

  18. Static Analysis of Lockless Microcontroller C Programs

    Directory of Open Access Journals (Sweden)

    Eva Beckschulze

    2012-11-01

    Full Text Available Concurrently accessing shared data without locking is usually subject to race conditions, resulting in inconsistent or corrupted data. However, some programs operate correctly without locking by exploiting the atomicity of certain operations on specific hardware. In this paper, we describe how to precisely analyze lockless microcontroller C programs with interrupts by taking the hardware architecture into account. We evaluate this technique in an octagon-based value range analysis using access-based localization to increase efficiency.

  19. A Program Transformation for Backwards Analysis of Logic Programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2003-01-01

    programs presented here is based on a transformation of the input program, which makes explicit the dependencies of the given program points on the initial goals. The transformation is derived from the resultants semantics of logic programs. The transformed program is then analysed using a standard...... framework and no special properties of the abstract domain....

  20. CGAT: a comparative genome analysis tool for visualizing alignments in the analysis of complex evolutionary changes between closely related genomes

    Directory of Open Access Journals (Sweden)

    Kobayashi Ichizo

    2006-10-01

    Full Text Available Abstract Background The recent accumulation of closely related genomic sequences provides a valuable resource for the elucidation of the evolutionary histories of various organisms. However, although numerous alignment calculation and visualization tools have been developed to date, the analysis of complex genomic changes, such as large insertions, deletions, inversions, translocations and duplications, still presents certain difficulties. Results We have developed a comparative genome analysis tool, named CGAT, which allows detailed comparisons of closely related bacteria-sized genomes mainly through visualizing middle-to-large-scale changes to infer underlying mechanisms. CGAT displays precomputed pairwise genome alignments on both dotplot and alignment viewers with scrolling and zooming functions, and allows users to move along the pre-identified orthologous alignments. Users can place several types of information on this alignment, such as the presence of tandem repeats or interspersed repetitive sequences and changes in G+C contents or codon usage bias, thereby facilitating the interpretation of the observed genomic changes. In addition to displaying precomputed alignments, the viewer can dynamically calculate the alignments between specified regions; this feature is especially useful for examining the alignment boundaries, as these boundaries are often obscure and can vary between programs. Besides the alignment browser functionalities, CGAT also contains an alignment data construction module, which contains various procedures that are commonly used for pre- and post-processing for large-scale alignment calculation, such as the split-and-merge protocol for calculating long alignments, chaining adjacent alignments, and ortholog identification. Indeed, CGAT provides a general framework for the calculation of genome-scale alignments using various existing programs as alignment engines, which allows users to compare the outputs of different

  1. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Stregy, Seth [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Dasilva, Ana [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Yilmaz, Serkan [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Saha, Pradip [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Loewen, Eric [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States)

    2015-10-29

    This report provides a broad historical review of EM Pump development and details of MATRIX development under this project. It summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. The report covers Tasks 1, 3, and 4 of the project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model within Task 4, the analysis code was updated and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the improved model can evaluate are: space constraints; voltage capability of the insulation system; maximum flux density through iron; flow rate and outlet pressure; and efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: the status of analysis model development; improvements made to older simulations; and comparison with experimental data.

  2. Analysis methodology and development of a statistical tool for biodistribution data from internal contamination with actinides.

    Science.gov (United States)

    Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne

    2017-03-01

    The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. These experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever-increasing number of datasets and diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized using the same table format. Analysis outputs can be written in text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied using data derived from wound contamination experiments conducted under different conditions. In facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources is made much easier. Hence this tool will help to better understand the influence of contamination characteristics on actinide biokinetics. Our approach can aid

  3. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  4. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. Due to its specific structure, the UML sequence diagram is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to lay out UML sequence diagrams automatically, and analyses them according to the criteria required for diagram perception.

  5. Sensitivity analysis of an information fusion tool: OWA operator

    Science.gov (United States)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

    2007-04-01

    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
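
    The OWA aggregation and the optimism degree (orness) discussed above are easy to state in code; the sketch below derives order weights from a regular fuzzy linguistic quantifier Q(r) = r^alpha (the scores and alpha values are illustrative, not the paper's data):

```python
# OWA aggregation with order weights from a fuzzy linguistic quantifier
# Q(r) = r**alpha, plus the "orness" (optimism degree) of those weights.

def quantifier_weights(n, alpha):
    """w_i = Q(i/n) - Q((i-1)/n) for the regular quantifier Q(r) = r^alpha."""
    q = lambda r: r ** alpha
    return [q(i / n) - q((i - 1) / n) for i in range(1, n + 1)]

def owa(values, weights):
    """Weighted sum of the values sorted in descending order."""
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

def orness(weights):
    """Optimism degree: 1 = pure max (optimistic), 0 = pure min (pessimistic)."""
    n = len(weights)
    return sum((n - i) * w for i, w in enumerate(weights, start=1)) / (n - 1)

criteria_scores = [0.9, 0.4, 0.7, 0.6]      # one alternative's scores
for alpha in (0.5, 1.0, 2.0):               # alpha < 1 -> optimistic weighting
    w = quantifier_weights(len(criteria_scores), alpha)
    print(f"alpha={alpha}: orness={orness(w):.2f}, "
          f"OWA={owa(criteria_scores, w):.3f}")
```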

  6. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second is the test input to be attempted for each field. The third is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the graphical user interface and the script. The examples are illustrated from a scripting perspective because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.

  7. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise, which is very common in industrial environments. The PFA Toolbox can be used to address those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available toolbox that is able to perform Interval and Possibilistic MFA estimations.
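
    The interval flavor of MFA can be illustrated as a pair of linear programs per flux, bounding each flux subject to the steady-state constraint and interval measurements; the sketch below is a Python analogue with a hypothetical three-flux network, not the MATLAB toolbox's code:

```python
import numpy as np
from scipy.optimize import linprog

# Interval MFA as bound computation: for each flux, minimize and maximize it
# subject to the steady-state constraint S v = 0 and interval measurements.
# Toy network: v1 -> A, A -> v2, A -> v3 (intervals are hypothetical).

S = np.array([[1.0, -1.0, -1.0]])          # mass balance of metabolite A
bounds = [(9.0, 11.0),                     # v1 measured as 10 +/- 1
          (0.0, None),                     # v2 unmeasured, irreversible
          (2.0, 4.0)]                      # v3 measured as 3 +/- 1

for j in range(S.shape[1]):
    c = np.zeros(S.shape[1])
    c[j] = 1.0
    lo = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
    hi = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
    print(f"v{j + 1} in [{lo.fun:.2f}, {-hi.fun:.2f}]")
# The unmeasured flux v2 = v1 - v3 is bounded to [5, 9] by the balances.
```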

  8. BRM-Parser: a tool for comprehensive analysis of BLAST and RepeatMasker results.

    Science.gov (United States)

    Bajpai, Anjali; Sridhar, Settu; Reddy, Hemakumar M; Jesudasan, Rachel A

    2007-01-01

    BLAST and Repeat Masker Parser (BRM-Parser) is a service that provides users with a unified platform for easy analysis of relatively large outputs of the BLAST (Basic Local Alignment Search Tool) and RepeatMasker programs. The BLAST Summary feature of BRM-Parser summarizes BLAST outputs, which can be filtered using user-defined thresholds for hit length, percentage identity, and E-value, and can be sorted by query or subject coordinates and hit length. It also provides a tool that merges BLAST hits satisfying user-defined criteria for hit length and the gap between hits. The RepeatMasker Summary feature uses the RepeatMasker alignment as an input file and calculates the frequency and proportion of mutations in copies of repeat elements, as identified by RepeatMasker. Both features can be run through a GUI or executed via the command line using the standalone version.
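
    The hit-merging feature lends itself to a short illustration: collapse hits on the same query that are separated by no more than a user-defined gap, after filtering by length. The sketch below uses hypothetical coordinates and thresholds, not BRM-Parser's code:

```python
# Merge BLAST hits on the same query when the gap between them is at most
# `max_gap` bases, after discarding hits shorter than `min_len`.

def merge_hits(hits, max_gap=50, min_len=30):
    """hits: list of (start, end) query coordinates, 1-based inclusive."""
    kept = sorted(h for h in hits if h[1] - h[0] + 1 >= min_len)
    merged = []
    for start, end in kept:
        if merged and start - merged[-1][1] - 1 <= max_gap:
            merged[-1][1] = max(merged[-1][1], end)   # extend previous region
        else:
            merged.append([start, end])               # start a new region
    return [tuple(m) for m in merged]

hits = [(100, 180), (200, 260), (900, 1000), (955, 1040), (5, 20)]
print(merge_hits(hits))  # [(100, 260), (900, 1040)]; (5, 20) fails min_len
```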

  9. Visual DSD: a design and analysis tool for DNA strand displacement systems

    Science.gov (United States)

    Lakin, Matthew R.; Youssef, Simon; Polo, Filippo; Emmott, Stephen; Phillips, Andrew

    2011-01-01

    Summary: The Visual DSD (DNA Strand Displacement) tool allows rapid prototyping and analysis of computational devices implemented using DNA strand displacement, in a convenient web-based graphical interface. It is an implementation of the DSD programming language and compiler described by Lakin et al. (2011), with additional features such as support for polymers of unbounded length. It also supports stochastic and deterministic simulation, construction of continuous-time Markov chains, and various export formats which allow models to be analysed using third-party tools. Availability: Visual DSD is available as a web-based Silverlight application for most major browsers on Windows and Mac OS X at http://research.microsoft.com/dna. It can be installed locally for offline use. Command-line versions for Windows, Mac OS X and Linux are also available from the web page. Contact: aphillip@microsoft.com. Supplementary information: Supplementary data are available at Bioinformatics online. PMID: 21984756

  10. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software, offering the best compromise between scalability and ease of use. The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT's inclusive framework supports Message Passing Interface (MPI) parallelism as well as task farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.

  11. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  12. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behavior of 9528 online gamblers who voluntarily used an RG tool was analyzed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had high initial usage and low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that the classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a greater extent and were found to have a greater risk of excessive gambling than the other classes.

  13. Evaluation tools for undergraduate program planning in times of financial austerity.

    Science.gov (United States)

    O'Palka, J; Harris, P R

    1990-05-01

    This article describes the administration and outcome of two evaluation tools developed by faculty of the dietetic program for ongoing assessment of a Plan IV dietetic education program over a 4-year period. Interns and internship directors were asked to evaluate the level of skills and knowledge base of interns compared with their internship classmates. Interns were also asked to rate the effectiveness of undergraduate course assignments and activities for internship preparation. As a result of the surveys, the home economics core course work was deleted, credits were shifted from food science to nutrient metabolism courses, and credits in clinical nutrition were increased. Projects in clinical nutrition and food systems management were modified. The surveys justified program requirements and utilization of resources, and provided an additional, effective measure of faculty competence.

  14. Development of a tool to assess psychosocial indicators of fruit and vegetable intake for 2 federal programs.

    Science.gov (United States)

    Townsend, Marilyn S; Kaiser, Lucia L

    2005-01-01

    Development of an evaluation tool of psychosocial constructs for use by participants in 2 federal programs, Food Stamp Nutrition Education and the Expanded Food and Nutrition Education Program. Cross-sectional data from a longitudinal study. Limited-resource women (n = 111) living in low-income communities. Test-retest reliability, internal consistency, ethnic differences, convergent validity. Spearman rank order correlation, analysis of variance, principal components analysis. Reliability coefficients ranged from a low of r = .18 (not significant) to r = .74. Assessment of the validity of 9 constructs led to the deletion of 3 (ie, perceived barriers, social support, and perceived norms), with retention of perceived benefits, perceived control, self-efficacy, readiness to eat more fruit, readiness to eat more vegetables, and perceived diet quality. As an estimate of convergent validity, the final version of the tool with 6 constructs remaining showed significant correlations with indicators of diet quality, including serum carotenoid values (r = .38). This is the first validation study of this type to estimate convergent validity with 5 indicators of diet quality, including a biomarker.

  15. Transition Marshall Space Flight Center Wind Profiler Splicing Algorithm to Launch Services Program Upper Winds Tool

    Science.gov (United States)

    Bauman, William H., III

    2014-01-01

    NASA's LSP customers and the future SLS program rely on observations of upper-level winds for steering, loads, and trajectory calculations for the launch vehicle's flight. On the day of launch, the 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) monitor the upper-level winds and provide forecasts to the launch team via the AMU-developed LSP Upper Winds tool for launches at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station. This tool displays wind speed and direction profiles from rawinsondes released during launch operations, from the 45th Space Wing (45 SW) 915-MHz Doppler Radar Wind Profilers (DRWPs) and the KSC 50-MHz DRWP, and from numerical weather prediction models. The goal of this task was to splice the wind speed and direction profiles from the 45 SW 915-MHz DRWPs and the KSC 50-MHz DRWP at altitudes where the wind profiles overlap, to create a smooth profile. In the first version of the LSP Upper Winds tool, the top of the 915-MHz DRWP wind profile and the bottom of the 50-MHz DRWP profile were not spliced, sometimes creating a discontinuity in the profile. The Marshall Space Flight Center (MSFC) Natural Environments Branch (NE) created algorithms to splice the wind profiles from the two sensors to generate an archive of vertically complete wind profiles for the SLS program. The AMU worked with MSFC NE personnel to implement these algorithms in the LSP Upper Winds tool to provide a continuous spliced wind profile. The AMU transitioned the MSFC NE algorithms to interpolate and fill gaps in the data, implement a Gaussian weighting function to produce 50-m altitude intervals from each sensor, and splice the data from both DRWPs together. They did so by porting the MSFC NE code, written in MATLAB, into Microsoft Excel Visual Basic for Applications (VBA). After testing the new algorithms in stand-alone VBA modules, the AMU replaced the existing VBA code in the LSP Upper Winds tool with the new
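
    A minimal sketch of the splicing approach described above: resample each profiler onto a common 50-m grid with a Gaussian weighting function, then blend across the overlap. The profiles, weighting width, and blend bounds are hypothetical, not the MSFC NE values (and the operational code is VBA, not Python):

```python
import numpy as np

def gaussian_resample(alt_m, wind, grid_m, sigma_m=100.0):
    """Gaussian-weighted average of `wind` around each grid altitude."""
    out = np.full(grid_m.shape, np.nan)
    for i, z in enumerate(grid_m):
        w = np.exp(-0.5 * ((alt_m - z) / sigma_m) ** 2)
        if w.sum() > 0:                      # leave NaN outside sensor coverage
            out[i] = (w * wind).sum() / w.sum()
    return out

# Hypothetical u-wind profiles: 915 MHz covers low levels, 50 MHz upper levels.
alt_915 = np.arange(200.0, 4000.0, 150.0)
u_915 = 5.0 + 0.002 * alt_915
alt_50 = np.arange(2500.0, 18000.0, 150.0)
u_50 = 4.0 + 0.0022 * alt_50

grid = np.arange(200.0, 18001.0, 50.0)       # common 50 m altitude intervals
lo = gaussian_resample(alt_915, u_915, grid)
hi = gaussian_resample(alt_50, u_50, grid)

blend_lo, blend_hi = 2500.0, 4000.0          # overlap region of the two DRWPs
w = np.clip((grid - blend_lo) / (blend_hi - blend_lo), 0.0, 1.0)
spliced = np.where(grid <= blend_lo, lo,
                   np.where(grid >= blend_hi, hi, (1.0 - w) * lo + w * hi))

print(f"{len(grid)} levels; u-wind at 3 km: {spliced[grid == 3000.0][0]:.2f} m/s")
```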

  16. RegTools: A Julia Package for Assisting Regression Analysis

    OpenAIRE

    Liang, Muzhou

    2015-01-01

    The RegTools package for Julia provides tools to select models, detect outliers and diagnose problems in regression models. The current tools include AIC, AICc and BIC based model selection methods, outlier detection methods and multicollinearity detection methods. This article briefly outlines the methodologies behind these techniques and tests the functions by comparing them with the corresponding functions in R. The identical conclusions drawn from Julia and R support the validity of RegTools.

  17. Aspects with Program Analysis for Security Policies

    DEFF Research Database (Denmark)

    Yang, Fan

    with static program analysis techniques. The former technique can separate security concerns out of the main logic, and thus improves system modularity. The latter can analyze the system behavior, and thus helps detect software bugs or potential malicious code. We present AspectKE, an aspect......-oriented extension based on KLAIM, followed by a discussion of open joinpoints that commonly exist in coordination languages such as KLAIM. Based on the idea of AspectKE, we design and implement a proof-of-concept programming language AspectKE*, which enables programmers to easily specify analysis-based security......Enforcing security policies on IT systems, especially for a mobile distributed system, is challenging. As society becomes more IT-savvy, our expectations about security and privacy evolve. This is usually followed by changes in regulation in the form of standards and legislation. In many cases...

  18. Mondelēz Hope Kitchen Program, China: a Program Impact Pathways (PIP) analysis.

    Science.gov (United States)

    Li, Yanran; Yao, Xiaoxun; Gu, Lan

    2014-09-01

    Mondelēz Hope Kitchen is a community program initiated jointly in 2009 by Mondelēz International and the China Youth Development Foundation (CYDF). In response to the urgent needs of students, parents, and teachers at primary and middle schools in poverty-stricken rural areas of China, the program addresses the complex and intertwined issues of undernutrition and obesity. By funding both kitchen equipment and teacher training in health and nutrition, the Mondelēz Hope Kitchen Program improves the capacity of schools to supply healthy meals, helping students to access safe and nutritious foods and, ultimately, to improve their nutritional status and health. In 2011, the Mondelēz International Foundation awarded CYDF a grant to formally assess the impact of the original program design. The Mondelēz International Foundation encouraged CYDF and six other healthy lifestyles-focused community partners around the world to participate in this program evaluation workshop. The goals of this study were to describe the logic model of the Mondelēz Hope Kitchen Program, summarize a recent evaluation of the Mondelēz Hope Kitchen Program, and conduct a Program Impact Pathways (PIP) analysis to identify Critical Quality Control Points (CCPs) and a suite of impact indicators. The findings were presented at the Healthy Lifestyles Program Evaluation Workshop held in Granada, Spain, 13-14 September 2013, under the auspices of the Mondelēz International Foundation. The authors developed the program's PIP diagram based on deliberations involving the program managers and Director and consulting the "Hope Kitchen Management Rules" and "Hope Kitchen Inspection and Acceptance Report". The PIP analyses identified three CCPs: buy-in from schools, kitchen infrastructure, and changes in teachers' knowledge of nutrition after training. In addition, changes in children's knowledge of nutrition will be added to the core suite of impact evaluation indicators that also includes children

  19. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models...

  20. Interaction tools for underwater shock analysis in naval platform design

    NARCIS (Netherlands)

    Aanhold, J.E.; Tuitman, J.T.; Trouwborst, W.; Vaders, J.A.A.

    2016-01-01

    In order to satisfy the need for good quality UNDerwater EXplosion (UNDEX) response estimates of naval platforms, TNO developed two 3D simulation tools: the Simplified Interaction Tool (SIT) and the hydro/structural code 3DCAV. Both tools are an add-on to LS-DYNA. SIT is a module of user routines

  1. MetaFIND: A feature analysis tool for metabolomics data

    Directory of Open Access Journals (Sweden)

    Cunningham Pádraig

    2008-11-01

    Full Text Available Abstract. Background: Metabolomics, or metabonomics, refers to the quantitative analysis of all metabolites present within a biological sample and is generally carried out using NMR spectroscopy or mass spectrometry. Such analysis produces a set of peaks, or features, indicative of the metabolic composition of the sample and may be used as a basis for sample classification. Feature selection may be employed to improve classification accuracy or aid model explanation by establishing a subset of class-discriminating features. Factors such as experimental noise, choice of technique and threshold selection may adversely affect the set of selected features retrieved. Furthermore, the high dimensionality and multi-collinearity inherent within metabolomics data may exacerbate discrepancies between the set of features retrieved and those required to provide a complete explanation of metabolite signatures. Given these issues, the latter in particular, we present the MetaFIND application for 'post-feature selection' correlation analysis of metabolomics data. Results: In our evaluation we show how MetaFIND may be used to elucidate metabolite signatures from the set of features selected by diverse techniques over two metabolomics datasets. Importantly, we also show how MetaFIND may augment standard feature selection and aid the discovery of additional significant features, including those which represent novel class-discriminating metabolites. MetaFIND also supports the discovery of higher-level metabolite correlations. Conclusion: Standard feature selection techniques may fail to capture the full set of relevant features in the case of high-dimensional, multi-collinear metabolomics data. We show that the MetaFIND 'post-feature selection' analysis tool may aid metabolite signature elucidation, feature discovery and inference of metabolic correlations.

  2. Kinematic Analysis of a 3-dof Parallel Machine Tool with Large Workspace

    Directory of Open Access Journals (Sweden)

    Shi Yan

    2016-01-01

    Full Text Available Kinematics of a 3-dof (degree of freedom) parallel machine tool with large workspace was analyzed. The workspace volume and surface and boundary posture angles of the 3-dof parallel machine tool are relatively large. Firstly, a three-dimensional simulation manipulator of the 3-dof parallel machine tool was constructed, and its joint distribution was described. Secondly, kinematic models of the 3-dof parallel machine tool were established, including displacement analysis, velocity analysis, and acceleration analysis. Finally, the kinematic models of the machine tool were verified by a numerical example. The results are of significance for the application of the parallel machine tool.
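
    The displacement-velocity-acceleration chain described above can be rendered generically: once the displacement (pose) solution is available, the velocity and acceleration analyses follow by differentiation. A minimal numerical sketch with a made-up pose trajectory, not the paper's mechanism:

      # Numerical velocity and acceleration analysis from a solved pose
      # trajectory (the trajectory itself is synthetic, for illustration).
      import numpy as np

      t = np.linspace(0.0, 2.0, 201)                # time samples [s]
      pose = np.column_stack([0.1 * np.sin(t),      # x(t), y(t), z(t) [m]
                              0.1 * np.cos(t),
                              0.05 * t])

      vel = np.gradient(pose, t, axis=0)            # velocity analysis
      acc = np.gradient(vel, t, axis=0)             # acceleration analysis
      print("peak speed [m/s]:", np.linalg.norm(vel, axis=1).max())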

  3. Actigraphy and motion analysis: new tools for psychiatry.

    Science.gov (United States)

    Teicher, M H

    1995-01-01

    Altered locomotor activity is a cardinal sign of several psychiatric disorders. With advances in technology, activity can now be measured precisely. Contemporary studies quantifying activity in psychiatric patients are reviewed. Studies were located by a Medline search (1965 to present; English language only) cross-referencing motor activity and major psychiatric disorders. The review focused on mood disorders and attention-deficit hyperactivity disorder (ADHD). Activity levels are elevated in mania, agitated depression, and ADHD and attenuated in bipolar depression and seasonal depression. The percentage of low-level daytime activity is directly related to severity of depression, and change in this parameter accurately mirrors recovery. Demanding cognitive tasks elicit fidgeting in children with ADHD, and precise measures of activity and attention may provide a sensitive and specific marker for this disorder. Circadian rhythm analysis enhances the sophistication of activity measures. Affective disorders in children and adolescents are characterized by an attenuated circadian rhythm and an enhanced 12-hour harmonic rhythm (diurnal variation). Circadian analysis may help to distinguish between the activity patterns of mania (dysregulated) and ADHD (intact or enhanced). Persistence of hyperactivity or circadian dysregulation in bipolar patients treated with lithium appears to predict rapid relapse once medication is discontinued. Activity monitoring is a valuable research tool, with the potential to aid clinicians in diagnosis and in prediction of treatment response.
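
    The circadian rhythm analysis described above is classically done with cosinor-style regression: fitting cosine terms at 24-hour and 12-hour periods to the activity series. A minimal sketch on synthetic actigraphy counts (all values invented):

      # Cosinor-style fit: mesor plus 24 h (circadian) and 12 h (diurnal
      # variation) harmonics, estimated by ordinary least squares.
      import numpy as np

      t = np.arange(0, 72, 0.25)                    # 3 days of 15-min epochs [h]
      rng = np.random.default_rng(1)
      activity = 100 + 40 * np.cos(2 * np.pi * t / 24 - 1.0) \
                 + rng.normal(0, 10, t.size)        # synthetic activity counts

      Xd = np.column_stack([np.ones_like(t),
                            np.cos(2 * np.pi * t / 24), np.sin(2 * np.pi * t / 24),
                            np.cos(2 * np.pi * t / 12), np.sin(2 * np.pi * t / 12)])
      beta, *_ = np.linalg.lstsq(Xd, activity, rcond=None)

      amp24 = np.hypot(beta[1], beta[2])            # circadian amplitude
      amp12 = np.hypot(beta[3], beta[4])            # 12 h harmonic amplitude
      print(f"mesor={beta[0]:.1f}  24h amp={amp24:.1f}  12h amp={amp12:.1f}")

    In terms of the abstract, an attenuated 24 h amplitude together with an enhanced 12 h harmonic would correspond to the pattern reported for affective disorders in children and adolescents.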

  4. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students to find the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two annual sittings of that examination the mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These were students who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by their average geometric point in order to be correlated to their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or
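
    Because both homes and the College are referenced in UTM coordinates (a metric, planar grid), the home-to-College distance variable mentioned above reduces to plain Euclidean geometry, provided all points fall within one UTM zone. A sketch with invented coordinates:

      # Straight-line distance from geocoded student homes to the College,
      # assuming all UTM coordinates share one zone (coordinates are made up).
      import math

      college = (440000.0, 4474000.0)       # UTM easting/northing [m] (assumed)
      students = {
          "s001": (442500.0, 4471200.0),
          "s002": (431800.0, 4480350.0),
      }

      for sid, (e, n) in students.items():
          d_km = math.hypot(e - college[0], n - college[1]) / 1000.0
          print(f"{sid}: {d_km:.1f} km from the College")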

  5. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  6. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, or picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution PDF files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
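
    SeismicCanvas integrates its own tau-p code; for readers without the application, the same theoretical-arrival calculation can be reproduced with ObsPy's TauP implementation, used here only as a stand-in (depth, distance and phase list are arbitrary examples):

      # Theoretical phase arrivals from a radial Earth model via ObsPy.
      from obspy.taup import TauPyModel

      model = TauPyModel(model="iasp91")    # radial Earth model
      arrivals = model.get_travel_times(source_depth_in_km=35.0,
                                        distance_in_degree=60.0,
                                        phase_list=["P", "S", "PP"])
      for arr in arrivals:
          print(f"{arr.name:>3s}  t = {arr.time:7.1f} s")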

  7. ATHENA: the analysis tool for heritable and environmental network associations.

    Science.gov (United States)

    Holzinger, Emily R; Dudek, Scott M; Frase, Alex T; Pendergrass, Sarah A; Ritchie, Marylyn D

    2014-03-01

    Advancements in high-throughput technology have allowed researchers to examine the genetic etiology of complex human traits in a robust fashion. Although genome-wide association studies have identified many novel variants associated with hundreds of traits, a large proportion of the estimated trait heritability remains unexplained. One hypothesis is that the commonly used statistical techniques and study designs are not robust to the complex etiology that may underlie these human traits. This etiology could include non-linear gene × gene or gene × environment interactions. Additionally, other levels of biological regulation may play a large role in trait variability. To address the need for computational tools that can explore enormous datasets to detect complex susceptibility models, we have developed a software package called the Analysis Tool for Heritable and Environmental Network Associations (ATHENA). ATHENA combines various variable filtering methods with machine learning techniques to analyze high-throughput categorical (i.e. single nucleotide polymorphisms) and quantitative (i.e. gene expression levels) predictor variables to generate multivariable models that predict either a categorical (i.e. disease status) or a quantitative (i.e. cholesterol levels) outcome. The goal of this article is to demonstrate the utility of ATHENA using simulated and biological datasets that consist of both single nucleotide polymorphisms and gene expression variables to identify complex prediction models. Importantly, this method is flexible and can be expanded to include other types of high-throughput data (i.e. RNA-seq data and biomarker measurements). ATHENA is freely available for download. The software, user manual and tutorial can be downloaded from http://ritchielab.psu.edu/ritchielab/software.
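
    ATHENA's own filtering and machine learning components are not reproduced here, but its general pattern — univariate filtering of a very large mixed predictor set followed by a flexible learner — can be sketched with scikit-learn as a stand-in (all data synthetic):

      # Filter-then-learn pattern on combined SNP and expression predictors.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(2)
      snps = rng.integers(0, 3, size=(300, 500))      # 0/1/2 genotype coding
      expr = rng.normal(size=(300, 100))              # gene expression levels
      X = np.hstack([snps, expr]).astype(float)       # mixed predictor matrix
      y = (snps[:, 7] + expr[:, 3] > 2).astype(int)   # synthetic disease status

      model = make_pipeline(SelectKBest(f_classif, k=50),
                            RandomForestClassifier(n_estimators=200,
                                                   random_state=0))
      print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())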

  8. NucTools: analysis of chromatin feature occupancy profiles from high-throughput sequencing data.

    Science.gov (United States)

    Vainshtein, Yevhen; Rippe, Karsten; Teif, Vladimir B

    2017-02-14

    Biomedical applications of high-throughput sequencing methods generate a vast amount of data in which numerous chromatin features are mapped along the genome. The results are frequently analysed by creating binary data sets that link the presence/absence of a given feature to specific genomic loci. However, the nucleosome occupancy or chromatin accessibility landscape is essentially continuous. It is currently a challenge in the field to cope with continuous distributions of deep sequencing chromatin readouts and to integrate the different types of discrete chromatin features to reveal linkages between them. Here we introduce the NucTools suite of Perl scripts as well as MATLAB- and R-based visualization programs for a nucleosome-centred downstream analysis of deep sequencing data. NucTools accounts for the continuous distribution of nucleosome occupancy. It allows calculations of nucleosome occupancy profiles averaged over several replicates, comparisons of nucleosome occupancy landscapes between different experimental conditions, and the estimation of the changes of integral chromatin properties such as the nucleosome repeat length. Furthermore, NucTools facilitates the annotation of nucleosome occupancy with other chromatin features like binding of transcription factors or architectural proteins, and epigenetic marks like histone modifications or DNA methylation. The applications of NucTools are demonstrated for the comparison of several datasets for nucleosome occupancy in mouse embryonic stem cells (ESCs) and mouse embryonic fibroblasts (MEFs). The typical workflows of data processing and integrative analysis with NucTools reveal information on the interplay of nucleosome positioning with other features such as for example binding of a transcription factor CTCF, regions with stable and unstable nucleosomes, and domains of large organized chromatin K9me2 modifications (LOCKs). As potential limitations and problems we discuss how inter-replicate variability of
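
    NucTools itself is a suite of Perl scripts; the replicate-averaging and condition-comparison steps it automates can nevertheless be sketched compactly in Python (the counts below are synthetic, and the bin count, fold-change cutoff and replicate numbers are placeholders):

      # Average nucleosome occupancy over replicates and compare conditions.
      import numpy as np

      bins = 1000                                   # e.g. a 10 kb region at 10 bp bins
      rng = np.random.default_rng(3)
      esc_reps = rng.poisson(5.0, size=(3, bins))   # three ESC replicates
      mef_reps = rng.poisson(4.0, size=(3, bins))   # three MEF replicates

      esc = esc_reps.mean(axis=0)                   # replicate-averaged profiles
      mef = mef_reps.mean(axis=0)

      log2fc = np.log2((esc + 1) / (mef + 1))       # per-bin occupancy change
      print("bins with >2-fold occupancy change:",
            int(np.sum(np.abs(log2fc) > 1)))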

  9. Integrating Contemplative Tools into Biomedical Science Education and Research Training Programs

    Directory of Open Access Journals (Sweden)

    Rodney R. Dietert

    2014-01-01

    Full Text Available Academic preparation of science researchers and/or human or veterinary medicine clinicians through the science, technology, engineering, and mathematics (STEM) curriculum has usually focused on the students (1) acquiring increased disciplinary expertise, (2) learning needed methodologies and protocols, and (3) expanding their capacity for intense, persistent focus. Such educational training is effective until roadblocks or problems arise via this highly-learned approach. Then, the health science trainee may have few tools available for effective problem solving. Training to achieve flexibility, adaptability, and broadened perspectives using contemplative practices has been rare among biomedical education programs. To address this gap, a Cornell University-based program involving formal biomedical science coursework and health science workshops has been developed to offer science students, researchers and health professionals a broader array of personal, contemplation-based, problem-solving tools. This STEM educational initiative includes first-person exercises designed to broaden perceptional awareness, decrease emotional drama, and mobilize whole-body strategies for creative problem solving. Self-calibration and journaling are used for students to evaluate the personal utility of each exercise. The educational goals are to increase student self-awareness and self-regulation and to provide trainees with value-added tools for career-long problem solving. Basic elements of this educational initiative are discussed using the framework of the Tree of Contemplative Practices.

  10. Wind Atlas Analysis and Application Program: WAsP 11 Help Facility

    DEFF Research Database (Denmark)

    2014-01-01

    The Wind Atlas Analysis and Application Program (WAsP) is a PC-program for horizontal and vertical extrapolation of wind climates. The program contains a complete set of models to calculate the effects on the wind of sheltering obstacles, surface roughness changes and terrain height variations...... of specific wind turbines and wind farms. The WAsP Help Facility includes a Quick Start Tutorial, a User's Guide and a Technical Reference. It further includes descriptions of the Observed Wind Climate Wizard, the WAsP Climate Analyst, the WAsP Map Editor tool, the WAsP Turbine Editor tool, the Air Density...

  11. Promoting Diversity through Program Websites: A Multicultural Content Analysis of School Psychology Program Websites

    Science.gov (United States)

    Smith, Leann V.; Blake, Jamilia J.; Graves, Scott L.; Vaughan-Jensen, Jessica; Pulido, Ryne; Banks, Courtney

    2016-01-01

    The recruitment of culturally and linguistically diverse students to graduate programs is critical to the overall growth and development of school psychology as a field. Program websites serve as an effective recruitment tool for attracting prospective students, yet there is limited research on how school psychology programs use their websites to…

  12. Agent-based modeling as a tool for program design and evaluation.

    Science.gov (United States)

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
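
    A toy version of the kind of model the authors discuss — agents adopting a program-promoted behavior under peer influence — fits in a few lines; every parameter below is invented for illustration, and a real evaluation model would be grounded in stakeholder input and program theory:

      # Minimal agent-based model: baseline uptake plus peer influence.
      import random

      random.seed(4)
      N, STEPS, BASE, PEER = 200, 30, 0.02, 0.10
      adopted = [False] * N

      for _ in range(STEPS):
          frac = sum(adopted) / N                   # current adoption level
          for i in range(N):
              if not adopted[i]:
                  adopted[i] = random.random() < BASE + PEER * frac

      print(f"adoption after {STEPS} steps: {sum(adopted) / N:.0%}")

    Sweeping BASE and PEER over plausible ranges is one way such a model could support the performance target-setting activity mentioned above.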

  13. BRNI: Modular analysis of transcriptional regulatory programs

    Directory of Open Access Journals (Sweden)

    Nachman Iftach

    2009-05-01

    Full Text Available Abstract Background Transcriptional responses often consist of regulatory modules – sets of genes with a shared expression pattern that are controlled by the same regulatory mechanisms. Previous methods allow dissecting regulatory modules from genomics data, such as expression profiles, protein-DNA binding, and promoter sequences. In cases where physical protein-DNA data are lacking, such methods are essential for the analysis of the underlying regulatory program. Results Here, we present a novel approach for the analysis of modular regulatory programs. Our method – Biochemical Regulatory Network Inference (BRNI) – is based on an algorithm that learns from expression data a biochemically-motivated regulatory program. It describes the expression profiles of gene modules consisting of hundreds of genes using a small number of regulators and affinity parameters. We developed an ensemble learning algorithm that ensures the robustness of the learned model. We then use the topology of the learned regulatory program to guide the discovery of a library of cis-regulatory motifs, and determined the motif compositions associated with each module. We test our method on the cell cycle regulatory program of the fission yeast. We discovered 16 coherent modules, covering diverse processes from cell division to metabolism, and associated them with 18 learned regulatory elements, including both known cell-cycle regulatory elements (MCB, Ace2, PCB, ACCCT box) and novel ones, some of which are associated with G2 modules. We integrate the regulatory relations from the expression- and motif-based models into a single network, highlighting specific topologies that result in distinct dynamics of gene expression in the fission yeast cell cycle. Conclusion Our approach provides a biologically-driven, principled way for deconstructing a set of genes into meaningful transcriptional modules and identifying their associated cis-regulatory programs. Our analysis sheds
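
    BRNI's biochemically-motivated regulator model is far richer than any one-liner, but its starting point — grouping genes into coherent expression modules — can be approximated with ordinary clustering. The sketch below uses k-means on a synthetic time course purely as a stand-in, not the paper's algorithm:

      # Group genes into expression modules (k-means as a crude analogue of
      # BRNI's module discovery; 16 matches the module count reported above).
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)
      profiles = rng.normal(size=(500, 12))     # 500 genes x 12 timepoints

      km = KMeans(n_clusters=16, n_init=10, random_state=0).fit(profiles)
      for mod in range(3):                      # inspect the first few modules
          members = np.where(km.labels_ == mod)[0]
          print(f"module {mod}: {members.size} genes")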

  14. Energy Analysis Program. 1992 Annual report

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    The Program became deeply involved in establishing a Washington, D.C., project office during the last few months of fiscal year 1992. This project office, which reports to the Energy & Environment Division, will receive the majority of its support from the Energy Analysis Program. We anticipate having two staff scientists and support personnel in offices within a few blocks of DOE. Our expectation is that this office will carry out a series of projects that are better managed closer to DOE. We also anticipate that our representation in Washington will improve and we hope to expand the Program, its activities, and impact, in policy-relevant analyses. In spite of the growth that we have achieved, the Program continues to emphasize (1) energy efficiency of buildings, (2) appliance energy efficiency standards, (3) energy demand forecasting, (4) utility policy studies, especially integrated resource planning issues, and (5) international energy studies, with considerable emphasis on developing countries and economies in transition. These continuing interests are reflected in the articles that appear in this report.

  15. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that

  16. Extracting Sentiment from Healthcare Survey Data: An Evaluation of Sentiment Analysis Tools

    OpenAIRE

    Georgiou, D.; MacFarlane, A.; Russell-Rose, T.

    2015-01-01

    Sentiment analysis is an emerging discipline with many analytical tools available. This project aimed to examine a number of tools regarding their suitability for healthcare data. A comparison between commercial and non-commercial tools was made using responses from an online survey which evaluated design changes made to a clinical information service. The commercial tools were Semantria and TheySay and the non-commercial tools were WEKA and Google Prediction API. Different approaches were fo...
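
    The record is truncated, but the evaluation it describes — scoring free-text survey responses with off-the-shelf sentiment tools — is easy to sketch. The snippet below uses NLTK's VADER analyser purely as a stand-in (the study itself compared Semantria, TheySay, WEKA and the Google Prediction API), with made-up responses:

      # Score survey free-text with an off-the-shelf sentiment analyser.
      import nltk
      from nltk.sentiment import SentimentIntensityAnalyzer

      nltk.download("vader_lexicon", quiet=True)
      sia = SentimentIntensityAnalyzer()

      responses = [
          "The new layout makes finding guidelines much faster.",
          "I could not locate the search box at all; very frustrating.",
      ]
      for text in responses:
          score = sia.polarity_scores(text)["compound"]  # -1 (neg) .. +1 (pos)
          print(f"{score:+.2f}  {text}")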

  17. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    Science.gov (United States)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is by nature a multidisciplinary activity. SAPE improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.
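
    The integration style described — independent discipline modules chained through Python — can be sketched as a pipeline sharing a single state dictionary. The module names follow the abstract, but their bodies here are trivial placeholders, not SAPE's actual models:

      # Toy rendering of a Python-integrated discipline pipeline.
      def geometry(state):
          state["area_m2"] = 3.14159 * state["diameter_m"] ** 2 / 4    # dummy model

      def trajectory(state):
          state["peak_decel_g"] = 1.5 * state["entry_speed_kms"]       # dummy model

      def aerothermal(state):
          state["peak_heating"] = 2.0 * state["entry_speed_kms"] ** 3  # dummy model

      state = {"diameter_m": 4.5, "entry_speed_kms": 11.0}
      for module in (geometry, trajectory, aerothermal):
          module(state)           # each discipline adds its outputs to the state
      print(state)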

  18. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  19. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/workpiece interface dynamics are presented, as well as machine dynamics for the DTM at the Center.

  20. THEME: a web tool for loop-design microarray data analysis.

    Science.gov (United States)

    Chen, Chaang-Ray; Shu, Wun-Yi; Tsai, Min-Lung; Cheng, Wei-Chung; Hsu, Ian C

    2012-02-01

    A number of recent studies have shown that loop-design is more efficient than reference control design. Data analysis for loop-design microarray experiments is commonly undertaken using linear models and statistical tests. These techniques require specialized knowledge in statistical programming. However, few web-based tools for loop-design analysis are available. We have developed THEME (Tsing Hua Engine of Microarray Experiment), which provides all the data analysis tools necessary for loop-design microarray studies. THEME allows users to construct linear models and to apply multiple user-defined statistical tests of hypotheses for the detection of DEG (differentially expressed genes). Users can modify entries of the design matrix for the experimental design as well as those of the contrast matrix for statistical tests of hypotheses. The outputs of multiple user-defined statistical tests of hypotheses, the DEG lists, can be cross-validated. The web platform provides data assessment and visualization tools that significantly assist users when evaluating the performance of microarray experimental procedures. THEME is also a MIAME (Minimal Information About a Microarray Experiment) compliant system, which enables users to export formatted files for GEO (Gene Expression Omnibus) submission. THEME offers comprehensive web services to biologists for data analysis of loop-design microarray experiments. This web-based resource is especially useful for core facility services as well as collaboration projects when researchers are not at the same site. Data analysis procedures, starting from uploading raw data files to retrieving DEG lists, can be flexibly operated with natural workflows. These features make THEME a reliable and powerful on-line system for data analysis of loop-design microarrays. The THEME server is available at http://metadb.bmes.nthu.edu.tw/theme/. Copyright © 2011 Elsevier Ltd. All rights reserved.
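
    The linear-model-plus-contrast machinery that THEME wraps behind its web interface can be sketched directly: fit per-gene coefficients from a loop-design matrix, then evaluate a user-defined contrast. The design, data and contrast below are invented for illustration, not taken from THEME:

      # Per-gene linear model for a three-condition loop design (A->B->C->A).
      import numpy as np

      D = np.array([[-1.0, 1.0, 0.0],     # array 1 measures B vs A
                    [0.0, -1.0, 1.0],     # array 2 measures C vs B
                    [1.0, 0.0, -1.0]])    # array 3 measures A vs C
      Y = np.random.default_rng(6).normal(size=(3, 1000))  # log-ratios x genes

      beta, *_ = np.linalg.lstsq(D, Y, rcond=None)  # per-gene condition effects
      contrast = np.array([1.0, -1.0, 0.0])         # user-defined test: A - B
      effect = contrast @ beta                      # per-gene contrast estimate
      print("top |A-B| genes:", np.argsort(-np.abs(effect))[:5])

    With real intensity data in place of the noise matrix, the genes ranked by such a contrast would feed the DEG lists that THEME cross-validates.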