WorldWideScience

Sample records for analysis tool extensions

  1. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
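
    The abstract does not give the discretized equations, but the physical content of the extension can be sketched. In steady two-dimensional form, the axial momentum equation gains transport by the transverse velocity v and a shear-stress gradient, with turbulence closed by Prandtl's mixing length (a standard textbook formulation, not necessarily the exact one used in the code):

        \rho u \frac{\partial u}{\partial x} + \rho v \frac{\partial u}{\partial y}
            = -\frac{dp}{dx} + \frac{\partial \tau}{\partial y},
        \qquad
        \tau = (\mu + \mu_t)\,\frac{\partial u}{\partial y},
        \qquad
        \mu_t = \rho\,\ell_m^2 \left|\frac{\partial u}{\partial y}\right|

    For laminar Poiseuille flow between plates at y = ±h, this reduces to u(y) = (h² − y²)(−dp/dx)/(2µ), the closed-form solution such benchmarks are verified against.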

  2. An extensive (co-)expression analysis tool for the cytochrome P450 superfamily in Arabidopsis thaliana

    Directory of Open Access Journals (Sweden)

    Provart Nicholas J

    2008-04-01

    Background: Sequencing of the first plant genomes has revealed that cytochromes P450 have evolved to become the largest family of enzymes in secondary metabolism. The proportion of P450 enzymes with characterized biochemical function(s) is, however, very small. If P450 diversification mirrors the evolution of chemical diversity, this points to an unexpectedly poor understanding of plant metabolism. We assumed that extensive analysis of gene expression might guide towards the function of P450 enzymes, and highlight overlooked aspects of plant metabolism. Results: We have created a comprehensive database, 'CYPedia', describing P450 gene expression in four data sets: organs and tissues, stress response, hormone response, and mutants of Arabidopsis thaliana, based on public Affymetrix ATH1 microarray expression data. P450 expression was then combined with the expression of 4,130 re-annotated genes, predicted to act in plant metabolism, for co-expression analyses. Based on the annotation of co-expressed genes from diverse pathway annotation databases, co-expressed pathways were identified. Predictions were validated for most P450s with known functions. As examples, co-expression results for P450s related to plastidial functions/photosynthesis, and to phenylpropanoid, triterpenoid and jasmonate metabolism are highlighted here. Conclusion: The large-scale hypothesis generation tools presented here provide leads to new pathways, unexpected functions, and regulatory networks for many P450s in plant metabolism. These can now be exploited by the community to validate the proposed functions experimentally using reverse genetics, biochemistry, and metabolic profiling.
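
    At its core, the co-expression screen described above is a correlation ranking of expression vectors across a common set of arrays. A minimal sketch of that idea in Python (hypothetical gene names and random data; this is not the CYPedia pipeline):

        import numpy as np

        def coexpression_ranking(target, candidates, names):
            """Rank candidate genes by Pearson correlation with a
            target gene's expression profile over the same samples."""
            r = [np.corrcoef(target, c)[0, 1] for c in candidates]
            order = np.argsort(r)[::-1]
            return [(names[i], round(r[i], 3)) for i in order]

        rng = np.random.default_rng(0)
        p450 = rng.random(8)                 # target P450, 8 arrays
        genes = rng.random((4, 8))           # candidate metabolic genes
        print(coexpression_ranking(p450, genes, ["g1", "g2", "g3", "g4"]))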

  3. The integrated microbial genomes (IMG) system in 2007: data content and analysis tool extensions

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M.; Szeto, Ernest; Palaniappan, Krishna; Grechkin, Yuri; Chu, Ken; Chen, I-Min A.; Dubchak, Inna; Anderson, Iain; Lykidis, Athanasios; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2007-08-01

    The Integrated Microbial Genomes (IMG) system is a data management, analysis and annotation platform for all publicly available genomes. IMG contains both draft and complete JGI microbial genomes integrated with all other publicly available genomes from all three domains of life, together with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and annotating genomes, genes and functions, individually or in a comparative context. Since its first release in 2005, IMG's data content and analytical capabilities have been constantly expanded through quarterly releases. IMG is provided by the DOE-Joint Genome Institute (JGI) and is available from http://img.jgi.doe.gov.

  4. Tools for Creating Mobile Applications for Extension

    Science.gov (United States)

    Drill, Sabrina L.

    2012-01-01

    Considerations and tools for developing mobile applications for Extension include evaluating the topic, purpose, and audience. Different computing platforms may be used, and apps designed as modified Web pages or natively programmed for a particular platform. User privacy is another important consideration, especially for data collection apps.…

  5. Versatile and Extensible, Continuous-Thrust Trajectory Optimization Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop an innovative, versatile and extensible, continuous-thrust trajectory optimization tool for planetary mission design and optimization of...

  6. Interactive Whiteboards: A New Tool for Extension Education

    Science.gov (United States)

    Schroeder, Mary M.; Burns, Connie S.; Reicks, Marla M.

    2011-01-01

    Use of interactive whiteboards (IWBs) in school classrooms and conference rooms is increasing. To evaluate the effectiveness of IWBs as a tool for Extension education, two groups of 3rd and 4th grade Minnesota students (n=325) were taught nutrition using traditional methods or IWBs. Significant increases in knowledge and behavior were observed in…

  7. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  8. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
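
    Reports of this kind are typically derived from a fault-to-test dependency matrix (a "D-matrix") such as the one TEAMS Designer exports. A toy illustration of the detectability and test-utilization bookkeeping (made-up matrix and names; not the ETA Tool's actual code):

        import numpy as np

        # Rows = failure modes, columns = tests; 1 = the test detects the mode.
        D = np.array([[1, 0, 1],
                      [0, 1, 0],
                      [0, 0, 0]])            # third mode is undetectable
        modes = ["pump_leak", "valve_stuck", "sensor_drift"]
        tests = ["t_pressure", "t_position", "t_flow"]

        for m, det in zip(modes, D.any(axis=1)):      # detectability report
            print(m, "detected" if det else "UNDETECTED")
        for j, t in enumerate(tests):                 # test utilization report
            print(t, "detects", [modes[i] for i in np.flatnonzero(D[:, j])])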

  9. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  10. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
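
    As a rough illustration, the BAL-003-1 frequency response measure is expressed in MW per 0.1 Hz, so a single-event calculation reduces to dividing the MW response by ten times the frequency decline (the actual FRM uses prescribed pre- and post-event averaging windows, omitted in this sketch):

        def frequency_response_mw_per_0p1hz(delta_p_mw, f_pre_hz, f_post_hz):
            """MW/0.1 Hz for one under-frequency event: MW response
            divided by the frequency decline expressed in 0.1-Hz units."""
            return delta_p_mw / ((f_pre_hz - f_post_hz) * 10.0)

        # Hypothetical event: 120 MW response as frequency falls 60.00 -> 59.96 Hz.
        print(frequency_response_mw_per_0p1hz(120.0, 60.00, 59.96))  # 300.0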

  11. Extensive analysis of hydrogen costs

    Energy Technology Data Exchange (ETDEWEB)

    Guinea, D.M.; Martin, D.; Garcia-Alegre, M.C.; Guinea, D. [Consejo Superior de Investigaciones Cientificas, Arganda, Madrid (Spain). Inst. de Automatica Industrial; Agila, W.E. [Acciona Infraestructuras, Alcobendas, Madrid (Spain). Dept. I+D+i

    2010-07-01

    Cost is a key issue in the spread of any technology. In this work, the cost of hydrogen obtained by electrolysis is analyzed and determined. Different contributing partial costs, such as energy and electrolyzer costs and taxes, are taken into account to calculate the final cost of hydrogen. Energy cost data are taken from official URLs, while electrolyzer costs are obtained from commercial companies. The analysis is carried out under different hypotheses, and for different countries: Germany, France, Austria, Switzerland, Spain and the Canadian region of Ontario. Finally, the obtained costs are compared to those of the most used fossil fuels, both in the automotive industry (gasoline and diesel) and in the residential sector (butane, coal, town gas and wood), and the possibilities of hydrogen competing against these fuels are discussed. According to this work, in the automotive industry, even neglecting subsidies, hydrogen can compete with fossil fuels. Hydrogen can also compete with gaseous domestic fuels. Electrolyzer prices were found to have the highest influence on hydrogen prices. (orig.)

  12. Producing Organic Cotton: A Toolkit - Crop Guide, Project Guide, Extension Tools

    OpenAIRE

    Eyhorn, Frank

    2005-01-01

    The CD compiles the following extension tools on organic cotton: Organic Cotton Crop Guide, Organic Cotton Training Manual, Soil Fertility Training Manual, Organic Cotton Project Guide, Record keeping tools, Video "Organic agriculture in the Nimar region", Photos for illustration.

  13. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT, as is the TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures) from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observation of a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of…

  14. A standardised knowledge test to measure the extent of knowledge of agricultural extension personnel on m-tools

    Directory of Open Access Journals (Sweden)

    Kusuma Kumari Nagam

    2016-11-01

    A standardised teacher-made test for assessing the extent of knowledge of agricultural extension personnel on m-tools was developed using the item analysis procedure. For the purpose, a teacher-made test consisting of 30 items was prepared and administered to 30 agricultural extension personnel. Based on the results of the study, 14 items having a difficulty index value ranging from 20 to 80 and a discrimination index value above 0.10 were selected to construct the knowledge test. This standardised test can be used to measure the knowledge level of extension personnel on m-tools.
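
    The two selection statistics are simple to compute. A sketch with synthetic 0/1 scores and an upper/lower-group discrimination index (the paper does not state which discrimination formula was used, so this is one common choice):

        import numpy as np

        def item_analysis(scores):
            """scores: (n_respondents, n_items) array of 0/1 answers.
            Returns percent-correct difficulty and an upper-minus-lower
            discrimination index per item."""
            order = np.argsort(scores.sum(axis=1))
            k = max(1, len(scores) // 4)            # lower and upper groups
            difficulty = 100.0 * scores.mean(axis=0)
            discrimination = (scores[order[-k:]].mean(axis=0)
                              - scores[order[:k]].mean(axis=0))
            return difficulty, discrimination

        rng = np.random.default_rng(1)
        scores = (rng.random((30, 10)) > 0.4).astype(int)   # 30 people, 10 items
        diff, disc = item_analysis(scores)
        print((diff >= 20) & (diff <= 80) & (disc > 0.10))  # selection rule above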

  15. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  16. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
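
    The summary does not describe OBAT's estimator, but the quantities it reports can be illustrated on a synthetic ringdown: frequency from the FFT peak and damping from the logarithmic decrement of successive positive peaks (a deliberately simple stand-in for production mode-estimation methods):

        import numpy as np

        def dominant_mode(sig, dt):
            """Return (frequency in Hz, damping ratio) of the dominant mode."""
            sig = sig - sig.mean()
            spec = np.abs(np.fft.rfft(sig))
            f0 = np.fft.rfftfreq(len(sig), dt)[spec[1:].argmax() + 1]
            peaks = [i for i in range(1, len(sig) - 1)
                     if sig[i - 1] < sig[i] > sig[i + 1] and sig[i] > 0]
            delta = np.log(sig[peaks[0]] / sig[peaks[-1]]) / (len(peaks) - 1)
            return f0, delta / np.sqrt(4 * np.pi**2 + delta**2)

        t = np.arange(0, 10, 0.02)                          # 50 Hz sampling
        x = np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.7 * t)  # damped 0.7 Hz mode
        print(dominant_mode(x, 0.02))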

  17. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  18. Analysis of Extension Categorical Data Mining Process for the Extension Interior Designing

    Institute of Scientific and Technical Information of China (English)

    Hui Ma; Guangtian Zou

    2016-01-01

    On the basis of extension architectonics, this paper studies the process of extension categorical data mining for extension interior design. In accordance with the theory of extension data mining, extension categorical data mining for extension interior design can be divided into data preparation, the mining operation, and knowledge application. The paper explains the main content of each stage and the relations that connect them, and discusses in detail extension acquisition, extension analysis, extension categorical mining, extension knowledge application, and several other core nodes related to the data. Through the fusion of extension architectonics and data mining, the paper discusses the process of meeting knowledge requirements with multiple classifications under different mining targets. The purpose of this paper is to explore a complete categorical data mining process for interior design, from extension design data to design knowledge discovery and extension application.

  19. AvoPlot: An extensible scientific plotting tool based on matplotlib

    Directory of Open Access Journals (Sweden)

    Nial Peters

    2014-02-01

    AvoPlot is a simple-to-use graphical plotting program written in Python that makes extensive use of the matplotlib plotting library. It can be found at http://code.google.com/p/avoplot/. In addition to providing a user-friendly interface to the powerful capabilities of the matplotlib library, it also offers users the possibility of extending its functionality by creating plug-ins. These can import specific types of data into the interface and also provide new tools for manipulating them. In this respect, AvoPlot is a convenient platform for researchers to build their own data analysis tools on top of, as well as being a useful standalone program.
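
    For readers unfamiliar with matplotlib, the library calls that AvoPlot wraps behind its interface look like this (plain matplotlib usage, not the AvoPlot plug-in API):

        import numpy as np
        import matplotlib.pyplot as plt

        # Plot a damped sine, the kind of dataset a plug-in might import.
        x = np.linspace(0, 10, 500)
        fig, ax = plt.subplots()
        ax.plot(x, np.exp(-x / 5) * np.sin(2 * np.pi * x), label="signal")
        ax.set_xlabel("time (s)")
        ax.set_ylabel("amplitude")
        ax.legend()
        plt.show()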

  1. Udder Hygiene Analysis tool

    OpenAIRE

    2013-01-01

    In this report, the pilot of UHC is described. The main objective of the pilot is to make farmers more aware of how to increase udder health in dairy herds. This is achieved by changing management aspects related to hygiene. This report firstly provides general information about antibiotics and the processes that influence udder health. Secondly, six subjects related to udder health are described. Thirdly, the tools (checklists and roadmap) are shown, and fourthly, the advice written by UH...

  2. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel

    2014-01-01

    As governments, citizens and organizations have moved online, there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract and analyze web data in the process of investigating substantive questions … analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks…

  3. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  4. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower-level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a powerful, proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  5. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavors in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  6. Using iPads as a Data Collection Tool in Extension Programming Evaluation

    Science.gov (United States)

    Rowntree, J. E.; Witman, R. R.; Lindquist, G. L.; Raven, M. R.

    2013-01-01

    Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested…

  7. Sandia PUF Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
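
    The noise and uniqueness metrics mentioned are fractional Hamming distances over bit signatures. A small sketch with synthetic signatures (illustrative only; not the Sandia tool's code):

        import numpy as np

        def hamming_fraction(a, b):
            """Fraction of bit positions at which two signatures differ."""
            return np.count_nonzero(a != b) / a.size

        rng = np.random.default_rng(7)
        chips = rng.integers(0, 2, size=(5, 128))              # 5 chips, 128 bits
        remeasured = chips ^ (rng.random(chips.shape) < 0.05)  # ~5% bit noise

        intra = [hamming_fraction(chips[i], remeasured[i]) for i in range(5)]
        inter = [hamming_fraction(chips[i], chips[j])
                 for i in range(5) for j in range(i + 1, 5)]
        print("mean noise:", np.mean(intra), "mean inter-chip:", np.mean(inter))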

  8. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting...... data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities....

  9. The Brazilian Experience with Agroecological Extension: A Critical Analysis of Reform in a Pluralistic Extension System

    Science.gov (United States)

    Diesel, Vivien; Miná Dias, Marcelo

    2016-01-01

    Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for extension reform literature. Design/methodology/approach: Using a critical public analysis we characterize the evolution of Brazilian federal extension policy…

  10. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  11. LOOS: an extensible platform for the structural analysis of simulations.

    Science.gov (United States)

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages, including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++, makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and MacOS X, but is written to be portable and should work on most Unix-based platforms.

  12. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia (5), Canada (1), China (6), CERN (4), Europe (7), Japan (32), Taiwan (3), and the USA (11). The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; and Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  13. Organizing to Use Facebook Advertisements: A Planning Tool for Extension Professionals, Businesses, and Communities

    Science.gov (United States)

    Barnes, James

    2016-01-01

    The purpose of this article is to explain how Extension professionals, businesses, and communities can use Facebook advertisements effectively. The article is a planning tool that introduces Facebook's Advertiser Help Center, explains some applicable key concepts, and suggests best practices to apply before launching a Facebook advertising…

  14. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets, both observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom-in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

  15. Extensible and object-oriented system Eos supplies a new environment for image analysis of electron micrographs of macromolecules.

    Science.gov (United States)

    Yasunaga, T; Wakabayashi, T

    1996-01-01

    To study macromolecular structure by electron microscopy, a highly extensible and object-oriented system has been developed for image analysis. This system is named "Eos" (Extensible and object-oriented system). The system described here supplies an environment with four types of support: (i) a group of small tools for image analysis, (ii) tools for the integration of small tools, such as "Display2," (iii) tools for development, such as "maketool," and (iv) object-oriented libraries for the development of new tools. Using Eos, electron micrographs can be analyzed with the small tools and integration tools. In addition, Eos can be used to develop new tools based on new ideas, because development tools and object-oriented libraries are provided. Examples of implemented small tools for image analysis include three-dimensional reconstruction of objects with helical symmetry, cluster analysis, and contour expression.

  16. Shot Planning and Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  17. Graphical Multiprocessing Analysis Tool (GMAT)

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M.K.; Campbell, S.; Sikora, S.; Strout, R.; Zosel, M.

    1988-03-01

    The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development we have developed two methodologies for the graphical display of execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states, and timeline, which represents tasks as a flowing sequence of events. Information about the code can be displayed as the application runs (dynamic mode) or played back with time under user control (static mode). This document discusses the design and user interface issues involved in developing the parallel application display GMAT family. Also, we present an introductory user's guide for both tools. 4 figs.

  18. ANALYSIS OF THE PROTECTED EXTENSIBLE AUTHENTICATION PROTOCOL

    Directory of Open Access Journals (Sweden)

    Amit Rana

    2012-09-01

    The Internet Engineering Task Force (IETF) has proposed new protocols for highly secured wireless networking. The purpose of this paper is to implement one such proposed security protocol - PEAP (Protected Extensible Authentication Protocol) [1]. PEAP was jointly developed by Microsoft, Cisco and RSA Security. The protocol implementation is done on the server end of a client/server network model on a RADIUS server (Remote Authentication Dial-in User Service). The proposed protocol - PEAP - provides for client identity protection and key generation, thus preventing unauthorized user access and protecting or encrypting the data against malicious activities.

  19. EXTENSION TO THEORY OF TIME SERIES ANALYSIS

    Science.gov (United States)

    … vector-valued processes; analysis of processes with vector arguments; and the influence of finite sample size on the covariance matrices of the least-squares and Markov estimates in regression analysis. (Author)

  20. Extension of an Object-Oriented Optimization Tool: User's Reference Manual

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O³) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O³ tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O³ tool. Instructions for preparing input data for the O³ tool are detailed in this user's manual.

  1. Extensions in model-based system analysis

    OpenAIRE

    Graham, Matthew R.

    2007-01-01

    Model-based system analysis techniques provide a means for determining desired system performance prior to actual implementation. In addition to specifying desired performance, model-based analysis techniques require mathematical descriptions that characterize relevant behavior of the system. The developments of this dissertation give extended formulations for control-relevant model estimation as well as model-based analysis conditions for performance requirements specified as frequency do...

  2. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.

  3. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  4. Image Segmentation and Analysis of Flexion-Extension Radiographs of Cervical Spines

    OpenAIRE

    Eniko T. Enikov; Rein Anton

    2014-01-01

    We present a new analysis tool for cervical flexion-extension radiographs based on machine vision and computerized image processing. The method is based on semiautomatic image segmentation leading to detection of common landmarks such as the spinolaminar (SL) line or contour lines of the implanted anterior cervical plates. The technique allows for visualization of the local curvature of these landmarks during flexion-extension experiments. In addition to changes in the curvature of the SL lin...

  5. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  6. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at completing a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  7. A Meta-Analysis of Extensive Reading Research

    Science.gov (United States)

    Nakanishi, Takayuki

    2015-01-01

    The purposes of this study were to investigate the overall effectiveness of extensive reading, whether learners' age impacts learning, and whether the length of time second language learners engage in extensive reading influences test scores. The author conducted a meta-analysis to answer research questions and to identify future research…

  8. Mapping Extension's Networks: Using Social Network Analysis to Explore Extension's Outreach

    Science.gov (United States)

    Bartholomay, Tom; Chazdon, Scott; Marczak, Mary S.; Walker, Kathrin C.

    2011-01-01

    The University of Minnesota Extension conducted a social network analysis (SNA) to examine its outreach to organizations external to the University of Minnesota. The study found that its outreach network was both broad in its reach and strong in its connections. The study found that SNA offers a unique method for describing and measuring Extension…

  9. ASAP: An Extensible Platform for State Space Analysis

    DEFF Research Database (Denmark)

    Westergaard, Michael; Evangelista, Sami; Kristensen, Lars Michael

    2009-01-01

    The ASCoVeCo State space Analysis Platform (ASAP) is a tool for performing explicit state space analysis of coloured Petri nets (CPNs) and other formalisms. ASAP supports a wide range of state space reduction techniques and is intended to be easy to extend and to use, making it a suitable tool...

  10. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development.
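
    A flavor of the Python interface, following the pattern in the published PyLOOS examples (file names hypothetical; consult the LOOS documentation for the current API):

        import loos
        import loos.pyloos

        model = loos.createSystem("model.pdb")              # topology/coordinates
        calphas = loos.selectAtoms(model, 'name == "CA"')   # selection language
        traj = loos.pyloos.Trajectory("sim.dcd", model)     # frame iterator

        for frame in traj:          # coordinates update in place each frame
            print(calphas.centroid())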

  11. Design and analysis tool validation

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.

    1981-07-01

    The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy that could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts and have been implemented on a sampling of the major BEAS; results have shown major problems in one of the BEAS tested. Furthermore, when one building design was run using several of the BEAS, large differences were found in the predicted annual cooling and heating loads. The empirical validation procedure has been developed, and five two-zone test cells have been constructed for validation; a summer validation run will take place as soon as the data acquisition system is completed. Additionally, a test validation exercise is now in progress using the low-cal house to fine-tune the empirical validation procedure and better define monitoring data requirements.

  12. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity using the incorporation of grey-scale binning acquired by the SEM image was developed. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
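
    The Newton-Raphson updates mentioned for the balance calculations follow the usual scheme x ← x − f(x)/f′(x). A compact sketch on a toy energy balance (made-up constants; not the EERC spreadsheet logic):

        def newton(f, dfdx, x0, tol=1e-8, max_iter=50):
            """Newton-Raphson root finding for a scalar equation f(x) = 0."""
            x = x0
            for _ in range(max_iter):
                step = f(x) / dfdx(x)
                x -= step
                if abs(step) < tol:
                    return x
            raise RuntimeError("did not converge")

        # Toy balance: sensible heat cp*(T - T0) equals released heat q.
        cp, T0, q = 1.1e3, 298.0, 1.65e6    # J/(kg K), K, J/kg
        print(newton(lambda T: cp * (T - T0) - q, lambda T: cp, 1000.0))  # 1798 K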

  13. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis (ICA) in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented as an extension to latent semantic indexing … were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.
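
    A generic illustration of the separation step (scikit-learn's FastICA on a synthetic two-source mixture; the thesis's own ICA implementations and text data are not reproduced here):

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 8, 2000)
        S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # independent sources
        X = S @ np.array([[1.0, 0.5], [0.5, 2.0]]).T      # observed mixtures

        S_est = FastICA(n_components=2, random_state=0).fit_transform(X)
        print(S_est.shape)                                # (2000, 2) sources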

  14. General Mission Analysis Tool (GMAT) Mathematical Specifications

    Science.gov (United States)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  15. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of temporal maximum autocorrelation factor analysis to global monthly mean values of 1996-1997 sea surface temperature (SST) and sea surface height (SSH) data. This type of analysis can be considered as an extension of traditional empirical orthogonal function (EOF) analysis.
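
    Traditional EOF analysis, the baseline being extended here, is an SVD of the space-time anomaly matrix; the temporal maximum autocorrelation factor step itself is not shown in this sketch:

        import numpy as np

        def eof(field):
            """field: (n_times, n_points). Returns spatial patterns,
            principal components, and explained-variance fractions."""
            anom = field - field.mean(axis=0)
            U, s, Vt = np.linalg.svd(anom, full_matrices=False)
            return Vt, U * s, s**2 / np.sum(s**2)

        rng = np.random.default_rng(3)
        sst = rng.random((24, 100))        # 24 monthly maps, 100 grid cells
        patterns, pcs, var = eof(sst)
        print(var[:3])                     # leading modes' variance share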

  16. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data

    Directory of Open Access Journals (Sweden)

    Brinkley James F

    2007-10-01

    Background: Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality to general-purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results: We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization; Java and Java3D for true cross-platform portability; one-click installation and startup; integrated data management to help organize large studies; extensibility through plugins; transparent remote visualization; and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion: MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine.

  17. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    …/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled "Inclusion Analysis to Predict Casting Behavior" was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual-use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

  18. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 3: PEP analysis and tradeoffs

    Science.gov (United States)

    1979-01-01

    The objectives, conclusions, and approaches for accomplishing 19 specific design and analysis activities related to the installation of the power extension package (PEP) into the Orbiter cargo bay are described as well as those related to its deployment, extension, and retraction. The proposed cable handling system designed to transmit power from PEP to the Orbiter by way of the shuttle remote manipulator system is described and a preliminary specification for the gimbal assembly, solar array drive is included.

  19. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, which is a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottleneck of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  20. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (Python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using the internal Levenberg-Marquardt algorithm
    - model calibration using the lmfit package
    - model calibration using the levmar package
    - Markov Chain Monte Carlo using the pymc package
    MATK facilitates model analysis using:
    - scipy: calibration (scipy.optimize)
    - rpy2: Python interface to R
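
    A minimal sketch of the workflow MATK orchestrates (define a model, run a Latin-Hypercube parameter study, then calibrate), written here with plain scipy rather than MATK's own API; the model, bounds, and data are invented for illustration.

        import numpy as np
        from scipy.optimize import least_squares
        from scipy.stats import qmc

        # Invented two-parameter model: y(t) = a * exp(b * t).
        def model(p):
            a, b = p
            t = np.linspace(0.0, 1.0, 20)
            return a * np.exp(b * t)

        # Latin-Hypercube parameter study over assumed bounds.
        lo, hi = [0.5, -2.0], [2.0, 2.0]
        samples = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(50), lo, hi)
        runs = np.array([model(s) for s in samples])  # forward model runs

        # Least-squares calibration (scipy.optimize) against synthetic data.
        data = model([1.3, 0.7]) + 0.01 * np.random.default_rng(0).normal(size=20)
        fit = least_squares(lambda p: model(p) - data, x0=[1.0, 0.0], bounds=(lo, hi))
        print(fit.x)  # recovered parameters, close to (1.3, 0.7)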

  1. 2010 Solar Market Transformation Analysis and Tools

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  2. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  3. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  4. Tools for Basic Statistical Analysis

    Science.gov (United States)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3, ...) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will do an ANOVA to check its significance.
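
    For readers outside Excel, the same calculations take a few lines of Python with scipy; the sketch below mirrors each spreadsheet's computation (with a one-way ANOVA standing in for the two-way versions), using invented sample data rather than the tool's own macros.

        import numpy as np
        from scipy import stats

        x = np.array([4.1, 5.0, 5.2, 6.3, 5.8, 4.9])  # invented data set x(i)

        # Descriptive statistics.
        print(x.mean(), np.median(x), x.std(ddof=1))

        # Value at a given cumulative probability (Normal Distribution Estimates).
        print(stats.norm.ppf(0.95, loc=x.mean(), scale=x.std(ddof=1)))

        # One-way ANOVA across three invented groups (the tool performs two-way ANOVA).
        print(stats.f_oneway([5.1, 5.3, 5.0], [6.0, 6.4, 6.2], [4.8, 4.7, 5.0]))

        # Linear regression y = f(x) with a significance test (Linear Regression-ANOVA).
        t = np.arange(x.size, dtype=float)
        res = stats.linregress(t, x)
        print(res.slope, res.intercept, res.pvalue)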

  5. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    FaridMheir-El-Saadi; BozenaKaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. The circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze the delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.

  6. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  7. Stochastic Simulation Tool for Aerospace Structural Analysis

    Science.gov (United States)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  8. Image Segmentation and Analysis of Flexion-Extension Radiographs of Cervical Spines

    Directory of Open Access Journals (Sweden)

    Eniko T. Enikov

    2014-01-01

    Full Text Available We present a new analysis tool for cervical flexion-extension radiographs based on machine vision and computerized image processing. The method is based on semiautomatic image segmentation leading to detection of common landmarks such as the spinolaminar (SL) line or contour lines of the implanted anterior cervical plates. The technique allows for visualization of the local curvature of these landmarks during flexion-extension experiments. In addition to changes in the curvature of the SL line, it has been found that the cervical plates also deform during flexion-extension examination. While extension radiographs reveal larger curvature changes in the SL line, flexion radiographs on the other hand tend to generate larger curvature changes in the implanted cervical plates. Furthermore, while some lordosis is always present in the cervical plates by design, it actually decreases during extension and increases during flexion. Possible causes of this unexpected finding are also discussed. The described analysis may lead to a more precise interpretation of flexion-extension radiographs, allowing diagnosis of spinal instability and/or pseudoarthrosis in already seemingly fused spines.
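
    As an illustration of the curvature measurement the method relies on, the sketch below computes the discrete curvature of a digitized landmark contour from finite differences; the sampled points are invented, and the paper's actual segmentation pipeline is not reproduced.

        import numpy as np

        # Invented (x, y) samples along a detected landmark such as the SL line.
        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([0.0, 0.4, 0.9, 1.1, 0.9, 0.4])

        # kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2), via finite differences.
        dx, dy = np.gradient(x), np.gradient(y)
        ddx, ddy = np.gradient(dx), np.gradient(dy)
        kappa = np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
        print(kappa)  # local curvature at each sample point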

  9. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  10. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system gives crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
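
    Step 4, merging the cube coordinate systems, amounts to composing rigid transforms; a minimal sketch with invented poses:

        import numpy as np

        # 4x4 homogeneous rigid transform from a rotation matrix and translation.
        def make_transform(R, t):
            T = np.eye(4)
            T[:3, :3] = R
            T[:3, 3] = t
            return T

        # T_12 maps cube-2 coordinates into cube 1's frame, T_23 maps cube 3
        # into cube 2's frame; composing them puts cube-3 points in the global
        # (cube-1) frame. Rotations and translations here are invented.
        T_12 = make_transform(np.eye(3), [2.0, 0.0, 0.0])
        T_23 = make_transform(np.eye(3), [0.0, 1.5, 0.0])
        T_13 = T_12 @ T_23

        p_cube3 = np.array([0.1, 0.2, 0.0, 1.0])  # point measured in cube-3 frame
        print((T_13 @ p_cube3)[:3])               # same point in the global frame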

  11. Teaching Methods and Tools Used In Food Safety Extension Education Programs in the North Central Region of the United States

    Directory of Open Access Journals (Sweden)

    Robert A. Martin

    2011-09-01

    Full Text Available One of the ways to ensure food safety is to educate the public. Of the organizations providing food safety education in the United States (U.S.), the Cooperative Extension System (CES) is one of the most reliable. The effectiveness of CES programs depends not only on what is being taught but also on how it is taught. Both a needs-based curriculum and how that curriculum is delivered are equally important. This descriptive cross-sectional study, using a disproportional stratified random sample, identified the teaching methods and tools being used by food safety extension educators of the CES of the North Central Region (NCR). A Likert-type scale administered to extension educators revealed that they were adopting a balanced use of teaching methods and tools, and using learner-centered teaching methods in their programs. However, distance education, case studies and podcasts, which are commonly used in education programs, were not being used extensively. We recommend that food safety extension educators of the NCR should increase the use of these two teaching methods and one tool while continuing to use the current ones. This study has implications for improving food safety education delivery to clients in the NCR and for designing in-service education for food safety extension educators.

  12. Accuracy Analysis and Calibration of Gantry Hybrid Machine Tool

    Institute of Scientific and Technical Information of China (English)

    唐晓强; 李铁民; 尹文生; 汪劲松

    2003-01-01

    The kinematic accuracy is a key factor in the design of parallel or hybrid machine tools. This analysis improved the accuracy of a 4-DOF (degree of freedom) gantry hybrid machine tool based on a 3-DOF planar parallel manipulator by compensating for various positioning errors. The machine tool architecture was described with the inverse kinematic solution. The control parameter error model was used to analyze the accuracy of the 3-DOF planar parallel manipulator and to develop a kinematic calibration method. The experimental results prove that the calibration method reduces the cutter nose errors from ±0.50 mm to ±0.03 mm for a horizontal movement of 600 mm by compensating for errors in the slider home position, the guide way distance and the extensible strut home position. The calibration method will be useful for similar types of parallel kinematic machines.

  13. Performance analysis of GYRO: a tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Candy, J [General Atomics, PO Box 85608, San Diego, CA 92186-5608 (United States); Carrington, L [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Huck, K [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Kaiser, T [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Mahinthakumar, G [Department of Civil Engineering, North Carolina State University, Raleigh, NC 27695-7908 (United States); Malony, A [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Moore, S [Innovative Computing Laboratory, University of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Reed, D [Renaissance Computing Institute, University of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States); Roth, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Shan, H [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Shende, S [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Snavely, A [San Diego Supercomputer Center, Univ. of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Sreepathi, S [Dept. of Computer Science, North Carolina State Univ., Raleigh, NC 27695-7908 (United States); Wolf, F [Innovative Computing Lab., Univ. of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Zhang, Y [Renaissance Computing Inst., Univ. of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States)

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  14. Performance Analysis of GYRO: A Tool Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  15. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe ... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  16. Cost Effectiveness Ratio: Evaluation Tool for Comparing the Effectiveness of Similar Extension Programs

    Science.gov (United States)

    Jayaratne, K. S. U.

    2015-01-01

    Extension educators have been challenged to be cost effective in their educational programming. The cost effectiveness ratio is a versatile evaluation indicator for Extension educators to compare the cost of achieving a unit of outcomes or educating a client in similar educational programs. This article describes the cost effectiveness ratio and…

  17. General Analysis Tool Box for Controlled Perturbation

    CERN Document Server

    Osbild, Ralf

    2012-01-01

    The implementation of reliable and efficient geometric algorithms is a challenging task. The reason is the following conflict: On the one hand, computing with rounded arithmetic may compromise the reliability of programs while, on the other hand, computing with exact arithmetic may be too expensive and hence inefficient. One solution is the implementation of controlled perturbation algorithms which combine the speed of floating-point arithmetic with a protection mechanism that guarantees reliability, nonetheless. This paper is concerned with the performance analysis of controlled perturbation algorithms in theory. We address this problem with the presentation of a general analysis tool box. This tool box is separated into independent components which are presented individually with their interfaces. This way, the tool box supports alternative approaches for the derivation of the most crucial bounds. We present three approaches for this task. Furthermore, we have thoroughly reworked the concept of controlled per...

  18. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  19. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    Directory of Open Access Journals (Sweden)

    Aitman T

    2008-11-01

    Full Text Available Abstract Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data

  20. Extensions of nonlinear error propagation analysis for explicit pseudodynamic testing

    Institute of Scientific and Technical Information of China (English)

    Shuenn-Yih Chang

    2009-01-01

    Two important extensions of a technique to perform a nonlinear error propagation analysis for an explicit pseudodynamic algorithm (Chang, 2003) are presented. One extends the stability study from a given time step to a complete step-by-step integration procedure. It is analytically proven that ensuring stability conditions in each time step leads to a stable computation of the entire step-by-step integration procedure. The other extension shows that the nonlinear error propagation results, which are derived for a nonlinear single degree of freedom (SDOF) system, can be applied to a nonlinear multiple degree of freedom (MDOF) system. This application is dependent upon the determination of the natural frequencies of the system in each time step, since all the numerical properties and error propagation properties in the time step are closely related to these frequencies. The results are derived from the step degree of nonlinearity. An instantaneous degree of nonlinearity is introduced to replace the step degree of nonlinearity and is shown to be easier to use in practice. The extensions can be also applied to the results derived from a SDOF system based on the instantaneous degree of nonlinearity, and hence a time step might be appropriately chosen to perform a pseudodynamic test prior to testing.

  1. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  2. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
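
    The abstract does not name its open-source BBN library, so the sketch below uses pgmpy (class names as of the pgmpy 0.1.x series) to show the kind of evidence-based update such a tool performs: a hidden volcanic state drives an observable monitoring signal, and observing the signal revises belief in the state. The structure and probabilities are invented.

        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        # Two-node toy BBN: hidden volcanic unrest -> observable seismicity.
        bbn = BayesianNetwork([("Unrest", "Seismicity")])
        bbn.add_cpds(
            TabularCPD("Unrest", 2, [[0.9], [0.1]]),  # P(quiet)=0.9, P(unrest)=0.1
            TabularCPD("Seismicity", 2,
                       [[0.95, 0.30],   # P(low seismicity | quiet/unrest)
                        [0.05, 0.70]],  # P(high seismicity | quiet/unrest)
                       evidence=["Unrest"], evidence_card=[2]),
        )
        assert bbn.check_model()

        # Belief in unrest after observing high seismicity.
        posterior = VariableElimination(bbn).query(["Unrest"],
                                                   evidence={"Seismicity": 1})
        print(posterior)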

  3. Applying New Diabetes Teaching Tools in Health-Related Extension Programming

    Science.gov (United States)

    Grenci, Alexandra

    2010-01-01

    In response to the emerging global diabetes epidemic, health educators are searching for new and better education tools to help people make positive behavior changes to successfully prevent or manage diabetes. Conversation Maps[R] are new learner-driven education tools that have been developed to empower individuals to improve their health…

  4. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of uncontrolled spacecraft may be broken into many pieces of debris at an altitude in the range of 75-85 km. The surviving fragments could pose great hazard and risk to ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and ground risk assessment have been studied and developed in the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on this topic of debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and parameters affecting survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. The past object-oriented tools include objects of only simple shapes. For more realistic simulation, here we present an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types, as well as 51 predefined motions and relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.

  5. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  6. Use of Interactive Electronic Audience Response Tools (Clickers) to Evaluate Knowledge Gained in Extension Programming

    Science.gov (United States)

    Gunn, Patrick; Loy, Dan

    2015-01-01

    Effectively measuring short-term impact, particularly a change in knowledge resulting from Extension programming, can prove to be challenging. Clicker-based technology, when used properly, is one alternative that may allow educators to better evaluate this aspect of the logic model. While the potential interface between clicker technology and…

  7. Animal Agriculture in a Changing Climate Online Course: An Effective Tool for Creating Extension Competency

    Science.gov (United States)

    Whitefield, Elizabeth; Schmidt, David; Witt-Swanson, Lindsay; Smith, David; Pronto, Jennifer; Knox, Pam; Powers, Crystal

    2016-01-01

    There is a need to create competency among Extension professionals on the topic of climate change adaptation and mitigation in animal agriculture. The Animal Agriculture in a Changing Climate online course provides an easily accessible, user-friendly, free, and interactive experience for learning science-based information on a national and…

  8. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
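
    As a flavor of what a divergence statistics engine computes, the snippet below measures the Kullback-Leibler divergence between an observed histogram and an "ideal" model distribution; this is one plausible divergence measure, the report's exact definition may differ, and the data are invented.

        import numpy as np
        from scipy.stats import entropy

        observed = np.array([12.0, 35.0, 28.0, 15.0, 10.0])   # empirical counts
        ideal = np.array([0.10, 0.30, 0.30, 0.20, 0.10])      # theoretical model

        # Kullback-Leibler divergence D(observed || ideal); 0 iff the
        # distributions coincide, growing as they diverge.
        print(entropy(observed / observed.sum(), ideal))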

  9. A Divergence Statistics Extension to VTK for Performance Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.

  10. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 2: PEP

    Science.gov (United States)

    1979-01-01

    User power, duration, and orbit requirements, which were the prime factors influencing power extension package (PEP) design, are discussed. A representative configuration of the PEP concept is presented and the major elements of the system are described as well as the PEP-to-Orbiter and remote manipulator interface provisions.

  11. RSAT 2015: Regulatory Sequence Analysis Tools.

    Science.gov (United States)

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  12. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive mounted at the front, at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
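
    The abstract does not give its removal model, but a standard first-order choice in the polishing literature is Preston's equation, removal rate = k * P * V; the sketch below fits the Preston coefficient to invented measurements to show the fitting idea, not the paper's actual empirical model.

        import numpy as np

        # Invented measurements: contact pressure P [Pa], relative speed V [m/s],
        # and measured removal rate [um/min].
        P = np.array([10e3, 20e3, 20e3, 40e3])
        V = np.array([0.5, 0.5, 1.0, 1.0])
        removal = np.array([0.8, 1.7, 3.3, 6.5])

        # Least-squares fit of removal = k * P * V (Preston's equation).
        k, *_ = np.linalg.lstsq((P * V)[:, None], removal, rcond=None)
        print("Preston coefficient k:", k[0])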

  13. CAD Extensions and Other Refinements to the LOCATE Workplace Layout Tool

    Science.gov (United States)

    2000-05-01

    ... the framework of the work of the Canadian Regional Operations Centre. The tool could also make it possible to plan the layout of offices and factories ... script to ease the installation process and a port of the LOCATE Workspace Layout Tool to the PC. There are several items of the former type, not

  14. A review of ADM1 extensions, applications, and analysis: 2002-2005.

    Science.gov (United States)

    Batstone, D J; Keller, J; Steyer, J P

    2006-01-01

    Since publication of the Scientific and Technical Report (STR) describing the ADM1, the model has been extensively used, and analysed in both academic and practical applications. Adoption of the ADM1 in popular systems analysis tools such as the new wastewater benchmark (BSM2), and its use as a virtual industrial system can stimulate modelling of anaerobic processes by researchers and practitioners outside the core expertise of anaerobic processes. It has been used as a default structural element that allows researchers to concentrate on new extensions such as sulfate reduction, and new applications such as distributed parameter modelling of biofilms. The key limitations for anaerobic modelling originally identified in the STR were: (i) regulation of products from glucose fermentation, (ii) parameter values, and variability, and (iii) specific extensions. Parameter analysis has been widespread, and some detailed extensions have been developed (e.g., sulfate reduction). A verified extension that describes regulation of products from glucose fermentation is still limited, though there are promising fundamental approaches. This is a critical issue, given the current interest in renewable hydrogen production from carbohydrate-type waste. Critical analysis of the model has mainly focused on model structure reduction, hydrogen inhibition functions, and the default parameter set recommended in the STR. This default parameter set has largely been verified as a reasonable compromise, especially for wastewater sludge digestion. One criticism of note is that the ADM1 stoichiometry focuses on catabolism rather than anabolism. This means that inorganic carbon can be used unrealistically as a carbon source during some anabolic reactions. Advances and novel applications have also been made in the present issue, which focuses on the ADM1. These papers also explore a number of novel areas not originally envisaged in this review.

  15. Extension of an Object Oriented Multidisciplinary Analysis Optimization (MDAO) Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Multidisciplinary design, analysis, and optimization (MDAO) tools today cover only a limited set of disciplines, with little high-fidelity modeling capability. These tools are...

  16. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perspectives on the application of data and scientific techniques, and to handle multiple simultaneous user tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  17. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services that allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte-Carlo simulations (MC). Several tools are considered: a ROOT files filter, a JavaScript Multivariable Cross-Filter, a JavaScript ROOT browser and JavaScript Scatter-Matrix Libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  18. Stacks: an analysis tool set for population genomics.

    Science.gov (United States)

    Catchen, Julian; Hohenlohe, Paul A; Bassham, Susan; Amores, Angel; Cresko, William A

    2013-06-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics.
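
    The sliding-window smoothing described can be pictured as a kernel-weighted average of a per-SNP statistic along the genome; the sketch below is an illustrative Gaussian-kernel version with invented positions and Fst values, not Stacks' exact kernel or bandwidth.

        import numpy as np

        def smooth_stat(positions, values, centers, sigma=150_000.0):
            # Kernel-weighted sliding-window average of a per-SNP statistic
            # (e.g., Fst) along a chromosome.
            out = np.empty(len(centers))
            for i, c in enumerate(centers):
                w = np.exp(-0.5 * ((positions - c) / sigma) ** 2)
                out[i] = np.sum(w * values) / np.sum(w)
            return out

        pos = np.array([0.1e6, 0.4e6, 0.5e6, 1.2e6, 1.3e6])  # invented SNP positions
        fst = np.array([0.02, 0.15, 0.12, 0.40, 0.35])       # invented per-SNP Fst
        centers = np.linspace(0.0, 1.5e6, 7)                 # window centers
        print(smooth_stat(pos, fst, centers))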

  19. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system ... of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and of the ability to make autonomous recovery should faults occur.
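
    The structure graph idea can be sketched with a toy bipartite graph of constraints versus unknown variables: a maximum matching that covers every unknown indicates the unknowns are structurally calculable, while unmatched variables flag parts of the system where faults cannot be isolated. The graph below is invented, and the sketch is not SaTool itself.

        import networkx as nx
        from networkx.algorithms import bipartite

        # Toy structure graph: constraint nodes c1, c2 vs. unknown variables x, y, z.
        # An edge means "the variable appears in the constraint"; all edges invented.
        g = nx.Graph()
        g.add_nodes_from(["c1", "c2"], bipartite=0)
        g.add_nodes_from(["x", "y", "z"], bipartite=1)
        g.add_edges_from([("c1", "x"), ("c2", "x"), ("c2", "y")])

        # Unmatched variables after a maximum matching flag undiagnosable parts.
        m = bipartite.maximum_matching(g, top_nodes=["c1", "c2"])
        print("unmatched:", [v for v in ["x", "y", "z"] if v not in m])  # ['z']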

  20. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation of the power tool market, both in Russia and worldwide. It presents a comparative analysis of competitors, an analysis of the structure of the power tool market, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by companies selling tools, including a competitive analysis of the Bosch range, the leader in its segment among power tools available on the Russian market.

  1. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis, which can be implemented mainly in small and medium enterprises that are not yet convinced to invest in setup improvement. The methodology was developed after research that identified the problem. Companies still have difficulties with long setup times, and many of them do nothing to decrease these times. A long setup alone is not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The proposed methodology can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups. The goal is to convince the management to begin actions concerning setup improvement. The last three steps are related to a particular setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others, were used.

  2. A Discrete Event Simulator for Extensive Defense Mechanism for Denial of Service Attacks Analysis

    Directory of Open Access Journals (Sweden)

    Maryam Tanha

    2012-01-01

    Full Text Available Problem statement: The search for defense mechanisms against low-rate Denial of Service (DoS) attacks, a new generation of DoS attacks, has received special attention during recent years. As a decisive factor, evaluating the performance of the offered mitigation techniques based on different metrics, in order to determine the viability and ability of these countermeasures, requires more research. Approach: The development of a new generalized discrete event simulator is deliberated in detail. The research conducted places high emphasis on the benefits of creating a customized discrete event simulator for the analysis of security and, in particular, of DoS attacks. The simulator occupies a niche in terms of its small scale, low execution time, portability and ease of use. The attributes and mechanism of the developed simulator are complemented with the proposed framework. Results: The simulator has been extensively evaluated and has proven to be an ideal tool for the analysis and exploration of DoS attacks. It enables in-depth analysis for creating multitudes of defense mechanisms against HTTP low-rate DoS attacks. The acquired results from the simulation tool have been compared against a simulator from the same domain. This enables the validation of the developed simulator using selected performance metrics, including mean in-system time, average delay and average buffer size. Conclusion: The proposed simulator serves as an efficient and scalable performance analysis tool for the analysis of HTTP low-rate DoS attack defense mechanisms. Future work can encompass the development of discrete event simulators for the analysis of other security issues such as Intrusion Detection Systems.
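
    A toy discrete event simulation in the same spirit, a single-server request queue instrumented for the mean in-system time metric mentioned above, can be written in a few lines with simpy; arrival and service rates are invented, and this is not the authors' simulator.

        import random
        import simpy

        def source(env, server, delays, rate, n):
            # Poisson request arrivals.
            for _ in range(n):
                yield env.timeout(random.expovariate(rate))
                env.process(request(env, server, delays))

        def request(env, server, delays):
            arrive = env.now
            with server.request() as req:
                yield req                                    # wait in the buffer
                yield env.timeout(random.expovariate(50.0))  # service time
            delays.append(env.now - arrive)                  # in-system time

        random.seed(0)
        env = simpy.Environment()
        server = simpy.Resource(env, capacity=1)
        delays = []
        env.process(source(env, server, delays, rate=30.0, n=2000))
        env.run()
        print("mean in-system time:", sum(delays) / len(delays))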

  3. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  4. Small-Body Extensions for the Satellite Orbit Analysis Program (SOAP)

    Science.gov (United States)

    Carnright, Robert; Stodden, David; Coggi, John

    2008-01-01

    An extension to the SOAP software allows users to work with tri-axial ellipsoid-based representations of planetary bodies, primarily for working with small, natural satellites, asteroids, and comets. SOAP is a widely used tool for the visualization and analysis of space missions. The small body extension provides the same visualization and analysis constructs for use with small bodies. These constructs allow the user to characterize satellite path and instrument cover information for small bodies in both 3D display and numerical output formats. Tri-axial ellipsoids are geometric shapes the diameters of which are different in each of three principal x, y, and z dimensions. This construct provides a better approximation than using spheres or oblate spheroids (ellipsoids comprising two common equatorial diameters as a distinct polar diameter). However, the tri-axial ellipsoid is considerably more difficult to work with from a modeling perspective. In addition, the SOAP small-body extensions allow the user to actually employ a plate model for highly irregular surfaces. Both tri-axial ellipsoids and plate models can be assigned to coordinate frames, thus allowing for the modeling of arbitrary changes to body orientation. A variety of features have been extended to support tri-axial ellipsoids, including the computation and display of the spacecraft sub-orbital point, ground trace, instrument footprints, and swathes. Displays of 3D instrument volumes can be shown interacting with the ellipsoids. Longitude/latitude grids, contour plots, and texture maps can be displayed on the ellipsoids using a variety of projections. The distance along an arbitrary line of sight can be computed between the spacecraft and the ellipsoid, and the coordinates of that intersection can be plotted as a function of time. The small-body extension supports the same visual and analytical constructs that are supported for spheres and oblate spheroids in SOAP making the implementation of the more
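
    The line-of-sight computation against a tri-axial ellipsoid reduces to scaling the problem into a unit sphere and solving a quadratic in the ray parameter; a minimal sketch with invented geometry, not SOAP's implementation:

        import numpy as np

        def los_ellipsoid_distance(origin, direction, radii):
            # Distance along a unit line-of-sight vector from `origin` to the
            # tri-axial ellipsoid x^2/a^2 + y^2/b^2 + z^2/c^2 = 1, or None if
            # the ray misses. Dividing by the radii maps the body to a unit
            # sphere while preserving the ray parameter t.
            o = np.asarray(origin, float) / radii
            d = np.asarray(direction, float) / radii
            a, b, c = d @ d, 2.0 * (o @ d), o @ o - 1.0
            disc = b * b - 4.0 * a * c
            if disc < 0.0:
                return None
            t = (-b - np.sqrt(disc)) / (2.0 * a)
            return t if t >= 0.0 else None

        # Spacecraft 1000 km from the surface of a 300 x 200 x 150 km body,
        # looking straight down the x axis.
        radii = np.array([300.0, 200.0, 150.0])
        print(los_ellipsoid_distance([1300.0, 0.0, 0.0], [-1.0, 0.0, 0.0], radii))
        # -> 1000.0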

  5. Using the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. The talk combines existing presentations and material: the system user guide and technical documentation; a GMAT basics overview; and technical presentations from the TESS project on its application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the TESS project, with SBU and ITAR data removed. Slides for navigation and optimal control are borrowed from system documentation and training material.

  6. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics such as single- and multiple-point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  7. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High-risk attack paths, those best suited to the application of attack countermeasures when resources for applying countermeasures are limited, are identified by finding "epsilon optimal paths."
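
    The abstract does not include an implementation, but the data structure it describes maps naturally onto weighted-graph shortest-path search. The sketch below builds a toy attack graph (node names and effort weights are invented) and finds the minimum-effort attack path with Dijkstra's algorithm; the patent's "epsilon optimal paths" generalize this to near-optimal paths.

        import heapq

        # Toy attack graph: nodes are attack states; weighted edges are single
        # attacker actions, with weight standing in for attacker effort.
        graph = {
            "start":        [("scanned", 1.0), ("phished", 3.0)],
            "scanned":      [("web_shell", 4.0)],
            "phished":      [("user_creds", 1.0)],
            "user_creds":   [("web_shell", 2.0), ("domain_admin", 6.0)],
            "web_shell":    [("domain_admin", 3.0)],
            "domain_admin": [],
        }

        def min_effort_path(graph, source, goal):
            """Dijkstra's algorithm: the lowest-total-effort attack path."""
            queue = [(0.0, source, [source])]
            settled = {}
            while queue:
                cost, node, path = heapq.heappop(queue)
                if node == goal:
                    return cost, path
                if settled.get(node, float("inf")) <= cost:
                    continue
                settled[node] = cost
                for nxt, effort in graph[node]:
                    heapq.heappush(queue, (cost + effort, nxt, path + [nxt]))
            return None

        print(min_effort_path(graph, "start", "domain_admin"))
        # -> (8.0, ['start', 'scanned', 'web_shell', 'domain_admin'])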

  8. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    The applications of ultrasonic vibrations for material removal processes exist predominantly in the area of vertical processing of hard and brittle materials, because the power generated by vertical vibrating oscillators produces the greatest direct penetration for material removal from workpieces by abrasive grains. However, for milling processes, vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to make use of ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for the design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to generate the best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  9. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  10. Timeline analysis tools for law enforcement

    Science.gov (United States)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation, and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User-funded enhancements and Rome Lab-funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, the results of the initial NYSP evaluation, and the plan for a more comprehensive NYSP evaluation.

  11. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  12. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  13. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  14. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
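
    As a rough illustration of the workflow the text describes, the sketch below integrates a (constant, invented) 1 MeV-equivalent electron flux history into a mission fluence and applies the classic semi-empirical degradation curve P/P0 = 1 - C*log10(1 + phi/phi_x). That curve form is a commonly used fit for solar cell degradation; the constants here are illustrative, and this is not SAVANT's actual model chain.

        import numpy as np

        def end_of_life_power_fraction(flux, dt, C=0.1, phi_x=5e13):
            """Integrate a 1 MeV-equivalent electron flux history (e-/cm^2/s,
            one sample per time step of length dt seconds) into a mission
            fluence, then apply P/P0 = 1 - C*log10(1 + phi/phi_x).
            C and phi_x are cell-dependent fit constants (values invented)."""
            fluence = np.sum(flux) * dt                    # e-/cm^2 over the mission
            return 1.0 - C * np.log10(1.0 + fluence / phi_x)

        # Hypothetical 5-year mission sampled daily at a constant trapped flux.
        flux = np.full(5 * 365, 1e7)                       # e-/cm^2/s, 1 MeV equivalent
        print(f"P/P0 at end of life: {end_of_life_power_fraction(flux, 86400.0):.2f}")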

  15. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  16. GATA: a graphic alignment tool for comparative sequence analysis

    Directory of Open Access Journals (Sweden)

    Nix David A

    2005-01-01

    Full Text Available Abstract Background Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution yet the enhancer retains its activity. Dot plot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments. Results To address some of these issues, we created a stand-alone, platform-independent, graphic alignment tool for comparative sequence analysis (GATA; http://gata.sourceforge.net/). GATA uses the NCBI-BLASTN program and extensive post-processing to identify all small sub-alignments above a low cut-off score. These are graphed as two shaded boxes, one for each sequence, connected by a line using the coordinate system of their parent sequence. Shading and colour are used to indicate score and orientation. A variety of options exist for querying, modifying and retrieving conserved sequence elements. Extensive gene annotation can be added to both sequences using a standardized General Feature Format (GFF) file. Conclusions GATA uses the NCBI-BLASTN program in conjunction with post-processing to exhaustively align two DNA

  17. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    Full Text Available We present an “Integrated Protein Analysis Tool” (IPAT) that is able to perform the following tasks in segregating and annotating genomic data. Protein Editor: enables the entry of nucleotide/amino acid sequences. Utilities: IPAT enables conversion of a given nucleotide sequence to the equivalent amino acid sequence. Secondary Structure Prediction: available using three algorithms (GOR-I, the Gibrat Method, and DPM (Double Prediction Method)), with graphical display. Profiles and Properties: allows calculating eight physico-chemical profiles and properties, viz. Hydrophobicity, Hydrophilicity, Antigenicity, Transmembranous regions, Solvent Accessibility, Molecular Weight, Absorption factor and Amino Acid Content. IPAT has a provision for viewing a Helical-Wheel Projection of a selected region of a given protein sequence and a 2D representation of the alpha-carbon backbone. IPAT was developed using the UML (Unified Modeling Language) for modeling the project elements, coded in Java, and subjected to unit testing, path testing, and integration testing. This project mainly concentrates on Butyrylcholinesterase, to predict its secondary structure and its physico-chemical profiles and properties.
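
    One of the profiles listed above, hydrophobicity, is conventionally computed as a sliding-window average over a hydropathy scale. The Python sketch below uses the standard Kyte-Doolittle values; it is an independent illustration, not IPAT's Java implementation.

        # Sliding-window hydrophobicity profile using the Kyte-Doolittle scale.
        KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
              "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
              "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
              "Y": -1.3, "V": 4.2}

        def hydropathy_profile(seq, window=9):
            """Mean Kyte-Doolittle hydropathy over each window position;
            peaks suggest hydrophobic (e.g. transmembrane) regions."""
            seq = seq.upper()
            return [sum(KD[aa] for aa in seq[i:i + window]) / window
                    for i in range(len(seq) - window + 1)]

        print(hydropathy_profile("MKTLLILAVVAAALA", window=5))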

  18. Analysis of extensive air showers with the hybrid code SENECA

    CERN Document Server

    Ortiz, J A; Medina-Tanco, G; Ortiz, Jeferson A.; Souza, Vitor de; Medina-Tanco, Gustavo

    2005-01-01

    The ultrahigh energy tail of the cosmic ray spectrum has been explored with unprecedented detail. For this reason, new experiments are exerting severe pressure on extensive air shower modeling. Detailed, fast codes are needed in order to extract and understand the richness of information now available. In this sense we explore the potential of SENECA, an efficient hybrid three-dimensional simulation code, as a valid practical alternative to full Monte Carlo simulations of extensive air showers generated by ultrahigh energy cosmic rays. We discuss the influence of this approach on the main longitudinal characteristics of proton-, iron nucleus-, and gamma-induced air showers for different hadronic interaction models. We also show comparisons of our predictions with those of the CORSIKA code.

  19. Analysis of extensive air showers with the hybrid code SENECA

    Science.gov (United States)

    Ortiz, Jeferson A.; de Souza, Vitor; Medina-Tanco, Gustavo

    The ultrahigh energy tail of the cosmic ray spectrum has been explored with unprecedented detail. For this reason, new experiments are exerting severe pressure on extensive air shower modeling. Detailed, fast codes are needed in order to extract and understand the richness of information now available. In this sense we explore the potential of SENECA, an efficient hybrid three-dimensional simulation code, as a valid practical alternative to full Monte Carlo simulations of extensive air showers generated by ultrahigh energy cosmic rays. We discuss the influence of this approach on the main longitudinal characteristics of proton-, iron nucleus-, and gamma-induced air showers for different hadronic interaction models. We also show comparisons of our predictions with those of the CORSIKA code.

  20. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  1. Meta-analysis of Factors Affecting Occupational and Professional Performance of Agricultural Extension Agents

    Directory of Open Access Journals (Sweden)

    Mahshid Bahadori

    2016-03-01

    Full Text Available The purpose of this research is to conduct a meta-analysis of studies on the factors affecting the occupational and professional performance of agricultural extension agents, in order to integrate the results of independently conducted research and obtain more accurate and more cohesive results. To achieve this goal, 177 studies on occupational and professional performance (occupational performance, professional performance, professional competence, professional development) were collected from the Magiran and SID databases, and among them 10 studies were selected for review and meta-analysis using a "checklist of technical and methodological research" (covering appropriate reliability and validity, and correct statistical and sampling methods). The results showed that professional features have the greatest impact on the professional and occupational performance of agricultural extension agents. As well, skills and technical competence, the number of field visits, membership of organizations, participation in in-service training courses and access to educational facilities have a high impact on occupational and professional performance. The researchers confirmed these results.

  2. Weeds: a CLASS extension for the analysis of millimeter and sub-millimeter spectral surveys

    CERN Document Server

    Maret, S; Pety, J; Bardeau, S; Reynier, E

    2010-01-01

    The advent of large instantaneous bandwidth receivers and high spectral resolution spectrometers on (sub-)millimeter telescopes has opened up the possibilities for unbiased spectral surveys. Because of the large amount of data they contain, any analysis of these surveys requires dedicated software tools. Here we present an extension of the widely used CLASS software that we developed for that purpose. This extension, named Weeds, allows for searches in atomic and molecular line databases (e.g. JPL or CDMS) that may be accessed over the internet using a virtual observatory (VO) compliant protocol. The package permits quick navigation across a spectral survey to search for lines of a given species. Weeds is also capable of modeling a spectrum, as often needed for line identification. We expect that Weeds will be useful for analyzing and interpreting the spectral surveys that will be done with the HIFI instrument on board Herschel, but also observations carried out with ground based millimeter and sub-millimet...

  3. Link between extension, dyking and subsidence as the reconstruction tool of intraplate rifting mechanism (backstripping data, modelling and geochronology)

    Science.gov (United States)

    Polyansky, Oleg P.; Reverdatto, Vladimir V.; Babichev, Alexey V.

    2014-05-01

    Correlation between subsidence and extension-related magmatism is key in determining the mechanism of intracratonic sedimentary basin formation. The total volume of basic sheet intrusions and volcanics within the sedimentary rock mass indirectly characterizes the degree of depletion and thinning of the rifted mantle lithosphere. At present the documented features of real-world intracontinental basins show a wide range of parameters characterizing the duration and rate of subsidence, the degree of extension/thinning of the lithosphere, and the age and extent of dyking. For creation of a general model of continental rifting it is important to reconstruct the evolution of basins that finished at the continental stage and did not enter an oceanic spreading phase. One example of such a structure is the Vilyui sedimentary basin in the eastern Siberian Platform, which includes massive emplacements (~10^5 km^3) of extrusive and intrusive rocks of the Vilyui large igneous province. We combine backstripping reconstructions of sedimentation and thermal regime during the subsidence with numerical modelling based on the mechanics of deformable solids. It is the first time that the evolution of sedimentation and subsidence, which is nonuniform over the basin area, has been analyzed for the Vilyui basin. The rift origin of the basin is proved. We estimate the spatial distribution of the parameters of crustal and mantle-lithosphere extension as well as expansion due to dike intrusions. According to the reconstructions, the type of subsidence curves for the sedimentary rocks of the basin depends on the tectonic regime of sedimentation in individual subbasins. The backstripping analysis revealed two stages of extension (sediments 4-5 km thick) and a foreland stage (sediments >2 km thick). With the two-layered lithosphere model, we concluded that the subcrustal layer underwent predominant extension (by a factor of 1.2-2.0 vs. 1.1-1.4 in the crust). In each section, dyke-related extension due to basic intrusion is

  4. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  5. Extensions of positive definite functions applications and their harmonic analysis

    CERN Document Server

    Jorgensen, Palle; Tian, Feng

    2016-01-01

    This monograph deals with the mathematics of extending given partial data-sets obtained from experiments. Experimentalists frequently gather spectral data when the observed data is limited, e.g., by the precision of instruments, or by other limiting external factors. Here the limited information is a restriction, and the extensions take the form of full positive definite functions on some prescribed group. It is therefore both an art and a science to produce solid conclusions from restricted or limited data. While the theory of positive definite functions is important in many areas of pure and applied mathematics, it is difficult for students and for the novice to the field to find accessible presentations which cover all relevant points of view, as well as stressing common ideas and interconnections. We have aimed at filling this gap, and we have stressed hands-on examples.

  6. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a media for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistleblowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, and to remove any contents that may have been hidden, and any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, that highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and make them available for the general public. The results presented in this work can also be seen as a useful

  7. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....
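
    Canonical correlation analysis finds paired linear combinations of two multivariate data sets that are maximally correlated. A minimal numpy sketch of classical CCA is given below on synthetic data with one shared mode, loosely analogous to a coupled SST/SSH signal; it illustrates the method, not the paper's computation.

        import numpy as np

        def canonical_correlations(X, Y, reg=1e-8):
            """Classical CCA: canonical correlations between the columns of
            X (n x p) and Y (n x q), via whitening and an SVD."""
            X = X - X.mean(axis=0)
            Y = Y - Y.mean(axis=0)
            n = X.shape[0]
            Cxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
            Cyy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
            Cxy = X.T @ Y / (n - 1)
            Wx = np.linalg.inv(np.linalg.cholesky(Cxx))   # whitens the X block
            Wy = np.linalg.inv(np.linalg.cholesky(Cyy))   # whitens the Y block
            return np.linalg.svd(Wx @ Cxy @ Wy.T, compute_uv=False)

        rng = np.random.default_rng(0)
        shared = rng.normal(size=(200, 1))               # one common mode
        X = shared + 0.5 * rng.normal(size=(200, 3))     # "SST-like" field
        Y = shared + 0.5 * rng.normal(size=(200, 2))     # "SSH-like" field
        print(canonical_correlations(X, Y))              # leading value near 0.9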

  8. Muscle functional MRI analysis of trunk muscle recruitment during extension exercises in asymptomatic individuals.

    Science.gov (United States)

    De Ridder, E M D; Van Oosterwijck, J O; Vleeming, A; Vanderstraeten, G G; Danneels, L A

    2015-04-01

    The present study examined the activity levels of the thoracic and lumbar extensor muscles during different extension exercise modalities in healthy individuals. Therefore, 14 subjects performed four different types of extension exercises in prone position: dynamic trunk extension, dynamic-static trunk extension, dynamic leg extension, and dynamic-static leg extension. Pre- and post-exercise muscle functional magnetic resonance imaging scans from the latissimus dorsi, the thoracic and lumbar parts of the longissimus, iliocostalis, and multifidus were performed. Differences in water relaxation values (T2-relaxation) before and after exercise were calculated (T2-shift) as a measure of muscle activity and compared between extension modalities. Linear mixed-model analysis revealed higher lumbar extensor activity during trunk extension compared with leg extension (T2-shift of 5.01 ms and 3.55 ms, respectively) and during the dynamic-static exercise performance compared with the dynamic exercise performance (T2-shift of 4.77 ms and 3.55 ms, respectively). No significant differences in the thoracic extensor activity between the exercises could be demonstrated. During all extension exercises, the latissimus dorsi was the least activated compared with the paraspinal muscles. While all extension exercises are equivalent effective to train the thoracic muscles, trunk extension exercises performed in a dynamic-static way are the most appropriate to enhance lumbar muscle strength.

  9. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in case of large object-oriented programs due to the size, complexity, and implicit character...... of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis...

  10. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  11. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  12. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
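
    The FDTD method the paper applies marches electric and magnetic fields forward in time on a staggered grid. The following Python sketch shows the leapfrog update pattern for a 1D Cartesian toy problem with a Gaussian (UWB-style) pulse source; the paper's solver uses spherical-coordinate equations, which this illustration does not reproduce.

        import numpy as np

        nz, nt = 400, 900
        ez = np.zeros(nz)       # electric field samples
        hy = np.zeros(nz - 1)   # magnetic field, staggered half a cell
        S = 0.5                 # Courant number (normalized units)

        for n in range(nt):
            hy += S * np.diff(ez)            # H update from the curl of E
            ez[1:-1] += S * np.diff(hy)      # E update from the curl of H
            ez[nz // 2] += np.exp(-((n - 60) / 20.0) ** 2)  # soft Gaussian source
            # Grid ends act as perfect electric conductors (reflecting walls).

        print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")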

  13. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
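
    The likelihood ratio statistic described above is built on top of pairwise comparisons of profilometry traces. As a stripped-down illustration of that comparison step only (not the paper's statistic), the sketch below scores two depth profiles by their best normalized correlation over a range of relative shifts.

        import numpy as np

        def max_shifted_correlation(profile_a, profile_b, max_lag=200):
            """Best normalized correlation between two depth profiles over
            relative shifts up to max_lag samples; a toy similarity score."""
            a = (profile_a - profile_a.mean()) / profile_a.std()
            b = (profile_b - profile_b.mean()) / profile_b.std()
            best = -1.0
            for lag in range(-max_lag, max_lag + 1):
                x = a[lag:] if lag >= 0 else a[:lag]
                y = b[:len(b) - lag] if lag >= 0 else b[-lag:]
                m = min(len(x), len(y))
                best = max(best, float(np.mean(x[:m] * y[:m])))
            return best

        rng = np.random.default_rng(1)
        mark = np.cumsum(rng.normal(size=2000))           # synthetic striation profile
        replica = np.roll(mark, 37) + 0.1 * rng.normal(size=2000)
        print(max_shifted_correlation(mark, replica))     # near 1 for a true match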

  14. Extensible Data Set Architecture for Systems Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The process of aircraft design requires the integration of data from individual analysis of aerodynamic, structural, thermal, and behavioral properties of a flight...

  15. Genetic analysis of presbycusis by arrayed primer extension.

    Science.gov (United States)

    Rodriguez-Paris, Juan; Ballay, Charles; Inserra, Michelle; Stidham, Katrina; Colen, Tahl; Roberson, Joseph; Gardner, Phyllis; Schrijver, Iris

    2008-01-01

    Using the Hereditary Hearing Loss arrayed primer extension (APEX) array, which contains 198 mutations across 8 hearing loss-associated genes (GJB2, GJB6, GJB3, GJA1, SLC26A4, SLC26A5, 12S-rRNA, and tRNA Ser), we compared the frequency of sequence variants in 94 individuals with early presbycusis to 50 unaffected controls and aimed to identify possible genetic contributors. This cross-sectional study was performed at Stanford University with presbycusis samples from the California Ear Institute. The patients were between ages 20 and 65 yr, with adult-onset sensorineural hearing loss of unknown etiology, and carried a clinical diagnosis of early presbycusis. Exclusion criteria comprised known causes of hearing loss such as significant noise exposure, trauma, ototoxic medication, neoplasm, and congenital infection or syndrome, as well as congenital or pediatric onset. Sequence changes were identified in 11.7% and 10% of presbycusis and control alleles, respectively. Among the presbycusis group, these solely occurred within the GJB2 and SLC26A4 genes. Homozygous and compound heterozygous pathogenic mutations were exclusively seen in affected individuals. We were unable to detect a statistically significant difference between our control and affected populations regarding the frequency of sequence variants detected with the APEX array. Individuals who carry two mild mutations in the GJB2 gene possibly have an increased risk of developing early presbycusis.

  16. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the automated image analysis tools available to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements.

  17. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.

  18. KIT multi-physics tools for the analysis of design and beyond design basis accidents of light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Victor Hugo; Miassoedov, Alexei; Steinbrueck, M.; Tromm, W. [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany)

    2016-05-15

    This paper describes the KIT numerical simulation tools under extension and validation for the analysis of design basis and beyond design basis accidents (DBA) of Light Water Reactors (LWR). The description of the complex thermal-hydraulic, neutron kinetic, and chemo-physical phenomena occurring during off-normal conditions requires multi-physics and multi-scale simulation tools, whose development is fostered by the rapid increase in computer power. The KIT numerical tools for DBA and beyond-DBA are validated using experimental data from KIT and from abroad. The developments, extensions, coupling approaches, and validation work performed at KIT are briefly outlined and discussed in this paper.

  19. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order to ensure the finiteness of the protocol state-spaces while still being able to verify interesting protocol properties. The translations for different kinds of communication media have been implemented and successfully tested, among others, on agreement protocols from WS-Business Activity.
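
    The intermediate representation is not given in the abstract, but the first step it describes, turning a state/transition table into an executable machine, can be illustrated generically. In the Python sketch below the table contents are invented (loosely echoing WS-Business Activity state names); a model checker like UPPAAL would then explore the product of several such machines plus a channel abstraction.

        # Toy state/transition table for one party of a protocol:
        # (state, message) -> next state. Entries are invented.
        TABLE = {
            ("ACTIVE", "complete"):    "COMPLETING",
            ("COMPLETING", "closed"):  "ENDED",
            ("ACTIVE", "cancel"):      "CANCELING",
            ("CANCELING", "canceled"): "ENDED",
        }

        def replay(trace, start="ACTIVE"):
            """Replay a message trace against the table; None means the
            protocol does not allow that message in the current state."""
            state = start
            for message in trace:
                state = TABLE.get((state, message))
                if state is None:
                    return None
            return state

        print(replay(["complete", "closed"]))   # -> ENDED
        print(replay(["complete", "cancel"]))   # -> None (disallowed message)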

  20. Comparative analysis of superintegrons: engineering extensive genetic diversity in the Vibrionaceae.

    Science.gov (United States)

    Rowe-Magnus, Dean A; Guerout, Anne-Marie; Biskri, Latefa; Bouige, Philippe; Mazel, Didier

    2003-03-01

    Integrons are natural tools for bacterial evolution and innovation. Their involvement in the capture and dissemination of antibiotic-resistance genes among Gram-negative bacteria is well documented. Recently, massive ancestral versions, the superintegrons (SIs), were discovered in the genomes of diverse proteobacterial species. SI gene cassettes with an identifiable activity encode proteins related to simple adaptive functions, including resistance, virulence, and metabolic activities, and their recruitment was interpreted as providing the host with an adaptive advantage. Here, we present extensive comparative analysis of SIs identified among the Vibrionaceae. Each was at least 100 kb in size, reaffirming the participation of SIs in the genome plasticity and heterogeneity of these species. Phylogenetic and localization data supported the sedentary nature of the functional integron platform and its coevolution with the host genome. Conversely, comparative analysis of the SI cassettes was indicative of both a wide range of origin for the entrapped genes and of an active cassette assembly process in these bacterial species. The signature attC sites of each species displayed conserved structural characteristics indicating that symmetry rather than sequence was important in the recognition of such a varied collection of target recombination sequences by a single site-specific recombinase. Our discovery of various addiction module cassettes within each of the different SIs indicates a possible role for them in the overall stability of large integron cassette arrays.

  1. Toward an Extension of Decision Analysis to Competitive Situations.

    Science.gov (United States)

    1985-12-01

    Analysis: Models and Resolutions. New York: Elsevier Science Publishing Co. 23. Goffman, Erving [1969]. Strategic Interaction. Philadelphia: University...have attempted to devise a method for making this determination; we will discuss some of them in this section. Goffman [1969], in a work on strategic...knowledge, and fallibility (resolve, information state, and resources [Goffman, 1969]). Porter [1980] calls this category "capabilities" and uses it

  2. The Application and Extension of Backward Software Analysis

    CERN Document Server

    Perisic, Aleksandar

    2010-01-01

    Backward software analysis is a method that emanates from executing a program backwards: instead of taking input data and following the execution path, we start from output data and, by executing the program backwards command by command, analyze the data that could lead to the current output. The changed perspective forces a developer to think about the program in a new way. It can be applied as a thorough procedure or as a casual method. This method offers many advantages in testing and in algorithm and system analysis. For example, in testing the advantage is obvious if the set of output data is smaller than the set of possible inputs. For some programs or algorithms, we know the output data more precisely, so this backward analysis can help in reducing the number of test cases or even in strict verification of an algorithm. The difficulty lies in the fact that we need types of data that no programming language currently supports, so we need additional effort to understand how this method works, or what effort we need to ...
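
    As a toy illustration of the idea (independent of the author's formalism), the sketch below represents a program as a list of invertible commands and runs it backwards from a desired output, recovering the input that produces it.

        # Each command is a pair (apply, invert); all commands here are invented
        # examples with exact inverses, the simplest case for backward analysis.
        program = [
            (lambda x: x + 3, lambda y: y - 3),
            (lambda x: x * 2, lambda y: y / 2),
            (lambda x: x - 7, lambda y: y + 7),
        ]

        def run_forward(x):
            for apply_fn, _ in program:
                x = apply_fn(x)
            return x

        def run_backward(y):
            """Execute the program backwards, command by command."""
            for _, invert_fn in reversed(program):
                y = invert_fn(y)
            return y

        x = run_backward(15.0)      # which input produces output 15?
        print(x, run_forward(x))    # -> 8.0 15.0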

  3. Preliminary analysis of knee stress in Full Extension Landing

    Directory of Open Access Journals (Sweden)

    Majid Davoodi Makinejad

    2013-09-01

    Full Text Available OBJECTIVE: This study provides an experimental and finite element analysis of knee-joint structure during extended-knee landing based on the extracted impact force, and it numerically identifies the contact pressure, stress distribution and possibility of bone-to-bone contact when a subject lands from a safe height. METHODS: The impact time and loads were measured via inverse dynamic analysis of free landing without knee flexion from three different heights (25, 50 and 75 cm), using five subjects with an average body mass index of 18.8. Three-dimensional data were developed from computed tomography scans and were reprocessed with modeling software before being imported and analyzed by finite element analysis software. The whole leg was considered to be a fixed middle-hinged structure, while impact loads were applied to the femur in an upward direction. RESULTS: Straight landing exerted an enormous amount of pressure on the knee joint as a result of the body's inability to utilize the lower extremity muscles, thereby maximizing the threat of injury when the load exceeds the height-safety threshold. CONCLUSIONS: The researchers conclude that extended-knee landing results in serious deformation of the meniscus and cartilage and increases the risk of bone-to-bone contact and serious knee injury when the load exceeds the threshold safety height. This risk is considerably greater than the risk of injury associated with walking downhill or flexion landing activities.

  4. Extension of migration velocity analysis to transmitted wavefields

    Science.gov (United States)

    Lameloise, Charles-Antoine; Chauris, Hervé

    2016-10-01

    Migration velocity analysis aims at automatically updating the large-scale components of the velocity model, called macromodel. Extended Common Image Gathers are panels used to evaluate focusing after imaging and are constructed as a function of a spatial shift introduced in the imaging condition. We investigate how transmitted waves can also be used in migration velocity analysis: instead of back-propagating the residuals associated with reflected waves, we propose to back-propagate the full wavefield. The image function, equivalent to the migrated section for reflected data, does not exhibit localized events in space along horizons but is still sensitive to the choice of the background velocity model and can thus be coupled to the same objective function defined in the image domain. In order to enhance the benefits of direct waves, we consider a cross-well configuration. Direct waves provide a large illumination between two vertical wells. Associated Common Image Gathers present different characteristics than the ones associated with reflected waves in surface acquisition. In particular, energy is spread over up to the maximum penetration depth. We invert cross-well seismic data along two lines. In the first case, the input data contain the full wavefield dominated by transmitted waves. It demonstrates the possibility to handle transmitted waves to determine the velocity model. It appears that the misfit in the data domain is largely reduced after inversion. In the second case, we use the same algorithm, but with reflected observed data only, as in a classical approach. Most of velocity updates are localized around the reflectivity, leading to an incorrect final model. This demonstrates the benefit of transmitted waves for migration velocity analysis in a cross-well configuration.

  5. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  6. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or not existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. An example of a prototype for a digital conceptual design tool with integrated real time structural analysis is presented and compared with a more common Building Information Modelling (BIM) approach. It is concluded that a digital conceptual design tool with embedded real time structural analysis could...

  7. A Lexical Analysis Tool with Ambiguity Support

    CERN Document Server

    Quesada, Luis; Cortijo, Francisco J

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
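
    The lexical analysis graph described above compactly encodes every viable tokenization. Independently of Lamb's implementation, the idea can be illustrated by enumerating all token sequences for an input string given an ambiguous token set (definitions invented):

        # Enumerate every tokenization of an input string; a lexical analysis
        # graph is a compact encoding of exactly this set of alternatives.
        TOKENS = {"for": "KEYWORD", "fo": "IDENT", "r": "IDENT",
                  "in": "KEYWORD", "i": "IDENT", "n": "IDENT"}

        def tokenizations(s):
            if not s:
                yield []
                return
            for end in range(1, len(s) + 1):
                lexeme = s[:end]
                if lexeme in TOKENS:
                    for rest in tokenizations(s[end:]):
                        yield [(TOKENS[lexeme], lexeme)] + rest

        for sequence in tokenizations("forin"):
            print(sequence)
        # A parser consuming the graph keeps only the sequences that form
        # valid syntactic sentences.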

  8. Fully Parallel MHD Stability Analysis Tool

    Science.gov (United States)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and it is widely used by fusion community. Parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both fluid and kinetic plasma models, already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Results of MARS parallelization and of the development of a new fix boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  9. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    Science.gov (United States)

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature, and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm water best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design.
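
    Using the catchment-efficiency range reported above, a back-of-envelope runoff comparison is straightforward (the rainfall depth and roof area below are invented for illustration):

        def runoff_liters(rain_mm, area_m2, catchment_efficiency):
            """Runoff volume: rainfall depth x area x catchment efficiency.
            1 mm of rain over 1 m^2 is 1 liter."""
            return rain_mm * area_m2 * catchment_efficiency

        rain, area = 15.0, 500.0                      # mm of rain, m^2 of roof
        green = runoff_liters(rain, area, 0.48)       # mid-range green-roof value
        concrete = runoff_liters(rain, area, 0.90)    # concrete roof, per the study
        print(f"green: {green:.0f} L, concrete: {concrete:.0f} L, "
              f"additional retention: {concrete - green:.0f} L")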

  10. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  11. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Full Text Available Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults’ collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We will describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  12. Healthcare BI: a tool for meaningful analysis.

    Science.gov (United States)

    Rohloff, Rose

    2011-05-01

    Implementing an effective business intelligence (BI) system requires organizationwide preparation and education to allow for meaningful analysis of information. Hospital executives should take steps to ensure that: Staff entering data are proficient in how the data are to be used for decision making, and integration is based on clean data from primary sources of entry. Managers have the business acumen required for effective data analysis. Decision makers understand how multidimensional BI offers new ways of analysis that represent significant improvements over historical approaches using static reporting.

  13. Does tool use extend peripersonal space? A review and re-analysis.

    Science.gov (United States)

    Holmes, Nicholas P

    2012-04-01

    The fascinating idea that tools become extensions of our body appears in artistic, literary, philosophical, and scientific works alike. In the last 15 years, this idea has been reframed into several related hypotheses, one of which states that tool use extends the neural representation of the multisensory space immediately surrounding the hands (variously termed peripersonal space, peri-hand space, peri-cutaneous space, action space, or near space). This and related hypotheses have been tested extensively in the cognitive neurosciences, with evidence from molecular, neurophysiological, neuroimaging, neuropsychological, and behavioural fields. Here, I briefly review the evidence for and against the hypothesis that tool use extends a neural representation of the space surrounding the hand, concentrating on neurophysiological, neuropsychological, and behavioural evidence. I then provide a re-analysis of data from six published and one unpublished experiments using the crossmodal congruency task to test this hypothesis. While the re-analysis broadly confirms the previously reported finding that tool use does not literally extend peripersonal space, the overall effect sizes are small and statistical power is low. I conclude by questioning whether the crossmodal congruency task can indeed be used to test the hypothesis that tool use modifies peripersonal space.
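
    To ground the closing point about power: a minimal sketch of the relevant power arithmetic with statsmodels, where the effect size and sample size are illustrative assumptions rather than figures from the paper.

    ```python
    from statsmodels.stats.power import TTestPower

    analysis = TTestPower()

    # Achieved power of a one-sample/paired t-test for a small effect (d = 0.2)
    # at a sample size typical of small crossmodal congruency experiments.
    power = analysis.power(effect_size=0.2, nobs=20, alpha=0.05)
    print(f"achieved power = {power:.2f}")       # far below the usual 0.8 target

    # Sample size required to detect the same effect with 80% power.
    n_needed = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05)
    print(f"n needed for 0.8 power = {n_needed:.0f}")
    ```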

  14. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  15. Creating an anthropomorphic digital MR phantom—an extensible tool for comparing and evaluating quantitative imaging algorithms

    Science.gov (United States)

    Bosca, Ryan J.; Jackson, Edward F.

    2016-01-01

    Assessing and mitigating the various sources of bias and variance associated with image quantification algorithms is essential to the use of such algorithms in clinical research and practice. Assessment is usually accomplished with grid-based digital reference objects (DRO) or, more recently, digital anthropomorphic phantoms based on normal human anatomy. Publicly available digital anthropomorphic phantoms can provide a basis for generating realistic model-based DROs that incorporate the heterogeneity commonly found in pathology. Using a publicly available vascular input function (VIF) and digital anthropomorphic phantom of a normal human brain, a methodology was developed to generate a DRO based on the general kinetic model (GKM) that represented realistic and heterogeneously enhancing pathology. GKM parameters were estimated from a deidentified clinical dynamic contrast-enhanced (DCE) MRI exam. This clinical imaging volume was co-registered with a discrete tissue model, and model parameters estimated from clinical images were used to synthesize a DCE-MRI exam that consisted of normal brain tissues and a heterogeneously enhancing brain tumor. An example application of spatial smoothing was used to illustrate potential applications in assessing quantitative imaging algorithms. A voxel-wise Bland-Altman analysis demonstrated negligible differences between the parameters estimated with and without spatial smoothing (using a small radius Gaussian kernel). In this work, we reported an extensible methodology for generating model-based anthropomorphic DROs containing normal and pathological tissue that can be used to assess quantitative imaging algorithms.
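
    To make the GKM step concrete, here is a minimal sketch of the standard (Tofts) kinetic model evaluated by discrete convolution. The time grid, the biexponential VIF and the voxel parameters below are illustrative assumptions, not the published values.

    ```python
    import numpy as np

    # Standard (Tofts) general kinetic model:
    #   Ct(t) = Ktrans * integral_0^t Cp(tau) * exp(-kep * (t - tau)) dtau
    t = np.arange(0, 300, 2.0)                   # seconds; 2 s sampling (assumed)

    # Illustrative biexponential population-like VIF (not the published one).
    Cp = 5.0 * (np.exp(-0.01 * t) - np.exp(-0.1 * t))
    Cp[t < 10] = 0.0                             # bolus arrival at t = 10 s

    def gkm_tissue_curve(Cp, t, Ktrans, kep):
        """Tissue concentration via convolution of the VIF with an exponential kernel."""
        dt = t[1] - t[0]
        kernel = np.exp(-kep * t)
        return Ktrans * np.convolve(Cp, kernel)[: len(t)] * dt

    # Hypothetical per-second rate constants for "tumor" vs "normal" voxels.
    Ct_tumor = gkm_tissue_curve(Cp, t, Ktrans=0.25 / 60, kep=0.9 / 60)
    Ct_normal = gkm_tissue_curve(Cp, t, Ktrans=0.02 / 60, kep=0.3 / 60)
    print(Ct_tumor.max(), Ct_normal.max())
    ```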

  17. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching and parame...

  18. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the understanding needed to address these challenges, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of...

  19. Bayesian data analysis tools for atomic physics

    CERN Document Server

    Trassinelli, Martino

    2016-01-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We show also how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested fit to calculate the different probability distrib...

  20. Nucleic acid tool enzymes-aided signal amplification strategy for biochemical analysis: status and challenges.

    Science.gov (United States)

    Qing, Taiping; He, Dinggeng; He, Xiaoxiao; Wang, Kemin; Xu, Fengzhou; Wen, Li; Shangguan, Jingfang; Mao, Zhengui; Lei, Yanli

    2016-04-01

    Owing to their highly efficient catalytic effects and substrate specificity, nucleic acid tool enzymes are applied as 'nano-tools' for manipulating different nucleic acid substrates both in the test tube and in living organisms. In addition to functioning as molecular scissors and molecular glue in genetic engineering, the application of nucleic acid tool enzymes in biochemical analysis has also been extensively developed in the past few decades. Used as amplifying labels for biorecognition events, nucleic acid tool enzymes are mainly applied in nucleic acid amplification sensing, as well as in the amplification sensing of biorelated variations of nucleic acids. With the introduction of aptamers, which can bind different target molecules, nucleic acid tool enzymes-aided signal amplification strategies can also be used to sense non-nucleic-acid targets (e.g., ions, small molecules, proteins, and cells). This review describes and discusses the amplification strategies of nucleic acid tool enzymes-aided biosensors for biochemical analysis applications. Various analytes, including nucleic acids, ions, small molecules, proteins, and cells, are reviewed briefly. This work also addresses the future trends and outlooks for signal amplification in nucleic acid tool enzymes-aided biosensors.

  1. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  2. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
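
    As a flavor of the discounted-cash-flow arithmetic that underlies any such financial analysis, here is a minimal sketch with invented station numbers; H2FAST's actual computations are considerably more detailed.

    ```python
    def npv(rate: float, cash_flows: list[float]) -> float:
        """Net present value of year-indexed cash flows (year 0 = upfront cost)."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    # Hypothetical station: $2.0M upfront, $250k/yr net revenue for 15 years.
    flows = [-2_000_000.0] + [250_000.0] * 15
    print(f"NPV at 8% discount rate: ${npv(0.08, flows):,.0f}")

    # Simple (undiscounted) payback period.
    cum, payback = 0.0, None
    for year, cf in enumerate(flows):
        cum += cf
        if payback is None and cum >= 0:
            payback = year
    print(f"Payback year: {payback}")
    ```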

  3. Multivariate and 2D Extensions of Singular Spectrum Analysis with the Rssa Package

    Directory of Open Access Journals (Sweden)

    Nina Golyandina

    2015-10-01

    Implementation of multivariate and 2D extensions of singular spectrum analysis (SSA) by means of the R package Rssa is considered. The extensions include MSSA for simultaneous analysis and forecasting of several time series and 2D-SSA for analysis of digital images. A new extension of 2D-SSA analysis called shaped 2D-SSA is introduced for analysis of images of arbitrary shape, not necessarily rectangular. It is shown that implementation of shaped 2D-SSA can serve as a basis for implementation of MSSA and other generalizations. Efficient implementation of operations with Hankel and Hankel-block-Hankel matrices through the fast Fourier transform is suggested. Examples with code fragments in R, which explain the methodology and demonstrate the proper use of Rssa, are presented.
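
    For readers unfamiliar with the method, a minimal Python sketch of basic 1D SSA (embedding, SVD, diagonal averaging); real work should use Rssa itself, and the FFT-based Hankel operations mentioned in the abstract are omitted for clarity.

    ```python
    import numpy as np

    def ssa_decompose(x, L):
        """Basic SSA: embed x into an L x K Hankel trajectory matrix, take the
        SVD, and map each rank-1 term back to a series by diagonal averaging."""
        N = len(x)
        K = N - L + 1
        X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        components = []
        for i in range(len(s)):
            Xi = s[i] * np.outer(U[:, i], Vt[i])              # rank-1 piece
            # Anti-diagonals of Xi correspond to fixed time indices of the series.
            comp = np.array([np.mean(np.diag(Xi[:, ::-1], k))
                             for k in range(K - 1, -L, -1)])
            components.append(comp)
        return components

    # Toy series: trend + seasonality + noise.
    rng = np.random.default_rng(0)
    n = 200
    x = 0.02 * np.arange(n) + np.sin(2 * np.pi * np.arange(n) / 12) \
        + 0.1 * rng.standard_normal(n)
    comps = ssa_decompose(x, L=60)
    print(len(comps), comps[0].shape)   # leading components capture trend/cycle
    ```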

  4. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  5. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, Judo competition is an intriguing complex nonlinear system, with many chaotic and fractal aspects; it is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed beforehand, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team) like Judo, usefully supporting the difficult task of the coach, or better, of national or Olympic coaching teams. This paper presents a short summary of the most important methodological achievements in judo match analysis. It also presents, in light of the latest technological improvements, the first systematization toward new fiel...

  6. Non-extensive analysis of seismicity: application to some seismic sequences of Morocco

    CERN Document Server

    Telesca, Luciano; Rouai, Mohamed

    2011-01-01

    The magnitude distributions of three seismic sequences that occurred in Morocco were investigated by means of Tsallis-based non-extensive analysis. The non-extensive parameters were estimated by means of the Levenberg-Marquardt nonlinear least-squares fitting method. It was found that the q value could be a good indicator of the complexity of seismic phenomena. Such findings could contribute to a better understanding of the dynamics of seismicity and suggest a unifying view of earthquake occurrence.
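
    A minimal sketch of this kind of fit: a Tsallis q-exponential survival function fitted to a synthetic magnitude catalog with SciPy's Levenberg-Marquardt least squares. The catalog and the simplified functional form are assumptions; the paper's exact formulation is not reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit   # default unbounded method is 'lm'

    def q_exp_survival(m, q, m0):
        """Tsallis q-exponential survival function (q != 1):
        P(>m) = [1 - (1 - q) * m / m0] ** (1 / (1 - q))."""
        base = np.clip(1.0 - (1.0 - q) * m / m0, 1e-12, None)
        return base ** (1.0 / (1.0 - q))

    # Synthetic catalog: exponential (Gutenberg-Richter-like) magnitudes.
    rng = np.random.default_rng(1)
    mags = np.sort(rng.exponential(scale=0.5, size=2000) + 2.0)
    p_emp = 1.0 - np.arange(len(mags)) / len(mags)   # empirical P(>m)

    popt, _ = curve_fit(q_exp_survival, mags - mags[0], p_emp, p0=(1.3, 1.0))
    print(f"estimated q = {popt[0]:.3f}, m0 = {popt[1]:.3f}")
    ```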

  7. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: rittner@iqm.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Química. Lab. de Físico-Química Orgânica]; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Química]

    2011-07-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function, like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature on this research field imposed some limitations, which will be detailed in the Introduction; it can be noted in advance that these limitations mostly concern the period when these results were published. (author)

  8. Serial concept maps: tools for concept analysis.

    Science.gov (United States)

    All, Anita C; Huycke, LaRae I

    2007-05-01

    Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments.

  9. Multidimensional analysis: a management tool for monitoring HIPAA compliance and departmental performance.

    Science.gov (United States)

    Coleman, Robert M; Ralston, Matthew D; Szafran, Alexander; Beaulieu, David M

    2004-09-01

    Most RIS and PACS systems include extensive auditing capabilities as part of their security model, but inspecting those audit logs to obtain useful information can be a daunting task. Manual analysis of audit trails, though cumbersome, is often resorted to because of the difficulty of constructing queries to extract complex information from the audit logs. The approach proposed by the authors uses standard off-the-shelf multidimensional analysis software tools to assist the PACS/RIS administrator and/or security officer in analyzing those audit logs to identify and scrutinize suspicious events. Large amounts of data can be quickly reviewed, and graphical analysis tools help explore system utilization. While additional efforts are required to fully satisfy the demands of the ever-increasing security and confidentiality pressures, multidimensional analysis tools are a practical step toward actually using the information that is already being captured in the systems' audit logs. In addition, once the work is performed to capture and manipulate the audit logs into a viable format for the multidimensional analysis tool, it is relatively easy to extend the system to incorporate other pertinent data, thereby enabling the ongoing analysis of other aspects of the department's workflow.
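
    The "multidimensional" idea is essentially OLAP-style slicing of the audit trail. A minimal sketch with pandas, where the column names and the suspicious-event rule are hypothetical:

    ```python
    import pandas as pd

    # Hypothetical flattened audit log exported from a RIS/PACS system.
    log = pd.DataFrame({
        "user":    ["amy", "amy", "bob", "bob", "bob", "cho"],
        "action":  ["view", "view", "view", "export", "view", "view"],
        "patient": ["P1", "P2", "P3", "P3", "P3", "P1"],
        "hour":    [9, 10, 2, 2, 3, 11],
    })

    # One face of the "cube": event counts per user per hour of day.
    cube = log.pivot_table(index="user", columns="hour", values="action",
                           aggfunc="count", fill_value=0)
    print(cube)

    # Simple drill-down rule (illustrative): exports outside business hours.
    suspicious = log[(log["action"] == "export") & (~log["hour"].between(8, 18))]
    print(suspicious)
    ```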

  10. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  11. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to the mathematical modeling of physical engineering systems. In this paper connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  12. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high speed milling of titanium alloys the high rate of tool failure is the main reason for its high manufacturing cost. In this study, fractured tools which were used in a titanium alloy 5-axis milling process have been observed both at the macro scale, using a PG-1000 light microscope, and at the micro scale, using a Scanning Electron Microscope (SEM). These observations indicate that most of these tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed. Future analysis aimed at determining the underlying causes of the fatigue phenomenon is then outlined.

  13. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  14. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  15. Single-cell analysis tools for drug discovery and development.

    Science.gov (United States)

    Heath, James R; Ribas, Antoni; Mischel, Paul S

    2016-03-01

    The genetic, functional or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development. Such heterogeneity hinders the design of accurate disease models and can confound the interpretation of biomarker levels and of patient responses to specific therapies. The complex nature of virtually all tissues has motivated the development of tools for single-cell genomic, transcriptomic and multiplex proteomic analyses. Here, we review these tools and assess their advantages and limitations. Emerging applications of single cell analysis tools in drug discovery and development, particularly in the field of oncology, are discussed.

  16. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  17. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
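
    A toy sketch of the pass/fail logic such a tool might apply to a single functional-test step; the tolerance, model and readings are assumptions, not the actual tool's rules.

    ```python
    def check_supply_air_temp(measured_c: float, setpoint_c: float,
                              tolerance_c: float = 1.0):
        """One functional-test check: supply air temperature should track its
        setpoint within a tolerance while the unit is in steady operation."""
        error = measured_c - setpoint_c
        return abs(error) <= tolerance_c, error

    samples = [(13.8, 13.0), (13.4, 13.0), (15.2, 13.0)]   # hypothetical readings
    for sat, sp in samples:
        ok, err = check_supply_air_temp(sat, sp)
        print(f"SAT={sat:.1f}C setpoint={sp:.1f}C -> "
              f"{'PASS' if ok else 'FAIL'} (error {err:+.1f}C)")
    ```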

  18. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
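
    A minimal sketch of the submit/poll/fetch pattern these services use, here for NCBI BLAST. The endpoint paths and parameter names follow the publicly documented pattern but should be verified against the current documentation before use.

    ```python
    import time
    import requests

    # EMBL-EBI job dispatcher REST pattern (assumed; check the docs at
    # http://www.ebi.ac.uk/Tools/webservices/ for current details).
    BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

    def blast_protein(sequence: str, email: str) -> str:
        """Submit a blastp job, poll until done, return the text output."""
        run = requests.post(f"{BASE}/run", data={
            "email": email, "program": "blastp", "stype": "protein",
            "database": "uniprotkb_swissprot", "sequence": sequence,
        })
        run.raise_for_status()
        job_id = run.text.strip()

        while True:                    # poll the job status every few seconds
            status = requests.get(f"{BASE}/status/{job_id}").text.strip()
            if status in ("FINISHED", "ERROR", "FAILURE", "NOT_FOUND"):
                break
            time.sleep(5)

        return requests.get(f"{BASE}/result/{job_id}/out").text

    # Example (requires network access and a valid e-mail address):
    # print(blast_protein("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "user@example.org"))
    ```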

  19. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association, used to pursue simulation data in the process of software validation, which is an important part of the development of the physics analysis tools.

  20. [SIGAPS, a tool for the analysis of scientific publications].

    Science.gov (United States)

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. It is based on the analysis of articles indexed in Medline, with a score calculated by taking into account the position of the author and the ranking of the journal within the disciplinary field. It also offers tools for the bibliometric analysis of scientific production.

  1. Using the MoodleReader as an Extensive Reading Tool and Its Effect on Iranian EFL Students' Incidental Vocabulary Learning

    Science.gov (United States)

    Alavi, Sepideh; Keyvanshekouh, Afsaneh

    2012-01-01

    The present study focused on using the MoodleReader to promote extensive reading (ER) in an Iranian EFL context, emphasizing its effect on students' incidental vocabulary acquisition. Thirty-eight Shiraz University sophomores were assigned to experimental and control groups. The experimental group used the MoodleReader for their ER program, while…

  2. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 12: PEP data item descriptions

    Science.gov (United States)

    1979-01-01

    Contractor information requirements necessary to support the power extension package project of the space shuttle program are specified for the following categories of data: project management; configuration management; systems engineering and test; manufacturing; reliability, quality assurance and safety; logistics; training; and operations.

  3. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 4: PEP functional specification

    Science.gov (United States)

    1979-01-01

    The functional, performance, design, and test requirements for the Orbiter power extension package and its associated ground support equipment are defined. Both government and nongovernment standards and specifications are cited for the following subsystems: electrical power, structural/mechanical, avionics, and thermal control. Quality control assurance provisions and preparation for delivery are also discussed.

  4. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 5: PEP environmental specification

    Science.gov (United States)

    1979-01-01

    This specification establishes the natural and induced environments to which the power extension package may be exposed during ground operations and space operations with the shuttle system. Space induced environments are applicable at the Orbiter attach point interface location. All probable environments are systematically listed according to each ground and mission phase.

  5. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of AUC, the density of events in probability bins and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
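
    Outside a GIS setting the same statistics are easy to reproduce; a minimal sketch computing the AUC with a percentile-bootstrap confidence interval (the labels and scores are simulated, and this uses scikit-learn rather than the Dinamica EGO tools):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(42)
    n = 500
    y = rng.integers(0, 2, n)                  # simulated event / non-event labels
    score = y * 0.8 + rng.normal(0, 0.6, n)    # simulated model suitability scores

    auc = roc_auc_score(y, score)

    # Percentile bootstrap for a 95% confidence interval on the AUC.
    boot = []
    for _ in range(2000):
        idx = rng.integers(0, n, n)
        if len(np.unique(y[idx])) < 2:         # resample must contain both classes
            continue
        boot.append(roc_auc_score(y[idx], score[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"AUC = {auc:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    ```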

  6. Analysis of Characteristics Extension Workers to Utilization of Information and Communication Technology

    Directory of Open Access Journals (Sweden)

    Veronice Veronice

    2015-08-01

    Science and technology are developing rapidly with the demands of changing times. The development of information and communication technology, especially since the advent of internet technology, has led to major changes in society. Information technology products that are relatively cheap and affordable facilitate access to information beyond national borders and cultural boundaries. This condition has penetrated all levels of human life, including farmers in the villages. Therefore, the extension worker takes on an important role as a facilitator in developing the potential of farmers, and consequently extension workers are required to adjust to the changes and demands of the growing community. The objective of the research is to analyze the characteristics of extension workers in relation to their utilization of information and communication technology in Limapuluh Kota regency, West Sumatera. This study is a descriptive-correlational survey-based study, with a sample consisting of government-employed as well as freelance extension workers in 8 Agencies of Agriculture, Fisheries and Forestry Extension (BP3K) in Limapuluh Kota regency, West Sumatera province. Based on the results of a difference test (t-test), there are significant differences between the characteristics of the civil-servant and freelance (THL-TBPP) extension workers, especially in age and length of employment.

  7. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    Science.gov (United States)

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
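
    For the basic linear-interaction model the J-N boundaries have a closed form; a minimal numpy sketch under the usual OLS assumptions (data simulated, variable names hypothetical):

    ```python
    import numpy as np
    from scipy import stats

    # Simulated moderation model: y = b0 + b1*x + b2*z + b3*x*z + noise.
    rng = np.random.default_rng(7)
    n = 300
    x, z = rng.normal(size=n), rng.normal(size=n)
    y = 0.2 * x + 0.3 * z + 0.4 * x * z + rng.normal(size=n)

    X = np.column_stack([np.ones(n), x, z, x * z])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    df = n - X.shape[1]
    V = (res[0] / df) * np.linalg.inv(X.T @ X)   # covariance of the estimates

    # Simple slope of x at moderator value z0: w(z0) = b1 + b3*z0.
    # J-N boundaries solve w(z0)^2 = t_crit^2 * Var(w(z0)), a quadratic in z0.
    t2 = stats.t.ppf(0.975, df) ** 2
    a = beta[3] ** 2 - t2 * V[3, 3]
    b = 2 * (beta[1] * beta[3] - t2 * V[1, 3])
    c = beta[1] ** 2 - t2 * V[1, 1]
    print("J-N boundaries for z:", np.roots([a, b, c]))
    ```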

  8. GEPAS, a web-based tool for microarray data analysis and interpretation

    Science.gov (United States)

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has continuously been updated to keep pace with the state of the art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well-established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the experiment. New options for time-course (or dose-response) experiments, microarray-based class prediction, new clustering methods and new tests for differential expression have been included. The new pipeliner module allows automating the execution of sequential analysis steps by means of a simple but powerful graphic interface. An extensive re-engineering of GEPAS has been carried out which includes the use of web services and Web 2.0 technology features, a new user interface with persistent sessions and a new extended database of gene identifiers. GEPAS is nowadays the most quoted web tool in its field; it is extensively used by researchers in many countries, and its records indicate an average usage rate of 500 experiments per day. GEPAS is available at http://www.gepas.org. PMID:18508806

  9. SigMate: a Matlab-based automated tool for extracellular neuronal signal processing and analysis.

    Science.gov (United States)

    Mahmud, Mufti; Bertoldo, Alessandra; Girardi, Stefano; Maschietto, Marta; Vassanelli, Stefano

    2012-05-30

    Rapid advances in neuronal probe technology for multisite recording of brain activity have posed a significant challenge to neuroscientists for processing and analyzing the recorded signals. To be able to infer meaningful conclusions quickly and accurately from large datasets, automated and sophisticated signal processing and analysis tools are required. This paper presents a novel Matlab-based tool, "SigMate", incorporating standard methods to analyze spikes and EEG signals, and in-house solutions for local field potential (LFP) analysis. Available modules at present are: (1) in-house developed algorithms for data display (2D and 3D), file operations (file splitting, file concatenation, and file column rearranging), baseline correction, slow stimulus artifact removal, noise characterization and signal quality assessment, current source density (CSD) analysis, latency estimation from LFPs and CSDs, determination of cortical layer activation order using LFPs and CSDs, and single LFP clustering; (2) existing modules for spike detection, sorting and spike train analysis, and EEG signal analysis. SigMate has the flexibility of analyzing multichannel signals as well as signals from multiple recording sources. The in-house developed tools for LFP analysis have been extensively tested with signals recorded using a standard extracellular recording electrode, and with planar and implantable multi-transistor array (MTA) based neural probes. SigMate will be disseminated shortly to the neuroscience community under the open-source GNU General Public License.
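
    Of the in-house LFP modules, CSD analysis is the most standard; a minimal sketch of the classic second-spatial-derivative estimate for a linear probe (electrode spacing, conductivity and the toy LFP are assumptions):

    ```python
    import numpy as np

    def csd_second_derivative(lfp, spacing_um=100.0, sigma=0.3):
        """Classic CSD estimate: CSD ~ -sigma * d2(phi)/dz2, approximated by a
        second difference across neighboring contacts of a linear probe.
        lfp: array of shape (n_channels, n_samples), in volts."""
        h = spacing_um * 1e-6                      # contact spacing in meters
        d2 = lfp[:-2] - 2 * lfp[1:-1] + lfp[2:]    # second difference over depth
        return -sigma * d2 / h ** 2                # sigma in S/m -> CSD in A/m^3

    # Toy LFP: a dipole-like profile across 16 channels, 200 time samples.
    depth = np.linspace(-1, 1, 16)[:, None]
    t = np.linspace(0, 0.1, 200)[None, :]
    lfp = depth * np.exp(-((t - 0.05) / 0.01) ** 2) * 1e-4
    print(csd_second_derivative(lfp).shape)        # (14, 200): edge channels lost
    ```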

  10. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
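
    The scalability argument follows from the sampling arithmetic: each verification run is an independent Monte Carlo sample, so samples parallelize trivially. A minimal sketch using a generic Hoeffding-style sample bound (not PVeStA's actual stopping rule):

    ```python
    import math
    import random

    def samples_needed(eps: float, delta: float) -> int:
        """Hoeffding bound: n >= ln(2/delta) / (2*eps^2) i.i.d. samples suffice
        to estimate a probability within +/-eps with confidence 1 - delta."""
        return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

    def estimate_probability(simulate, n: int) -> float:
        """Each sample is independent, so batches parallelize trivially."""
        return sum(simulate() for _ in range(n)) / n

    # Toy model: a message survives 3 lossy hops (loss probability 0.1 each).
    sim = lambda: all(random.random() > 0.1 for _ in range(3))
    n = samples_needed(eps=0.01, delta=0.05)
    print(n, estimate_probability(sim, n))   # true value: 0.9**3 = 0.729
    ```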

  11. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    The aim of this research is to look into integrated digital design and analysis tools in order to find out whether they are suited for use by architects and designers or only by specialists and technicians - and, if not, to look at what can be done to make them more available to architects and designers.

  12. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessment intends to cover potential candidate software and subsequently c...

  13. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of significant risk groups associated with the implementation of logistics processes, composition of integrated strategies of risk management, and composition of tools for risk analysis in logistics processes.

  14. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  15. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory]

    2012-08-02

    The Adversarial Route Analysis Tool is, in effect, Google Maps for adversaries: a web-based geospatial application, similar to Google Maps, that helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  16. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...

  17. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    Science.gov (United States)

    Clough, D.; Fletcher, S.; Longstaff, A. P.; Willoughby, P.

    2012-05-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, there is the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
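
    The link between easily measured temperatures and the unmeasurable errors is typically established by regression, which can then drive compensation; a minimal sketch on simulated data (the sensor model and coefficients are invented):

    ```python
    import numpy as np

    # Toy temperature-to-error link: spindle growth recorded against two
    # temperature sensors during a warm-up cycle.
    rng = np.random.default_rng(3)
    n = 120
    t_spindle = 20 + 15 * (1 - np.exp(-np.arange(n) / 30))   # degC warming curve
    t_ambient = 20 + rng.normal(0, 0.2, n)
    error_um = 0.8 * (t_spindle - 20) + 0.3 * (t_ambient - 20) \
        + rng.normal(0, 0.5, n)

    # Least-squares fit error = a*(Ts-20) + b*(Ta-20); usable for compensation.
    A = np.column_stack([t_spindle - 20, t_ambient - 20])
    coef, *_ = np.linalg.lstsq(A, error_um, rcond=None)
    residual_rms = np.std(error_um - A @ coef)
    print("coefficients:", coef, "residual RMS (um):", residual_rms)
    ```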

  18. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

    SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high performance computing. A performance tool, which usually collects and analyzes performance data, is an effective way of improving performance. This paper explores both the collection and the analysis of performance data, and two innovative ideas are proposed: both types of runtime performance data, concerning system load and application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; and the performance analysis should be Web-oriented, which can exploit the excellent portability and usability brought by the Internet. This paper presents a Web-oriented HPC (high performance computing) performance tool, which can collect information about both resource utilization, including the utilization ratio of CPU and memory, and program behavior during runtime, including statuses such as sending and computing, and visualize the information in the user's browser window with Java applets in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependency between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program processes and system load, find the performance bottleneck, and ultimately improve the performance of the application.

  19. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)]

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  20. On-line Tools for Solar Data Compiled at the Debrecen Observatory and Their Extensions with the Greenwich Sunspot Data

    Science.gov (United States)

    Baranyi, T.; Győri, L.; Ludmány, A.

    2016-08-01

    The primary task of the Debrecen Heliophysical Observatory (DHO) has been the most detailed, reliable, and precise documentation of the solar photospheric activity since 1958. This long-term effort resulted in various solar catalogs based on ground-based and space-borne observations. A series of sunspot databases and on-line tools were compiled at DHO: the Debrecen Photoheliographic Data (DPD, 1974 -), the dataset based on the Michelson Doppler Imager (MDI) of the Solar and Heliospheric Observatory (SOHO) called SOHO/MDI-Debrecen Data (SDD, 1996 - 2010), and the dataset based on the Helioseismic and Magnetic Imager (HMI) of the Solar Dynamics Observatory (SDO) called SDO/HMI-Debrecen Data (HMIDD, 2010 - ). User-friendly web-presentations and on-line tools were developed to visualize and search data. As a last step of the compilation, the revised version of Greenwich Photoheliographic Results (GPR, 1874 - 1976) catalog was converted to DPD format, and a homogeneous sunspot database covering more than 140 years was created. The database of images for the GPR era was completed with the full-disc drawings of the Hungarian historical observatories Ógyalla and Kalocsa (1872 - 1919) and with the polarity drawings of Mount Wilson Observatory. We describe the main characteristics of the available data and on-line tools.

  2. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    Science.gov (United States)

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  3. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  4. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
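
    As a hedged illustration of the recommendation idea described above (not the NASA tool itself), the sketch below pairs a pick-list selection of component functions with a small failure-mode library; all names and library entries are hypothetical.

```python
# Hypothetical sketch of a library-driven FMEA recommender: candidate failure
# modes, causes, and effects are looked up from a reusable library keyed by
# the functions the user selects for a component.
FMEA_LIBRARY = {
    "provide_power": [
        ("loss of output", "short circuit", "downstream loads unpowered"),
        ("degraded output", "connector corrosion", "brownout of loads"),
    ],
    "transfer_fluid": [
        ("external leak", "seal failure", "loss of working fluid"),
        ("blocked flow", "line contamination", "no delivery to end item"),
    ],
}

def recommend(selected_functions):
    """Return candidate (mode, cause, effect) rows for the selected functions."""
    rows = []
    for fn in selected_functions:
        rows.extend(FMEA_LIBRARY.get(fn, []))
    return rows

for mode, cause, effect in recommend(["provide_power"]):
    print(f"mode: {mode:18} cause: {cause:22} effect: {effect}")
```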

  5. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  6. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  7. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.
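
    CRAB's own configuration and client interface are not reproduced here; as a rough sketch of the pattern the abstract describes (splitting one local batch analysis into many independent jobs over subsets of the input data, ready for submission to any Grid backend), consider the following generic Python fragment. All file names and job-spec fields are hypothetical.

```python
# Generic sketch (not the CRAB API) of the core idea behind distributed
# analysis builders: split one dataset into independent, backend-agnostic
# job specifications.

def split_jobs(files, files_per_job):
    """Chunk the input file list into per-job slices."""
    return [files[i:i + files_per_job] for i in range(0, len(files), files_per_job)]

def make_job_spec(job_id, inputs, executable="my_analysis.py"):
    # A minimal, backend-agnostic job description.
    return {"id": job_id, "executable": executable, "inputs": inputs}

dataset = [f"/store/data/file_{n:04d}.root" for n in range(10)]
jobs = [make_job_spec(i, chunk) for i, chunk in enumerate(split_jobs(dataset, 3))]
for job in jobs:
    print(job["id"], len(job["inputs"]), "input file(s)")
```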

  8. On-line Tools for Solar Data Compiled in the Debrecen Observatory and their Extensions with the Greenwich Sunspot Data

    CERN Document Server

    Baranyi, T; Ludmán, A

    2016-01-01

The primary task of the Debrecen Heliophysical Observatory (DHO) has been the most detailed, reliable, and precise documentation of the solar photospheric activity since 1958. This long-term effort resulted in various solar catalogs based on ground-based and space-borne observations. A series of sunspot databases and on-line tools were compiled at DHO: the Debrecen Photoheliographic Data (DPD, 1974 -), the dataset based on the Michelson Doppler Imager (MDI) of the Solar and Heliospheric Observatory (SOHO) called SOHO/MDI-Debrecen Data (SDD, 1996 - 2010), and the dataset based on the Helioseismic and Magnetic Imager (HMI) of the Solar Dynamics Observatory (SDO) called SDO/HMI-Debrecen Data (HMIDD, 2010 - ). User-friendly web-presentations and on-line tools were developed to visualize and search data. As a last step of compilation, the revised version of Greenwich Photoheliographic Results (GPR, 1874 - 1976) catalog was converted to DPD format, and a homogeneous sunspot database covering ...

  9. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened up for the implementation of new features in the existing tools and for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvements in the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also covers programming techniques and calculation methods for part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author

  10. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 1: Executive summary

    Science.gov (United States)

    1979-01-01

    An array deployment assembly, power regulation and control assembly, the necessary interface, and display and control equipment comprise the power extension package (PEP) which is designed to provide increased power and duration, as well as reduce fuel cell cryogen consumption during Spacelab missions. Compatible with all currently defined missions and payloads, PEP imposes minimal weight and volume penalties on sortie missions, and can be installed and removed as needed at the launch site within the normal Orbiter turnaround cycle. The technology on which it is based consists of a modified solar electric propulsion array, standard design regulator and control equipment, and a minimally modified Orbiter design. The requirements from which PEP was derived, and the system and its performance capabilities are described. Features of the recommended project are presented.

  11. AstroStat - A VO Tool for Statistical Analysis

    CERN Document Server

    Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

    2015-01-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu driven interface. Behind the scenes, all analysis is done using the public domain statistical software - R and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features - as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing power statistical analysis on ...

  12. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
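
    A minimal sketch of the kind of statistical comparison the study reports (the numbers below are synthetic, not the study's data): a one-way ANOVA across maximum-diameter measurements of the same nodules produced by three hypothetical tools.

```python
# Sketch of the study's style of comparison on made-up numbers: one-way ANOVA
# on maximum-diameter measurements of the same five nodules as reported by
# three hypothetical analysis tools.
from scipy import stats

tool_a = [6.1, 8.4, 10.2, 12.9, 15.5]   # mm, one value per nodule
tool_b = [5.7, 8.9, 10.8, 13.6, 16.1]
tool_c = [6.5, 8.1,  9.9, 12.4, 15.0]

f_stat, p_value = stats.f_oneway(tool_a, tool_b, tool_c)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
# A small p-value would indicate systematic disagreement between the tools.
```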

  13. TMVA - Tool-kit for Multivariate Data Analysis in ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Therhaag, Jan; Von Toerne, Eckhard [Univ. Bonn, Physikalisches Institut, Nussallee 12, 53115 Bonn (Germany); Hoecker, Andreas; Speckmayer, Peter [European Organization for Nuclear Research - CERN, CH-1211 Geneve 23 (Switzerland); Stelzer, Joerg [Deutsches Elektronen-Synchrotron - DESY, Platanenallee 6, D-15738 Zeuthen (Germany); Voss, Helge [Max-Planck-Institut fuer Kernphysik - MPI, Postfach 10 39 80, Saupfercheckweg 1, DE-69117 Heidelberg (Germany)

    2010-07-01

    Given the ever-increasing complexity of modern HEP data analysis, multivariate analysis techniques have proven an indispensable tool in extracting the most valuable information from the data. TMVA, the Tool-kit for Multivariate Data Analysis, provides a large variety of advanced multivariate analysis techniques for both signal/background classification and regression problems. In TMVA, all methods are embedded in a user-friendly framework capable of handling the pre-processing of the data as well as the evaluation of the results, thus allowing for a simple use of even the most sophisticated multivariate techniques. Convenient assessment and comparison of different analysis techniques enable the user to choose the most efficient approach for any particular data analysis task. TMVA is an integral part of the ROOT data analysis framework and is widely-used in the LHC experiments. In this talk I will review recent developments in TMVA, discuss typical use-cases in HEP and present the performance of our most important multivariate techniques on example data by comparing it to theoretical performance limits. (authors)
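
    TMVA itself is a C++ toolkit embedded in ROOT; as a hedged stand-in, the following scikit-learn fragment mirrors the workflow shape described above (train a multivariate classifier on labelled events, then evaluate its signal/background separation on a held-out sample) using synthetic data.

```python
# Python analogue (not TMVA) of a multivariate classification workflow:
# generate labelled "events", train a BDT-like method, measure separation.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier().fit(X_tr, y_tr)       # boosted-tree method
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])  # separation metric
print(f"ROC AUC on the held-out sample: {auc:.3f}")
```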

  14. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
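
    The published algorithm runs on the ArcGIS platform and is not reproduced here; a minimal sketch of the underlying idea, under the assumption that a knickpoint appears as an abrupt steepening of channel slope along a downstream elevation profile, might look like this (data and threshold are illustrative):

```python
# Toy knickpoint detector: flag points where the local channel slope steepens
# sharply relative to the preceding reach along a downstream profile.
import numpy as np

distance = np.array([0, 100, 200, 300, 400, 500, 600], dtype=float)      # m
elevation = np.array([250, 246, 242, 238, 222, 218, 214], dtype=float)   # m

slope = np.diff(elevation) / np.diff(distance)   # negative going downstream
steepening = slope[1:] / slope[:-1]              # ratio of adjacent reach slopes
knicks = np.where(steepening > 2.0)[0] + 1       # threshold chosen for illustration
for i in knicks:
    print(f"knickpoint near x = {distance[i]:.0f} m (slope jumps to {slope[i]:.3f})")
```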

  15. Multijoint kinetic chain analysis of knee extension during the soccer instep kick.

    Science.gov (United States)

    Naito, Kozo; Fukui, Yosuke; Maruyama, Takeo

    2010-04-01

    Although previous studies have shown that motion-dependent interactions between adjacent segments play an important role in producing knee extension during the soccer instep kick, detailed knowledge about the mechanisms underlying those interactions is lacking. The present study aimed to develop a 3-D dynamical model for the multijoint kinetic chain of the instep kick in order to quantify the contributions of the causal dynamical factors to the production of maximum angular velocity during knee extension. Nine collegiate soccer players volunteered to participate in the experiment and performed instep kicking movements while 3-D positional data and the ground reaction force were measured. A dynamical model was developed in the form of a linked system containing 8 segments and 18 joint rotations, and the knee extension/flexion motion was decomposed into causal factors related to muscular moment, gyroscopic moment, centrifugal force, Coriolis force, gravity, proximal endpoint linear acceleration, and external force-dependent terms. The rapid knee extension during instep kicking was found to result almost entirely from kicking leg centrifugal force, trunk rotation muscular moment, kicking leg Coriolis force, and trunk rotation gyroscopic-dependent components. Based on the finding that rapid knee extension during instep kicking stems from multiple dynamical factors, it is suggested that the multijoint kinetic chain analysis used in the present study is more useful for achieving a detailed understanding of the cause of rapid kicking leg movement than the previously used 2-D, two-segment kinetic chain model. The present results also indicated that the centrifugal effect due to the kicking hip flexion angular velocity contributed substantially to the generation of a rapid knee extension, suggesting that the adjustment between the kicking hip flexion angular velocity and the leg configuration (knee flexion angle) is more important for effective instep kicking than other

  16. A review of ADM1 extensions, applications, and analysis 2002-2005

    DEFF Research Database (Denmark)

    Batstone, Damien J.; Keller, J.; Steyer, J.-P.

    2006-01-01

    applications such as distributed parameter modelling of biofilms. The key limitations for anaerobic modelling originally identified in the STIR were: (i) regulation of products from glucose fermentation, (ii) parameter values, and variability, and (iii) specific extensions. Parameter analysis has been...... as a virtual industrial system can stimulate modelling of anaerobic processes by researchers and practitioners outside the core expertise of anaerobic processes. It has been used as a default structural element that allows researchers to concentrate on new extensions such,as sulfate reduction, and new...... hydrogen production from carbohydrate-type waste. Critical analysis of the model has mainly focused on model structure reduction, hydrogen inhibition functions, and the default parameter set recommended in the STIR. This default parameter set has largely been verified as a reasonable compromise, especially...

  17. Physics Analysis Tools for the CMS experiment at LHC

    CERN Document Server

    Fabozzi, Francesco; Hegner, Benedikt; Lista, Luca

    2008-01-01

    The CMS experiment is expected to start data taking during 2008, and large data samples, of the Peta-bytes scale, will be produced each year. The CMS Physics Tools package provides the CMS physicist with a powerful and flexible software layer for analysis of these huge datasets that is well integrated in the CMS experiment software. A core part of this package is the Candidate Model providing a coherent interface to different types of data. Standard tasks such as combinatorial analyses, generic cuts, MC truth matching and constrained fitting are supported. Advanced template techniques enable the user to add missing features easily. We explain the underlying model, certain details of the implementation and present some use cases showing how the tools are currently used in generator and full simulation studies as preparation for analysis of real data.

  18. SABRE: A Tool for Stochastic Analysis of Biochemical Reaction Networks

    CERN Document Server

    Didier, Frederic; Mateescu, Maria; Wolf, Verena

    2010-01-01

The importance of stochasticity within biological systems has been shown repeatedly during the last years and has raised the need for efficient stochastic tools. We present SABRE, a tool for stochastic analysis of biochemical reaction networks. SABRE implements fast adaptive uniformization (FAU), a direct numerical approximation algorithm for computing transient solutions of biochemical reaction networks. Biochemical reaction networks represent biological systems studied at a molecular level and these reactions can be modeled as transitions of a Markov chain. SABRE accepts as input the formalism of guarded commands, which it interprets either as continuous-time or as discrete-time Markov chains. Besides operating in a stochastic mode, SABRE may also perform a deterministic analysis by directly computing a mean-field approximation of the system under study. We illustrate the different functionalities of SABRE by means of biological case studies.
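
    FAU itself adaptively prunes the explored state space; as a simplified sketch of the transient computation it accelerates, plain (non-adaptive) uniformization on a tiny three-state continuous-time Markov chain looks like this (generator values are illustrative):

```python
# Plain uniformization on a small CTMC: p(t) = sum_k Poisson(k; lam*t) p0 P^k,
# with P = I + Q/lam the uniformized DTMC kernel.
import numpy as np

Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 0.0,  0.5, -0.5]])        # generator of a 3-state chain
t, p0 = 2.0, np.array([1.0, 0.0, 0.0])

lam = np.max(-np.diag(Q))                 # uniformization rate
P = np.eye(3) + Q / lam                   # DTMC transition kernel
pt, term = np.zeros(3), p0 * np.exp(-lam * t)
for k in range(200):                      # truncated Poisson sum
    pt += term
    term = term @ P * (lam * t) / (k + 1)
print("transient distribution at t=2:", np.round(pt, 4))
```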

19. VO-Dance: an IVOA tool to easily publish data into the VO and its extension to planetology requests

    Science.gov (United States)

    Smareglia, R.; Capria, M. T.; Molinaro, M.

    2012-09-01

Data publishing through self-standing portals can be complemented by VO resource publishing, i.e. astronomical resources deployed through VO-compliant services. Since the IVOA (International Virtual Observatory Alliance) provides many protocols and standards for the various data flavors (images, spectra, catalogues ...), and since the data center's goal is to grow the number of hosted archives and provided services, the idea arose to find a way to easily deploy and maintain VO resources. VO-Dance is a Java web application developed at IA2 that addresses this idea by creating, in a dynamical way, VO resources out of database tables or views. It is structured to be potentially DBMS- and platform-independent and consists of three main tokens: an internal DB to store resource descriptions and data model metadata, a RESTful web application to deploy the resources to the VO community. Its extension to planetology requests is under study to make the best of INAF software development and archive efficiency.

  20. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    Science.gov (United States)

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the usage of several different plug-ins, a significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the somatosensorial primary cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat.

  1. Systematic analysis of transverse momentum distribution and non-extensive thermodynamics theory

    CERN Document Server

    Sena, I

    2012-01-01

A systematic analysis of transverse momentum distribution of hadrons produced in ultra-relativistic p+p and A+A collisions is presented. We investigate the effective temperature and the entropic parameter from the non-extensive thermodynamic theory of strong interaction. We conclude that the existence of a limiting effective temperature and of a limiting entropic parameter is in accordance with experimental data.
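
    The abstract does not reproduce the fitted distribution; for orientation only, a commonly used Tsallis-type parametrization of the transverse-momentum spectrum (the exact form varies between analyses) is:

```latex
\frac{1}{p_T}\,\frac{d^2N}{dp_T\,dy} \;\propto\;
  \left[\, 1 + (q-1)\,\frac{m_T - m_0}{T} \,\right]^{-\frac{1}{q-1}},
\qquad m_T = \sqrt{p_T^2 + m_0^2}
```

    where T plays the role of the effective temperature and q is the entropic parameter; the Boltzmann exponential is recovered in the limit q → 1.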

  2. COMPARISON OF MALAYSIA MANUFACTURING COMPANIES BY FINANCIAL STATEMENT ANALYSIS TOOLS

    OpenAIRE

    MALEK, Afagh; Mohammadi, Maryam; NASSIRI, Fardokht

    2012-01-01

    One of the best ways to get the expected results from trading in the stock market is to acquire a good evaluation of companies’ performance. Similarly, this study aims at comparing the financial performance of Lb Aluminium Berhad and Seal Incorporated Berhad manufacturing companies, which are listed in the main market of Malaysian stock exchange. The data were gathered from the annual reports of companies during last three years and analysed by financial statement analysis tools, which are ...

  3. Ethics Auditing and Conflict Analysis as Management Tools

    OpenAIRE

    2008-01-01

This paper deals with management tools like conflict analysis and ethics auditing. Ethics auditing is understood as an opportunity and agreement to devise a system to inform on ethical corporate behaviour. This system essentially aims to increase the transparency and credibility of a company's commitment to ethics. At the same time, the process of elaborating this system allows us to introduce the moral dimension into the company's actions and decisions, thereby completing a key dimension of ...

  4. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  5. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
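
    As a toy sketch of the temporal processing the abstract describes (not the TSPT code), a per-pixel maximum-value composite over a stack of daily NDVI frames suppresses cloud-contaminated values, which bias NDVI low; the data below are synthetic.

```python
# Maximum-value compositing: for each pixel, keep the largest NDVI observed
# over the window, since clouds depress NDVI rather than raise it.
import numpy as np

rng = np.random.default_rng(0)
ndvi_stack = rng.uniform(0.5, 0.8, size=(16, 4, 4))   # 16 days, 4x4 pixels
cloudy = rng.random(ndvi_stack.shape) < 0.3           # ~30% cloud hits
ndvi_stack[cloudy] *= 0.2                             # clouds depress NDVI

composite = ndvi_stack.max(axis=0)                    # per-pixel max over time
print("16-day maximum-value composite:\n", np.round(composite, 2))
```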

  6. Judo match analysis,a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

In this second paper on match analysis, we analyze the competition steps in depth, showing the evolution of this tool at the National Federation level on the basis of our first classification. Furthermore, it is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, supporting in a useful way the difficult task of the coach, or better, of national or Olympic coaching teams. In this paper a deeper study of judo competitions at high level is presented, both from the male and female point of view, explaining in the light of biomechanics not only the evolution of throws in time, with the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  7. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  8. Pathway-based analysis tools for complex diseases: a review.

    Science.gov (United States)

    Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

    2014-10-01

    Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods-the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.

  9. Pathway-based Analysis Tools for Complex Diseases: A Review

    Directory of Open Access Journals (Sweden)

    Lv Jin

    2014-10-01

    Full Text Available Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods—the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.

  10. Shot planning and analysis tools on the NIF project

    Energy Technology Data Exchange (ETDEWEB)

    Beeler, R. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Casey, A., E-mail: casey20@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA (United States); Conder, A.; Fallejo, R.; Flegel, M.; Hutton, M.; Jancaitis, K.; Lakamsani, V.; Potter, D.; Reisdorf, S.; Tappero, J.; Whitman, P.; Carr, W.; Liao, Z. [Lawrence Livermore National Laboratory, Livermore, CA (United States)

    2012-12-15

Highlights: • Target shots in NIF, dozens a month, vary widely in laser and target configuration. • A planning tool helps select shot sequences that optimize valuable facility time. • Fabrication and supply of targets, diagnostics, etc. are integrated into the plan. • Predictive modeling of aging parts (e.g., optics) aids maintenance decision support. • We describe the planning/analysis tool and its use in NIF experimental operations. - Abstract: Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Campaign Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics

  11. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  12. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2016-03-04

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  13. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software that offers the best compromise between “scalability” and “ease-of-use.” The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT’s inclusive framework supports Message Passing Interface (MPI) parallelism as well as task farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others, in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
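
    A minimal mpi4py sketch in the spirit of the MPI-parallel Python scripts mentioned above (the file names and the per-rank "analysis" are placeholders):

```python
# Task-farming sketch with mpi4py: each rank processes its own slice of the
# input file list, and rank 0 gathers the per-rank results.
# Run with, e.g.:  mpiexec -n 4 python farm.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

files = [f"climate_{i:03d}.nc" for i in range(12)]   # hypothetical inputs
mine = files[rank::size]                             # round-robin assignment
local_result = len(mine)                             # stand-in for real analysis

totals = comm.gather(local_result, root=0)
if rank == 0:
    print("files processed per rank:", totals)
```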

  14. Graphical tools for network meta-analysis in STATA.

    Science.gov (United States)

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  15. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Full Text Available Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  16. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  17. Use of Grid Tools to Support CMS Distributed Analysis

    CERN Document Server

    Fanfani, A; Anjum, A; Barrass, T; Bonacorsi, D; Bunn, J; Corvo, M; Darmenov, N; De Filippis, N; Donno, F; Donvito, G; Eulisse, G; Fanzago, F; Filine, A; Grandi, C; Hernández, J M; Innocente, V; Jan, A; Lacaprara, S; Legrand, I; Metson, S; Newman, H; Silvestris, L; Steenberg, C; Stockinger, H; Taylor, L; Thomas, M; Tuura, L; Van Lingen, F; Wildish, T

    2004-01-01

In order to prepare the Physics Technical Design Report, due by the end of 2005, the CMS experiment needs to simulate, reconstruct and analyse about 100 million events, corresponding to more than 200 TB of data. The data will be distributed to several Computing Centres. In order to provide access to the whole data sample to all the world-wide dispersed physicists, CMS is developing a layer of software that uses the grid tools provided by the LCG project to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting analysis jobs. The GRID tools used are both those already available in the LCG-2 release and those being developed in the framework of the ARDA project. This work describes the current status and the future development...

  18. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  19. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We described the process for enabling the fire location, smoke forecast, smoke observation, and
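
    As a hedged sketch of one step in such a service chain, the fragment below issues a WCS 1.0.0 GetCoverage request over HTTP; the endpoint URL and coverage name are placeholders, and real services differ in version and parameter details.

```python
# Fetch a coverage from a (hypothetical) OGC WCS endpoint via a plain
# key-value GetCoverage request and save the returned GeoTIFF.
import requests

WCS_URL = "https://example.org/wcs"          # placeholder endpoint
params = {
    "service": "WCS", "version": "1.0.0", "request": "GetCoverage",
    "coverage": "smoke_forecast_pm25",       # hypothetical coverage id
    "bbox": "-125,30,-110,45", "crs": "EPSG:4326",
    "width": "256", "height": "256", "format": "GeoTIFF",
}
resp = requests.get(WCS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("smoke.tif", "wb") as fh:
    fh.write(resp.content)
```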

  20. Extensions to the Systematic Error and Risk Analysis (SERA) Software Tool

    Science.gov (United States)

    2004-02-05

The SERA software was enriched with a function that accepts text recorded on a Sony IC voice recorder and transcribed by the Dragon NaturallySpeaking software; SERA manages the insertion of the transcribed text...

  1. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

A package (a tool model) for a program predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients, using sparse matrix technology, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation for the model equation and its sensitivity coefficient equations, and their Jacobian matrix, is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28 and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
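
    The direct method on the simplest possible mechanism can be sketched in a few lines: for first-order decay with rate constant k, the sensitivity s = dc/dk obeys ds/dt = -c - k*s and is integrated alongside the concentration. SciPy is used here purely for illustration; the package itself generates FORTRAN subroutines.

```python
# Direct-method sketch for A -> B with rate k: integrate the model equation
# dc/dt = -k*c together with its sensitivity equation ds/dt = -c - k*s.
from scipy.integrate import solve_ivp

k = 0.5

def rhs(t, y):
    c, s = y
    return [-k * c, -c - k * s]

sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0])
c, s = sol.y[:, -1]
print(f"c(5) = {c:.4f}, dc/dk at t=5 = {s:.4f}")   # analytic: -t * exp(-k*t)
```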

  2. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET/C# version is used for development. It leverages code samples shared by the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  3. FC-NIRS: A Functional Connectivity Analysis Tool for Near-Infrared Spectroscopy Data

    Directory of Open Access Journals (Sweden)

    Jingping Xu

    2015-01-01

    Full Text Available Functional near-infrared spectroscopy (fNIRS, a promising noninvasive imaging technique, has recently become an increasingly popular tool in resting-state brain functional connectivity (FC studies. However, the corresponding software packages for FC analysis are still lacking. To facilitate fNIRS-based human functional connectome studies, we developed a MATLAB software package called “functional connectivity analysis tool for near-infrared spectroscopy data” (FC-NIRS. This package includes the main functions of fNIRS data preprocessing, quality control, FC calculation, and network analysis. Because this software has a friendly graphical user interface (GUI, FC-NIRS allows researchers to perform data analysis in an easy, flexible, and quick way. Furthermore, FC-NIRS can accomplish batch processing during data processing and analysis, thereby greatly reducing the time cost of addressing a large number of datasets. Extensive experimental results using real human brain imaging confirm the viability of the toolbox. This novel toolbox is expected to substantially facilitate fNIRS-data-based human functional connectome studies.
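
    The central FC computation reduces to a channel-by-channel correlation of preprocessed time series; a minimal NumPy sketch (synthetic signals, not the FC-NIRS code) is:

```python
# Resting-state functional connectivity as a Pearson correlation matrix over
# fNIRS channel time series (here: synthetic signals with one injected pair).
import numpy as np

rng = np.random.default_rng(1)
signals = rng.standard_normal((8, 600))   # 8 channels x 600 time points
signals[1] += 0.8 * signals[0]            # make channels 0 and 1 correlated

fc = np.corrcoef(signals)                 # 8x8 connectivity matrix
print("FC(ch0, ch1) =", round(fc[0, 1], 3))
```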

  4. GLIDER: Free tool imagery data visualization, analysis and mining

    Science.gov (United States)

    Ramachandran, R.; Graves, S. J.; Berendes, T.; Maskey, M.; Chidambaram, C.; Hogan, P.; Gaskin, T.

    2009-12-01

Satellite imagery can be analyzed to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of such thematic information can vary from military applications such as detecting assets of interest to science applications such as characterizing land-use/land cover change at local, regional and global scales. However, extracting thematic information using satellite imagery is a non-trivial task. It requires a user to preprocess the data by applying operations for radiometric and geometric corrections. The user also needs to be able to visualize the data and apply different image enhancement operations to digitally improve the images to identify subtle information that might be otherwise missed. Finally, the user needs to apply different information extraction algorithms to the imagery to obtain the thematic information. At present, there are limited tools that provide users with the capability to easily extract and exploit the information contained within the satellite imagery. This presentation will present GLIDER, a free software tool addressing this void. GLIDER provides users with an easy-to-use tool to visualize, analyze and mine satellite imagery. GLIDER allows users to visualize and analyze satellite imagery in its native sensor view, an important capability because any transformation to either a geographic coordinate system or any projected coordinate system entails spatial and intensity interpolation and, hence, loss of information. GLIDER allows users to perform their analysis in the native sensor view without any loss of information. GLIDER provides users with a full suite of image processing algorithms that can be used to enhance the satellite imagery. It also provides pattern recognition and data mining algorithms for information extraction. GLIDER allows its users to project satellite data and the analysis/mining results onto a globe and overlay additional data layers. Traditional analysis

  5. Modal interval analysis: new tools for numerical information

    CERN Document Server

    Sainz, Miguel A; Calm, Remei; Herrero, Pau; Jorba, Lambert; Vehi, Josep

    2014-01-01

    This book presents an innovative approach to interval analysis. Modal Interval Analysis (MIA) is an attempt to go beyond the limitations of classic intervals in terms of their structural, algebraic and logical features. The starting point of MIA is quite simple: it consists of defining a modal interval that attaches a quantifier to a classical interval and of introducing the basic relation of inclusion between modal intervals by means of the inclusion of the sets of predicates they accept. This modal approach introduces interval extensions of the real continuous functions, identifies equivalences between logical formulas and interval inclusions, and provides the semantic theorems that justify these equivalences, along with guidelines for arriving at these inclusions. Applications of these equivalences in different areas illustrate the obtained results. The book also presents a new interval object: marks, which aspire to be a new form of numerical treatment of errors in measurements and computations.
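
    For readers unfamiliar with the starting point that MIA generalizes, the sketch below implements classic set-based interval arithmetic and inclusion in Python; the modal quantifiers themselves are beyond this toy, and the class is purely illustrative.

    ```python
    # Background sketch: classic (set-based) interval arithmetic, the
    # foundation that Modal Interval Analysis extends with quantifiers.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Interval:
        lo: float
        hi: float

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            products = (self.lo * other.lo, self.lo * other.hi,
                        self.hi * other.lo, self.hi * other.hi)
            return Interval(min(products), max(products))

        def contains(self, other):
            # Set inclusion; MIA generalizes this to modal-interval inclusion.
            return self.lo <= other.lo and other.hi <= self.hi

    print(Interval(1, 2) + Interval(-1, 3))  # Interval(lo=0, hi=5)
    ```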

  6. An open source tool for heart rate variability spectral analysis.

    Science.gov (United States)

    Rodríguez-Liñares, L; Méndez, A J; Lado, M J; Olivieri, D N; Vila, X A; Gómez-Conde, I

    2011-07-01

    In this paper we describe a software package for heart rate variability analysis. This package, called RHRV, is a third-party extension for the open-source statistical environment R, and can be freely downloaded from the R-CRAN repository. We review the state of the art of software related to the analysis of heart rate variability (HRV). Based upon this review, we motivate the development of an open-source software platform which can be used for developing new algorithms for studying HRV or for performing clinical experiments. In particular, we show how the RHRV package greatly simplifies and accelerates the work of the computer scientist or medical specialist in the HRV field. We illustrate the utility of our package with practical examples.
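
    To make the kind of computation concrete, here is a sketch of frequency-domain HRV analysis: interpolating RR intervals onto a uniform grid, estimating the power spectrum with Welch's method, and integrating the conventional LF and HF bands. It is written in Python for illustration (RHRV itself is an R package), and the 4 Hz resampling rate and synthetic RR series are assumptions.

    ```python
    # Concept sketch of spectral HRV analysis, not RHRV code.
    import numpy as np
    from scipy.signal import welch

    def hrv_bands(rr_ms, fs=4.0):
        t = np.cumsum(rr_ms) / 1000.0              # beat times in seconds
        grid = np.arange(t[0], t[-1], 1.0 / fs)
        rr_even = np.interp(grid, t, rr_ms)        # uniformly sampled RR series
        f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
        lf_mask = (f >= 0.04) & (f < 0.15)
        hf_mask = (f >= 0.15) & (f < 0.40)
        return np.trapz(psd[lf_mask], f[lf_mask]), np.trapz(psd[hf_mask], f[hf_mask])

    # Synthetic RR series (ms) with a ~0.1 Hz oscillation, purely illustrative
    rr = 1000 + 50 * np.sin(2 * np.pi * 0.1 * np.arange(300))
    print(hrv_bands(rr))
    ```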

  7. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
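
    As a concrete instance of the parametric/non-parametric distinction summarized above, the sketch below compares two synthetic samples with both a t-test and its rank-based counterpart; the data and effect size are invented.

    ```python
    # Parametric vs. non-parametric comparison of two independent samples.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    group_a = rng.normal(5.0, 1.0, size=30)
    group_b = rng.normal(5.6, 1.0, size=30)

    t_stat, t_p = stats.ttest_ind(group_a, group_b)      # parametric
    u_stat, u_p = stats.mannwhitneyu(group_a, group_b)   # non-parametric
    print(f"t-test p={t_p:.4f}, Mann-Whitney p={u_p:.4f}")
    ```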

  8. Basic statistical tools in research and data analysis

    Science.gov (United States)

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  9. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
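
    The per-term test that such enrichment tools build on can be stated in a few lines; the sketch below runs a hypergeometric over-representation test with SciPy. The counts are hypothetical, and GOMA's module-level optimization model goes beyond this single-term test.

    ```python
    # Standard GO-term over-representation test (hypergeometric).
    from scipy.stats import hypergeom

    M = 20000   # genes in the background
    n = 150     # background genes annotated with the GO term
    N = 400     # genes in the experimental list
    k = 12      # list genes annotated with the term

    # P(X >= k): probability of seeing at least k annotated genes by chance
    p_value = hypergeom.sf(k - 1, M, n, N)
    print(f"enrichment p-value = {p_value:.3g}")
    ```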

  10. Accounting and Financial Data Analysis Using Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment. With the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis becomes possible, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. The literature offers many definitions of data mining, but they all come down to the same idea: it is a process for extracting new information from large data collections, information that would be very difficult to obtain without data mining tools. Information obtained by the data mining process has the advantage that it not only answers the question of what is happening but at the same time shows why it is happening. In this paper we present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  11. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
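
    As one example of the routine calculation such analysis tools automate, the sketch below implements the common 2^-ΔΔCt relative-quantification step; the Ct values are invented and the ~100% amplification efficiency is an assumption.

    ```python
    # Delta-delta-Ct relative quantification, assuming ~100% efficiency.
    def fold_change(ct_target_treated, ct_ref_treated,
                    ct_target_control, ct_ref_control):
        """Relative expression of a target gene, normalized to a reference
        gene, in a treated sample versus a control sample."""
        d_ct_treated = ct_target_treated - ct_ref_treated
        d_ct_control = ct_target_control - ct_ref_control
        dd_ct = d_ct_treated - d_ct_control
        return 2.0 ** (-dd_ct)

    print(fold_change(22.1, 18.0, 24.3, 18.1))  # ~4.3-fold up-regulation
    ```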

  12. Anaphe—OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O. Couet; B. Ferrero-Merlino; et al.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an object-oriented software environment for data analysis in HENP experiments. A range of commercial and public-domain libraries is used to cover basic functionalities; on top of these libraries, a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with user requirements for a command-line-driven tool, we have chosen a scripting language (Python) as the front-end for the data analysis tool. The loose coupling provided by the consistent use of (AIDA-compliant) abstract interfaces for each component, in combination with the use of shared libraries for their implementation, provides easy integration of existing libraries into modern scripting languages and thus allows for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the abstract interfaces almost at a one-to-one level. This paper gives an overview of the architecture and design choices and presents the current status and future developments of the project.

  13. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximizing the liquefaction yield of the plant subject to constraints on other parameters. The analysis results give a clear idea of suitable parameter values before implementation of the actual plant in the field, as well as of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
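
    As a back-of-the-envelope version of the balance a simulator evaluates rigorously, the sketch below computes the ideal liquid yield of a Linde-Hampson cycle from warm-end enthalpies, y = (h1 - h2)/(h1 - hf). The enthalpy values are placeholders, not real air properties; a tool like Aspen HYSYS obtains them from its property packages.

    ```python
    # Ideal Linde-Hampson liquid yield from a warm-end energy balance.
    def linde_yield(h1, h2, hf):
        """h1: low-pressure gas at the warm end; h2: high-pressure gas at the
        warm end; hf: saturated liquid. All enthalpies in kJ/kg."""
        return (h1 - h2) / (h1 - hf)

    # Placeholder enthalpies, for illustration only
    print(f"liquid fraction y = {linde_yield(h1=460.0, h2=430.0, hf=30.0):.3f}")
    ```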

  15. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low-noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program 2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of the perception of noise in a community. ANOPP2's capability to incorporate medium-fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and a Hybrid Wing Body (HWB) aircraft using medium-fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel is presented. The results are in the form of community noise metrics and

  16. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    Science.gov (United States)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g., Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing, with typical control requirements on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g., solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g., phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  17. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Integrating multi-omics data that cover the genome, transcriptome, proteome, and metabolome with mathematical models is expected to expand our knowledge of complex plant metabolism.

  18. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

    Full Text Available The simulation of a rubber shock absorber for a machine tool was studied. A simple material model of rubber was obtained through the finite element analysis software ABAQUS. The compression speed and the hardness of the rubber material were varied to obtain the deformation law of the rubber shock absorber. The likely fatigue locations were identified from the simulation results. The results show that the fatigue positions are distributed in the corners of the shock absorber, that the degree of deformation increases with increasing compression speed, and that the hardness of the rubber material is proportional to the deformation.

  19. 3D-Aided-Analysis Tool for Lunar Rover

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; LI Guo-peng; REN Xin; LIU Jian-jun; GAO Xing-ye; ZOU Xiao-duan

    2013-01-01

    3D-Aided-Analysis Tool (3DAAT), a virtual reality system, is built in this paper. 3DAAT integrates kinematic and dynamic models of the rover as well as a real lunar surface terrain model. The modeling methods proposed in this paper include constructing the lunar surface, constructing 3D models of the lander and rover, and building up a kinematic model of the rover body. Photogrammetry techniques and remote sensing information are used to generate the terrain model of the lunar surface. According to the implementation results, 3DAAT is an effective assistance system for making exploration plans and analyzing the status of the rover.

  20. Gene Knockout Identification Using an Extension of Bees Hill Flux Balance Analysis

    Directory of Open Access Journals (Sweden)

    Yee Wen Choon

    2015-01-01

    Full Text Available Microbial strain optimisation for the overproduction of a desired phenotype has been a popular topic in recent years. Gene knockout is a genetic engineering technique that can modify the metabolism of microbial cells to obtain desirable phenotypes. Optimisation algorithms have been developed to identify the effects of gene knockout. However, the complexity of metabolic networks has made the process of identifying the effects of genetic modification on desirable phenotypes challenging. Furthermore, the vast number of reactions in cellular metabolism often leads to a combinatorial problem in obtaining optimal gene knockouts. The computational time increases exponentially as the size of the problem increases. This work reports an extension of Bees Hill Flux Balance Analysis (BHFBA) to identify optimal gene knockouts that maximise the production yield of desired phenotypes while sustaining the growth rate. The proposed method integrates OptKnock into BHFBA to validate the results automatically. The results show that the extension of BHFBA is suitable, reliable, and applicable for predicting gene knockouts. Through several experiments conducted on Escherichia coli, Bacillus subtilis, and Clostridium thermocellum as model organisms, the extension of BHFBA showed better performance in terms of computational time, stability, growth rate, and production yield of desired phenotypes.
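
    The optimisation core that such knockout searches repeatedly evaluate is a flux balance analysis (FBA) linear programme. The sketch below solves a four-reaction toy model with SciPy and simulates a knockout by closing one reaction's flux bounds; the network is invented for illustration, and genome-scale work uses dedicated solvers rather than this toy.

    ```python
    # Toy FBA with a simulated gene knockout.
    import numpy as np
    from scipy.optimize import linprog

    # Columns: uptake, pathway R1, pathway R2, biomass. Rows: metabolites A, B.
    S = np.array([[1, -1, -1,  0],
                  [0,  1,  1, -1]])
    c = [0, 0, 0, -1]                      # maximise biomass flux
    bounds = [(0, 10), (0, 8), (0, 4), (0, None)]

    # Steady state S v = 0 with the given flux bounds
    wild_type = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)

    ko_bounds = list(bounds)
    ko_bounds[1] = (0, 0)                  # "knock out" the gene behind R1
    knockout = linprog(c, A_eq=S, b_eq=[0, 0], bounds=ko_bounds)

    print(f"biomass: wild type {-wild_type.fun:.1f}, knockout {-knockout.fun:.1f}")
    ```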

  1. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  2. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force manufacturers to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has shown that more efficient and less costly PV designs should be possible by using SAVANT to predict on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  3. Extensions to DSD theory: Analysis of PBX 9502 rate stick data

    Energy Technology Data Exchange (ETDEWEB)

    Aslam, T.D.; Bdzil, J.B.; Hill, L.G.

    1998-12-31

    Recent extensions to DSD theory and modeling argue that the intrinsic front propagation law can depend on variables in addition to the total shock-front curvature. Here the authors outline this work and present results of high-resolution numerical simulations of 2D detonation that verify the theory on some points, but disagree with it on others. Chief among these is the verification of the extended propagation laws and the observation that the curvature is infinite at the HE boundary. The authors discuss how these results impact the analysis of PBX 9502.

  4. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis.

    Science.gov (United States)

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can improve variant analysis. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  5. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    Science.gov (United States)

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  6. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options facilitate also the definition of shortest regions of overlap and simplify the

  7. PyRAT (python radiography analysis tool): overview

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problems to recover partial information on object configurations; (5) using a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem - initial data analysis + material library -> a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.
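
    Feature (4) refers to a standard technique that can be shown compactly; the sketch below solves a small linear inverse problem with Tikhonov regularization in NumPy. It is the textbook method on invented data, not PyRAT's actual code.

    ```python
    # Tikhonov-regularised solution of min ||Ax - b||^2 + lam^2 ||x||^2.
    import numpy as np

    def tikhonov(A, b, lam):
        n = A.shape[1]
        # Solve the regularised normal equations (A^T A + lam^2 I) x = A^T b
        return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 20))
    x_true = rng.standard_normal(20)
    b = A @ x_true + 0.05 * rng.standard_normal(50)   # noisy measurements
    x_hat = tikhonov(A, b, lam=0.1)
    print(np.linalg.norm(x_hat - x_true))             # small recovery error
    ```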

  8. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain-specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
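
    A minimal sketch of the filtering-plus-triggering idea follows: regex patterns classify log messages and a repetition rule flags correlated failures. The patterns, component names, and threshold rule are invented for illustration; the real engine expresses such rules in its own domain-specific language.

    ```python
    # Toy log-message correlation: pattern matching plus a triggering rule.
    import re
    from collections import Counter

    PATTERNS = {
        "timeout": re.compile(r"timeout talking to (\w+)"),
        "buffer":  re.compile(r"buffer overflow on (\w+)"),
    }

    def scan(messages, threshold=2):
        hits = Counter()
        for msg in messages:
            for name, pattern in PATTERNS.items():
                match = pattern.search(msg)
                if match:
                    hits[(name, match.group(1))] += 1
        # Triggering rule: the same failure class repeats on the same component
        return [key for key, n in hits.items() if n >= threshold]

    log = ["timeout talking to dcm01", "timeout talking to dcm01",
           "buffer overflow on evb03"]
    print(scan(log))  # [('timeout', 'dcm01')]
    ```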

  9. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  10. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  11. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design phase of the product development process.

  12. Online tools for polyphasic analysis of Mycobacterium tuberculosis complex genotyping data: now and next.

    Science.gov (United States)

    Weniger, Thomas; Krawczyk, Justina; Supply, Philip; Harmsen, Dag; Niemann, Stefan

    2012-06-01

    Molecular diagnostics and genotyping of pathogens have become indispensable tools in clinical microbiology and disease surveillance. For isolates of the Mycobacterium tuberculosis complex (MTBC, causative agents of tuberculosis), multilocus variable number tandem repeat analysis (MLVA) targeting mycobacterial interspersed repetitive units (MIRU) has been internationally adopted as the new standard, portable, reproducible, and discriminatory typing method. Here, we review new sets of specialized web-based bioinformatics tools that have become available for analyzing MLVA data, especially in combination with other, complementary genotyping markers (polyphasic analysis). Currently, there are only two databases available that are not restricted to storing one kind of genotyping data only, namely SITVIT/SpolDB4 and MIRU-VNTRplus. SITVIT/SpolDB4 (http://www.pasteur-guadeloupe.fr:8081/SITVITDemo) contains spoligotyping data from a large number of strains of diverse origin. However, besides options to query the data, the current version of SITVIT/SpolDB4 offers no functionality for more complex analysis, e.g. tree-based analysis. In comparison, the MIRU-VNTRplus web application (http://www.miru-vntrplus.org) represents a freely accessible service that enables users to analyze genotyping data of their strains alone or in comparison with a currently limited but well-characterized reference database of strains representing the major MTBC lineages. Data (MLVA, spoligotype, large sequence polymorphism, and single nucleotide polymorphism) can be visualized and analyzed using just one genotyping method or a weighted combination of several markers. A variety of analysis tools are available, such as creation of phylogenetic and minimum spanning trees, semi-automated phylogenetic lineage identification based on comparison with the reference database, and mapping of geographic information. To facilitate scientific communication, a universal, expanding genotype nomenclature (MLVA MtbC15

  13. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  14. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis, considered individually and, when possible, in statistical combination.

  15. CRITICA: coding region identification tool invoking comparative analysis

    Science.gov (United States)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
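
    The non-comparative signal mentioned above (dicodon bias) can be sketched directly: score a candidate region by the log-likelihood ratio of its hexanucleotide frequencies under coding versus background models. The toy sequences and frequency tables below are stand-ins; CRITICA derives its tables iteratively from the genome being analyzed.

    ```python
    # Toy dicodon-bias scoring from hexanucleotide frequencies.
    import math
    from collections import Counter

    def hexamer_freqs(seq):
        counts = Counter(seq[i:i + 6] for i in range(len(seq) - 5))
        total = sum(counts.values())
        return {h: n / total for h, n in counts.items()}

    def dicodon_score(candidate, coding_freqs, background_freqs, floor=1e-6):
        score = 0.0
        for i in range(0, len(candidate) - 5, 3):   # step in codon frame
            hexamer = candidate[i:i + 6]
            p_cod = coding_freqs.get(hexamer, floor)
            p_bg = background_freqs.get(hexamer, floor)
            score += math.log(p_cod / p_bg)
        return score   # positive => hexamer usage looks coding-like

    coding = hexamer_freqs("ATGGCTGCTAAAGCTGCTATGGCTGCT" * 10)
    background = hexamer_freqs("ATCGGATCCGTAGCTAGCTTAGGCATG" * 10)
    print(dicodon_score("ATGGCTGCTAAAGCT", coding, background) > 0)  # True
    ```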

  16. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  17. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2014-02-03

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracy of the data analysis and models derived from IPMP 2013 was compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggests that IPMP 2013 can be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology.
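
    As an illustration of the model-fitting step such a tool wraps in a GUI, the sketch below fits a three-parameter logistic growth curve (no lag phase) to synthetic log-count data with SciPy; the data points and starting values are invented, and IPMP 2013 additionally offers Huang, Baranyi, Gompertz, survival, and secondary models.

    ```python
    # Least-squares fit of a three-parameter logistic growth model.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, ymax, k, t_mid):
        """Log10 counts over time; no lag phase."""
        return ymax / (1.0 + np.exp(-k * (t - t_mid)))

    t = np.array([0, 2, 4, 6, 8, 10, 12, 16, 20], dtype=float)   # hours
    y = np.array([0.4, 0.9, 1.9, 3.6, 5.6, 7.1, 7.9, 8.4, 8.5])  # log10 CFU/g

    params, _ = curve_fit(logistic, t, y, p0=[8.5, 0.5, 6.0])
    print(dict(zip(["ymax", "k", "t_mid"], params.round(2))))
    ```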

  18. Cellular barcoding tool for clonal analysis in the hematopoietic system.

    Science.gov (United States)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J C; de Haan, Gerald; Bystrykh, Leonid V

    2010-04-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting or polymerase chain reaction-based methods. Although these methods are useful in principle, they generally provide a low-resolution, biased, and incomplete assessment of clonality. To overcome those limitations, we labeled retroviral vectors with random sequence tags or "barcodes." On integration, each vector introduces a unique, identifiable, and heritable mark into the host cell genome, allowing the clonal progeny of each cell to be tracked over time. By coupling the barcoding method to a sequencing-based detection system, we could identify major and minor clones in 2 distinct cell culture systems in vitro and in a long-term transplantation setting. In addition, we demonstrate how clonal analysis can be complemented with transgene expression and integration site analysis. This cellular barcoding tool permits a simple, sensitive assessment of clonality and holds great promise for future gene therapy protocols in humans, and any other applications when clonal tracking is important.

  19. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improved cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (high-speed steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  20. Use of stakeholder analysis to inform risk communication and extension strategies for improved biosecurity amongst small-scale pig producers.

    Science.gov (United States)

    Hernández-Jover, M; Gilmour, J; Schembri, N; Sysak, T; Holyoake, P K; Beilin, R; Toribio, J-A L M L

    2012-05-01

    Extension and communication needs amongst small-scale pig producers, defined as pig producers with fewer than 100 sows, have been previously identified. These producers, who are believed to pose a biosecurity risk to commercial livestock industries, are characterized by a lack of formal networks, mistrust of authorities, poor disease-reporting behaviour, motivational diversity, and reliance on other producers, veterinarians and family for pig health and production advice. This paper applies stakeholder identification and analysis tools to determine stakeholders' influence on and interest in pig producers' practices. The findings can inform a risk communication process and the development of an extension framework to increase producers' engagement with industry and their compliance with biosecurity standards and legislation in Australia. The process included identification of stakeholders, their issues of concern regarding small-scale pig producers and biosecurity, and their influence and interest in each of these issues. This exercise identified the capacity of different stakeholders to influence the outcomes for each issue and assessed their success or failure in doing so. The disconnection identified between the level of interest and influence suggests that government and industry need to work with small-scale pig producers and with those who have the capacity to influence them. Successful biosecurity risk management will depend on shared responsibility and building trust amongst stakeholders. Flow-on effects may include legitimating the importance of reporting and compliance systems and the co-management of risk. Compliance of small-scale pig producers with biosecurity industry standards and legislation will reduce the risks of entry and spread of exotic diseases in Australia.

  1. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind, which means that such tools are generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators’ needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  2. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  3. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate-change-related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells, then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  4. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used to generate independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3,000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  5. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first takes a qualitative focus, reviewing web analytics tools and exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach and presenting output data from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees of 200 Croatian firms, in either the IT or the marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market can offer to management, based on the growing need to understand and predict global market trends.

  6. Criterion-Related Validity of Sit-and-Reach Tests for Estimating Hamstring and Lumbar Extensibility: a Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Daniel Mayorga-Vega

    2014-03-01

    Full Text Available The main purpose of the present meta-analysis was to examine the scientific literature on the criterion-related validity of sit-and-reach tests for estimating hamstring and lumbar extensibility. For this purpose, relevant studies were searched in seven electronic databases dated up through December 2012. Primary outcomes of criterion-related validity were Pearson's zero-order correlation coefficients (r) between sit-and-reach tests and hamstring and/or lumbar extensibility criterion measures. From the included studies, the Hunter-Schmidt psychometric meta-analysis approach was conducted to estimate the population criterion-related validity of sit-and-reach tests. First, the corrected correlation mean (rp), unaffected by statistical artefacts (i.e., sampling error and measurement error), was calculated separately for each sit-and-reach test. Subsequently, three potential moderator variables (sex of participants, age of participants, and level of hamstring extensibility) were examined by a partially hierarchical analysis. Of the 34 studies included in the present meta-analysis, 99 correlation values across eight sit-and-reach tests were retrieved for hamstring extensibility and 51 across seven sit-and-reach tests for lumbar extensibility. The overall results showed that all sit-and-reach tests had a moderate mean criterion-related validity for estimating hamstring extensibility (rp = 0.46-0.67), but a low mean validity for estimating lumbar extensibility (rp = 0.16-0.35). Generally, females, adults and participants with high levels of hamstring extensibility tended to have greater mean values of criterion-related validity for estimating hamstring extensibility. When the use of angular tests is limited, such as in a school setting or in large-scale studies, scientists and practitioners could use the sit-and-reach tests as a useful alternative for estimating hamstring extensibility, but not lumbar extensibility.

  7. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  8. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel...... screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods...... and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data....

  9. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved in software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to lay out the UML sequence diagram automatically and analyses them according to the criteria required for diagram perception.

  10. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
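
    The package under review is written in R; purely to illustrate the sparse PCA idea attributed to Zou, Hastie, and Tibshirani (2006), here is a minimal Python sketch using scikit-learn's independent SparsePCA implementation on toy data:

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# Toy data standing in for a high-dimensional dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))

# Sparse principal components: the L1 penalty (alpha) drives many
# loadings to exactly zero, which eases interpretation.
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0)
scores = spca.fit_transform(X)
print("fraction of zero loadings:", (spca.components_ == 0).mean())
```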

  11. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is VERY important to note that the tests below must be performed for both the Graphical User Interface and the script!! The examples are illustrated using a scripting perspective, because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.

  12. Extensive next-generation sequencing analysis in chronic lymphocytic leukemia at diagnosis: clinical and biological correlations

    Directory of Open Access Journals (Sweden)

    Gian Matteo Rigolin

    2016-09-01

    Full Text Available Abstract Background In chronic lymphocytic leukemia (CLL), next-generation sequencing (NGS) analysis represents a sensitive, reproducible, and resource-efficient technique for routine screening of gene mutations. Methods We performed an extensive biologic characterization of newly diagnosed CLL, including NGS analysis of 20 genes frequently mutated in CLL and karyotype analysis, to assess whether NGS and karyotype results could be of clinical relevance in the refinement of prognosis and assessment of risk of progression. The genomic DNA from peripheral blood samples of 200 consecutive CLL patients was analyzed using the Ion Torrent Personal Genome Machine, a NGS platform that uses semiconductor sequencing technology. Karyotype analysis was performed using efficient mitogens. Results Mutations were detected in 42.0 % of cases, with 42.8 % of mutated patients presenting 2 or more mutations. The presence of mutations by NGS was associated with unmutated IGHV gene (p = 0.009), CD38 positivity (p = 0.010), risk stratification by fluorescence in situ hybridization (FISH) (p < 0.001), and the complex karyotype (p = 0.003). A high risk as assessed by FISH analysis was associated with mutations affecting TP53 (p = 0.012), BIRC3 (p = 0.003), and FBXW7 (p = 0.003), while the complex karyotype was significantly associated with TP53, ATM, and MYD88 mutations (p = 0.003, 0.018, and 0.001, respectively). By multivariate analysis, the multi-hit profile (≥2 mutations by NGS) was independently associated with a shorter time to first treatment (p = 0.004), along with TP53 disruption (p = 0.040), IGHV unmutated status (p < 0.001), and advanced stage (p < 0.001). Advanced stage (p = 0.010), TP53 disruption (p < 0.001), IGHV unmutated status (p = 0.020), and the complex karyotype (p = 0.007) were independently associated with a shorter overall survival. Conclusions At diagnosis, an extensive biologic characterization including

  13. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...
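
    The AIDA interfaces themselves are defined in Java and C++; purely to illustrate the decoupling principle, here is a Python sketch with invented interface names (not the actual AIDA API) in which the plotter depends only on the abstract histogram interface:

```python
from abc import ABC, abstractmethod

# Invented interface names, shown only to illustrate component decoupling.

class IHistogram(ABC):
    @abstractmethod
    def fill(self, value: float) -> None: ...
    @abstractmethod
    def bins(self) -> list[int]: ...

class IPlotter(ABC):
    @abstractmethod
    def plot(self, hist: IHistogram) -> None: ...

class ListHistogram(IHistogram):
    def __init__(self, lo: float, hi: float, n: int):
        self.lo, self.hi, self._bins = lo, hi, [0] * n
    def fill(self, value: float) -> None:
        i = int((value - self.lo) / (self.hi - self.lo) * len(self._bins))
        if 0 <= i < len(self._bins):
            self._bins[i] += 1
    def bins(self) -> list[int]:
        return list(self._bins)

class TextPlotter(IPlotter):
    def plot(self, hist: IHistogram) -> None:  # depends only on IHistogram
        for count in hist.bins():
            print("#" * count)

h = ListHistogram(0.0, 1.0, 5)
for v in (0.1, 0.15, 0.5, 0.9):
    h.fill(v)
TextPlotter().plot(h)  # either side can be swapped out independently
```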

  14. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  15. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which PET is an estimate. The FAT-PET MRA is a meta regression analysis...
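
    A minimal sketch of the FAT-PET idea under the usual specification (each estimate regressed on its standard error with precision-squared weights, so the slope tests funnel asymmetry and the intercept is the PET meta average); all numbers are invented:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical effect estimates and their standard errors (illustrative only).
b = np.array([0.12, 0.30, 0.05, 0.22, 0.40, 0.08])
se = np.array([0.05, 0.15, 0.03, 0.10, 0.25, 0.04])

# FAT-PET meta-regression: b_i = beta0 + beta1 * se_i + e_i,
# estimated by WLS with weights 1/se^2.
X = sm.add_constant(se)
res = sm.WLS(b, X, weights=1.0 / se**2).fit()

# FAT: test of beta1 = 0 (funnel asymmetry / publication bias);
# PET: beta0 is the precision-corrected meta average.
print("beta0, beta1:", res.params)
print("p-values:", res.pvalues)
```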

  16. Multi-tool design and analysis of an automotive HUD

    Science.gov (United States)

    Irving, Bruce; Hasenauer, David; Mulder, Steve

    2016-10-01

    Design and analysis of an optical system is often a multidisciplinary task, and can involve the use of specialized software packages for imaging, mechanics, and illumination. This paper will present a case study on the design and analysis of a basic heads-up display (HUD) for automotive use. The emphasis will be on the special requirements of a HUD visual system and on the tools and techniques needed to accomplish the design. The first section of this paper will present an overview of the imaging design using commercially available imaging design software. Topics addressed in this section include modeling the windshield, visualizing the imaging performance, using constraints and freeform surfaces to improve the system, and meeting specific visual performance specifications with design/analysis methods. The second section will address the use of a CAD program to design a basic mechanical structure to support and protect the optics. This section will also discuss some of the issues and limitations involved in translating data between a CAD program and a lens design or illumination program. Typical issues that arise include the precision of optical surface prescriptions, surface and material properties, and the management of large data files. In the final section, the combined optical and mechanical package will be considered, using an illumination design program for stray light analysis. The stray light analysis will be directed primarily toward finding, visualizing, and quantifying unexpected ray paths. Techniques for sorting optical ray paths by path length, power, and elements or materials encountered will be discussed, along with methods for estimating the impact of stray light on the optical system performance.

  17. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, Jeremy [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Pesaran, Ahmad [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim,* and DRIVE.* Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  18. Analysis of search-extension method for finding multiple solutions of nonlinear problem

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    For numerical computations of multiple solutions of the nonlinear elliptic problem Δu + f(u) = 0 in Ω, u = 0 on Γ, a search-extension method (SEM) was proposed and systematically studied by the authors. This paper completes its theoretical analysis. It is assumed that the nonlinearity is non-convex and its solution is isolated; under some conditions the corresponding linearized problem then has a unique solution. By use of the compactness of the solution family and the contradiction argument, under general conditions, the high-order regularity of the solution u ∈ H^{1+α}, α > 0, is proved. Assuming that some initial value searched by suitably many eigenbases has already fallen into the neighborhood of the isolated solution, the optimal error estimates of its nonlinear finite element approximation are shown by the duality argument and continuation method.

  19. IQM: an extensible and portable open source application for image and signal analysis in Java.

    Science.gov (United States)

    Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut

    2015-01-01

    Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, neither variant can cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along with its most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM's image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and aims at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis.

  20. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behavior of 9528 online gamblers who voluntarily used an RG tool was analysed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that overall the functions of the tool had a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a higher extent and were found to have a greater risk of excessive gambling than the other classes.
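
    Latent class analysis proper works on categorical indicators; as a loose, illustrative stand-in, the Python sketch below clusters three usage variables of the kind named above with a finite mixture model. The simulated data and the five-component choice mirror the study's setup only superficially:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Toy usage data: visits, self-tests, advice items used (illustrative only).
X = np.column_stack([
    rng.poisson(3, 500), rng.poisson(1, 500), rng.poisson(0.5, 500)
]).astype(float)

# A finite mixture model as a rough stand-in for latent class analysis:
# each fitted component plays the role of one user class.
gm = GaussianMixture(n_components=5, random_state=0).fit(X)
classes = gm.predict(X)
print("class sizes:", np.bincount(classes, minlength=5))
```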

  1. High resolution analysis of the human transcriptome: detection of extensive alternative splicing independent of transcriptional activity

    Directory of Open Access Journals (Sweden)

    Rouet Fabien

    2009-10-01

    Full Text Available Abstract Background Commercially available microarrays have been used in many settings to generate expression profiles for a variety of applications, including target selection for disease detection, classification, profiling for pharmacogenomic response to therapeutics, and potential disease staging. However, many commercially available microarray platforms fail to capture transcript diversity produced by alternative splicing, a major mechanism for driving proteomic diversity through transcript heterogeneity. Results The human Genome-Wide SpliceArray™ (GWSA), a novel microarray platform, utilizes an existing probe design concept to monitor such transcript diversity on a genome scale. The human GWSA allows the detection of alternatively spliced events within the human genome through the use of exon body and exon junction probes to provide a direct measure of each transcript, through simple calculations derived from expression data. This report focuses on the performance and validation of the array when measured against standards recently published by the Microarray Quality Control (MAQC) Project. The array was shown to be highly quantitative, and displayed greater than 85% correlation with the HG-U133 Plus 2.0 array at the gene level while providing more extensive coverage of each gene. Almost 60% of splice events among genes demonstrating differential expression of greater than 3 fold also contained extensive splicing alterations. Importantly, almost 10% of splice events within the gene set displaying constant overall expression values had evidence of transcript diversity. Two examples illustrate the types of events identified: LIM domain 7 showed no differential expression at the gene level, but demonstrated deregulation of an exon skip event, while erythrocyte membrane protein band 4.1-like 3 was differentially expressed and also displayed deregulation of a skipped exon isoform. Conclusion Significant changes were detected independent of

  2. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 90's to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1] but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but appears limited to clusters of four spacecraft. Moreover, the barycentric approach makes it possible to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. Weights given to spacecraft allow minimizing the influence of a spacecraft if its location or the quality of its data is not appropriate, or simply extracting subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
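
    For the four-spacecraft case, the barycentric estimators reduce to the classical reciprocal vectors of the tetrahedron. A minimal numpy sketch of a gradient estimate built from them (an invented configuration, following the standard tetrahedron formula cited as [2] above) might look like this:

```python
import numpy as np

def reciprocal_vectors(r):
    """Reciprocal vectors k_a of a 4-spacecraft tetrahedron.

    r: (4, 3) array of spacecraft positions.
    k_a = (r_c - r_b) x (r_d - r_b) / ((r_a - r_b) . ((r_c - r_b) x (r_d - r_b)))
    """
    k = np.zeros((4, 3))
    for a in range(4):
        b, c, d = [i for i in range(4) if i != a]
        cross = np.cross(r[c] - r[b], r[d] - r[b])
        k[a] = cross / np.dot(r[a] - r[b], cross)
    return k

# Toy tetrahedron and a linear scalar field f(x, y, z) = 2x + 3y - z
# sampled at each spacecraft; the estimator is exact for linear fields.
r = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
f = 2 * r[:, 0] + 3 * r[:, 1] - r[:, 2]
grad = (reciprocal_vectors(r) * f[:, None]).sum(axis=0)
print(grad)  # -> [ 2.  3. -1.]
```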

  3. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users (scientists, mission designers or program managers); and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the General Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad hoc constellations, and its cost model represents an aggregate model consisting of

  4. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperature and pressure in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes of any initial geometric out-of-straightness, thus the modeling of the as-laid state of the pipeline is an important part of the design process. Recently some dedicated finite element programs have been created, making modeling of the offshore environment more convenient than has been the case with general purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range, and in the conclusions further comparative analyses with sensitivity cases are recommended. (author)

  5. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  6. Tool for Sizing Analysis of the Advanced Life Support System

    Science.gov (United States)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated either dynamically or steadily in nature. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.

  7. Bioanalyzer: An Efficient Tool for Sequence Retrieval, Analysis and Manipulation

    Directory of Open Access Journals (Sweden)

    Hassan Tariq

    2010-12-01

    Full Text Available Bioanalyzer provides a combination of tools that have never been assembled together. The software offers a list of tools that can be important for different researchers. The aim in developing this kind of software is to provide a unique set of tools on one platform in a more efficient and better way than the software or web tools currently available. It is a stand-alone application, so it can save the time and effort needed to locate individual tools on the net. A flexible design has made it easy to expand in the future. We will make it publicly available soon.

  8. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models...... the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material...

  9. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  10. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    Science.gov (United States)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally. About 30 peer-reviewed publications have made use of ZMAP. ZMAP code is open source, written in the commercial software language Matlab by the Mathworks, a widely used software in the natural sciences. ZMAP was first published in 1994, and has continued to grow over the past 7 years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
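
    As a small example of one quantity ZMAP maps, the b-value of the Gutenberg-Richter law, here is a Python sketch of the Aki (1965) maximum-likelihood estimate on synthetic magnitudes; the magnitude-binning correction is omitted for brevity:

```python
import numpy as np

# Synthetic magnitudes above a completeness threshold Mc (illustrative only).
# Under Gutenberg-Richter, magnitudes above Mc are exponential with
# mean Mc + log10(e) / b.
rng = np.random.default_rng(0)
Mc, b_true = 2.0, 1.0
mags = Mc + rng.exponential(scale=np.log10(np.e) / b_true, size=2000)

# Aki (1965) maximum-likelihood b-value estimate.
b_hat = np.log10(np.e) / (mags.mean() - Mc)
print(f"estimated b = {b_hat:.2f}")  # should be close to 1.0
```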

  11. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under a LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.

  12. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and the sitting (of the two held per year) at which that examination was passed. Similarly, another group of 77 students was evaluated independently of the former group. These students were those who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  13. Abstract Interfaces for Data Analysis —Component Architecture for Data Analysis Tools

    Institute of Scientific and Technical Information of China (English)

    G.Barrand; P.Binko; 等

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  14. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
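
    A minimal Python sketch of the kind of measurement such a tool automates: drive a memoryless nonlinearity (a tanh waveshaper standing in for a distorting device) with a sine wave and read harmonic levels off the spectrum to obtain a total harmonic distortion figure. All parameters are illustrative:

```python
import numpy as np

fs, f0, dur = 48_000, 1_000, 1.0          # sample rate, test tone, duration
t = np.arange(int(fs * dur)) / fs
x = np.sin(2 * np.pi * f0 * t)

y = np.tanh(3.0 * x)  # memoryless nonlinearity as a toy "distortion effect"

# Magnitude spectrum; with f0 = 1 kHz and a 1 s window (1 Hz bins),
# the harmonics fall exactly on bins k * 1000.
Y = np.abs(np.fft.rfft(y)) / len(y)
harmonics = [Y[k * f0] for k in range(1, 6)]
thd = np.sqrt(sum(h**2 for h in harmonics[1:])) / harmonics[0]
print(f"THD ≈ {100 * thd:.1f}%")
```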

  15. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  16. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  17. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are really scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct the actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community-Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina di Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  18. Analysis of Facial Injuries Caused by Power Tools.

    Science.gov (United States)

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of the use of power tools, its economic impact, and the characteristics of the hand injuries caused by power saws have been described. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data on facial injuries caused by power saws, gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of the facial injuries caused by power saws. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurements and functional disabilities. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  19. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as for any other description technique — very dependent on the existence of adequate computer tools, which may assist the user in coping with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets.... The paper describes some of the requirements which these tools must fulfil in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools....

  20. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.
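
    A single lumped thermal node integrated with explicit Euler gives the flavor of the quasi-steady-state and transient component models such blocksets compose; all parameter values below are invented:

```python
# Lumped single-node transient model of the kind such blocksets compose:
# m_c * dT/dt = Q_load - UA * (T - T_amb). All values are invented.
m_c = 5_000.0          # thermal capacitance, J/K
UA = 25.0              # effective heat-exchanger conductance, W/K
T_amb = 300.0          # ambient temperature, K
Q_load = 2_000.0       # constant heat load, W

dt, T = 1.0, 300.0     # time step (s) and initial temperature (K)
for _ in range(3600):  # one simulated hour, explicit Euler
    T += dt * (Q_load - UA * (T - T_amb)) / m_c
print(f"T after 1 h = {T:.1f} K (steady state = {T_amb + Q_load / UA:.1f} K)")
```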

  1. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run on UNIX-like operating systems. Graphical output is often not intended, or is supposed to be performed with other command-line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  2. Project Final Report: Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James

    2014-01-06

    In this project we created a community tool infrastructure for program development tools targeting Petascale class machines and beyond. This includes tools for performance analysis, debugging, and correctness tools, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool building components. We started with the basic elements necessary across all tools in such an infrastructure followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of community tool infrastructure provides an enhanced version of Open|SpeedShop, which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhances capabilities and reusability of project partner components as specified in the original project proposal. The overall project team’s work over the project funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contribute to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  3. A Current Review of the Meniscus Imaging: Proposition of a Useful Tool for Its Radiologic Analysis

    Directory of Open Access Journals (Sweden)

    Nicolas Lefevre

    2016-01-01

    Full Text Available The main objective of this review was to present a synthesis of the current literature in order to provide a useful tool for clinicians in the radiologic analysis of the meniscus. All anatomical descriptions were clearly illustrated by MRI, arthroscopy, and/or drawings. The value of standard radiography is extremely limited for the assessment of meniscal injuries but may be indicated to obtain a differential diagnosis such as osteoarthritis. Ultrasound is rarely used as a diagnostic tool for meniscal pathologies and its accuracy is operator-dependent. CT arthrography with multiplanar reconstructions can detect meniscus tears that are not visible on MRI. This technique is also useful in case of MRI contraindications, in the postoperative assessment of meniscal sutures, and for evaluating the condition of the cartilage covering the articular surfaces. MRI is the most accurate and least invasive method for diagnosing meniscal lesions. MRI allows confirmation and characterization of the meniscal lesion (the type, the extension, its association with a cyst, the meniscal extrusion) and assessment of cartilage and subchondral bone. New 3D MRI with isotropic resolution allows the creation of multiplanar reformatted images, so that reconstructions in other spatial planes can be obtained from an acquisition in a single sectional plane. 3D MRI should further improve the diagnosis of meniscal tears.

  4. Tools of the Future: How Decision Tree Analysis Will Impact Mission Planning

    Science.gov (United States)

    Otterstatter, Matthew R.

    2005-01-01

    The universe is infinitely complex; however, the human mind has a finite capacity. The multitude of possible variables, metrics, and procedures in mission planning are far too many to address exhaustively. This is unfortunate because, in general, considering more possibilities leads to more accurate and more powerful results. To compensate, we can get more insightful results by employing our greatest tool, the computer. The power of the computer will be utilized through a technology that considers every possibility, decision tree analysis. Although decision trees have been used in many other fields, this is innovative for space mission planning. Because this is a new strategy, no existing software is able to completely accommodate all of the requirements. This was determined through extensive research and testing of current technologies. It was necessary to create original software, for which a short-term model was finished this summer. The model was built into Microsoft Excel to take advantage of the familiar graphical interface for user input, computation, and viewing output. Macros were written to automate the process of tree construction, optimization, and presentation. The results are useful and promising. If this tool is successfully implemented in mission planning, our reliance on old-fashioned heuristics, an error-prone shortcut for handling complexity, will be reduced. The computer algorithms involved in decision trees will revolutionize mission planning. The planning will be faster and smarter, leading to optimized missions with the potential for more valuable data.
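
    A minimal sketch of the expected-value rollback at the heart of such decision-tree analysis (chance nodes average over outcomes, decision nodes take the best branch); the toy mission choice and its numbers are invented:

```python
# Minimal expected-value rollback over a decision tree (illustrative only).

def rollback(node):
    kind = node["kind"]
    if kind == "leaf":
        return node["value"]
    if kind == "chance":
        # Chance node: probability-weighted average over outcomes.
        return sum(p * rollback(child) for p, child in node["branches"])
    # Decision node: choose the branch with the highest expected value.
    return max(rollback(child) for child in node["branches"])

# Toy mission choice: a risky instrument (science value 100, works 70% of
# the time) versus a proven one (guaranteed value 60).
tree = {"kind": "decision", "branches": [
    {"kind": "chance", "branches": [(0.7, {"kind": "leaf", "value": 100}),
                                    (0.3, {"kind": "leaf", "value": 0})]},
    {"kind": "leaf", "value": 60},
]}
print(rollback(tree))  # 70.0 -> the risky instrument wins on expected value
```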

  5. Clinical decision support tools: analysis of online drug information databases

    Directory of Open Access Journals (Sweden)

    Seamon Matthew J

    2007-03-01

    Full Text Available Abstract Background Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information by systematically comparing the most commonly used online drug information databases. Methods Five commercially available and two freely available online drug information databases were evaluated according to scope (presence or absence of an answer), completeness (the comprehensiveness of the answers), and ease of use. Additionally, a composite score integrating all three criteria was utilized. Fifteen weighted categories comprised of 158 questions were used to conduct the analysis. Descriptive statistics and Chi-square were used to summarize the evaluation components and make comparisons between databases. Scheffe's multiple comparison procedure was used to determine statistically different scope and completeness scores. The composite score was subjected to sensitivity analysis to investigate the effect of the choice of percentages for scope and completeness. Results The rankings for the databases from highest to lowest, based on composite scores, were Clinical Pharmacology, Micromedex, Lexi-Comp Online, Facts & Comparisons 4.0, Epocrates Online Premium, RxList.com, and Epocrates Online Free. Differences in scope produced three statistical groupings, with Group 1 (best performers) being: Clinical Pharmacology, Micromedex, Facts & Comparisons 4.0, Lexi-Comp Online; Group 2: Epocrates Premium and RxList.com; and Group 3: Epocrates Free (p Conclusion Online drug information databases, which belong to clinical decision support, vary in their ability to answer questions across a range of categories.

  6. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    to the collision scenario where a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large-scale FEA, and fairly good agreement is achieved. The applicability......, limitation and future enhancement of the present tool are discussed in detail....

  7. An Extensive Analysis of Y-Chromosomal Microsatellite Haplotypes in Globally Dispersed Human Populations

    Science.gov (United States)

    Kayser, Manfred; Krawczak, Michael; Excoffier, Laurent; Dieltjes, Patrick; Corach, Daniel; Pascali, Vincente; Gehrig, Christian; Bernini, Luigi F.; Jespersen, Jørgen; Bakker, Egbert; Roewer, Lutz; de Knijff, Peter

    2001-01-01

    The genetic variance at seven Y-chromosomal microsatellite loci (or short tandem repeats [STRs]) was studied among 986 male individuals from 20 globally dispersed human populations. A total of 598 different haplotypes were observed, of which 437 (73.1%) were each found in a single male only. Population-specific haplotype-diversity values were .86–.99. Analyses of haplotype diversity and population-specific haplotypes revealed marked population-structure differences between more-isolated indigenous populations (e.g., Central African Pygmies or Greenland Inuit) and more-admixed populations (e.g., Europeans or Surinamese). Furthermore, male individuals from isolated indigenous populations shared haplotypes mainly with male individuals from their own population. By analysis of molecular variance, we found that 76.8% of the total genetic variance present among these male individuals could be attributed to genetic differences between male individuals who were members of the same population. Haplotype sharing between populations, ΦST statistics, and phylogenetic analysis identified close genetic affinities among European populations and among New Guinean populations. Our data illustrate that Y-chromosomal STR haplotypes are an ideal tool for the study of the genetic affinities between groups of male subjects and for detection of population structure. PMID:11254455
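    Haplotype-diversity values such as the .86-.99 range quoted above are conventionally computed with Nei's unbiased gene diversity; a small sketch with invented seven-locus Y-STR haplotypes follows.

```python
# Sketch of Nei's unbiased haplotype (gene) diversity,
# h = n/(n-1) * (1 - sum(p_i^2)), the statistic behind population-specific
# values such as those quoted above. Haplotype strings are invented
# seven-locus Y-STR allele combinations.
from collections import Counter

def haplotype_diversity(haplotypes):
    n = len(haplotypes)
    freqs = [count / n for count in Counter(haplotypes).values()]
    return n / (n - 1) * (1 - sum(p * p for p in freqs))

sample = (["14-12-23-10-11-13-13"] * 3
          + ["15-12-24-10-11-13-13"] * 2
          + ["14-13-23-11-11-14-13"])
print(round(haplotype_diversity(sample), 3))  # 0.733 for this toy sample
```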

  8. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning; Mullins, Michael

    2006-01-01

    the two types of tools. The paper therefore looks at integration of the two types in a prototype for a tool which allows aesthetics evaluation, and at the same time gives the architect instant technical feedback on ideas already in the initial sketching phase. The aim of the research is to look...... possible approaches for working with digital tectonics by means of acoustics: The architects, the architect-engineer or hybrid practitioner and finally a prototype for a possible digital tectonic tool. For the third approach in the case study a prototype digital tectonic tool is tested on the design......The digital design tools used by architects and engineers today are very useful with respect to their specific fields of aesthetical or technical evaluation. It is not yet possible to fully use the potential of the computer in the design process, as there is no well functioning interplay between...

  9. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  10. An Analysis of the North Carolina Cooperative Extension Service's Role in Bridging the Digital Divide

    Science.gov (United States)

    Alston, Antoine J.; Hilton, Lashawn; English, Chastity Warren; Elbert, Chanda; Wakefield, Dexter

    2011-01-01

    The study reported here sought to determine the perception of North Carolina County Cooperative Extension directors in regard to the North Carolina Cooperative Extension Service's role in bridging the digital divide. It was perceived by respondents that variables such as income, education, gender, disability status, race/ethnicity, age, and…

  11. Virtual Focus Groups in Extension: A Useful Approach to Audience Analysis

    Science.gov (United States)

    Warner, Laura A.

    2014-01-01

    As change agents, Extension educators may begin their program planning by identifying the audience's perceived barriers and benefits to adopting some behavior that will benefit the community. Extension professionals and researchers have used in-person focus groups to understand an audience, and they can also administer them as…

  12. Three-dimensional finite element progressive failure analysis of composite laminates under axial extension

    Science.gov (United States)

    Reddy, Yeruva S.; Reddy, Junuthula N.

    1993-01-01

    A three-dimensional (3D) progressive failure algorithm is developed, where the layerwise laminate theory (LWLT) of Reddy is used for kinematic description. The finite element model based on the layerwise theory predicts both inplane and interlaminar stresses with the same accuracy as that of a conventional 3D finite element model and provides a convenient format for modeling the 3D stress fields in composite laminates. A parametric study is conducted to investigate the effect of out-of-plane material properties, 3D stiffness reduction methods, and boundary conditions on the failure loads and strains of a composite laminate under axial extension. The results indicate that different parameters have a different degree of influence on the failure loads and strains. The predictive ability of various phenomenological failure criteria is evaluated in the light of experimental results available in the literature, and the predictions of the LWLT are compared with those of the first-order shear deformation theory. It is concluded that a 3D stress analysis is necessary to predict accurately the failure behavior of composite laminates.

  13. New Power Quality Analysis Method Based on Chaos Synchronization and Extension Neural Network

    Directory of Open Access Journals (Sweden)

    Meng-Hui Wang

    2014-10-01

    Full Text Available A hybrid method comprising a chaos synchronization (CS-based detection scheme and an Extension Neural Network (ENN classification algorithm is proposed for power quality monitoring and analysis. The new method can detect minor changes in signals of the power systems. Likewise, prominent characteristics of system signal disturbance can be extracted by this technique. In the proposed approach, the CS-based detection method is used to extract three fundamental characteristics of the power system signal and an ENN-based clustering scheme is then applied to detect the state of the signal, i.e., normal, voltage sag, voltage swell, interruption or harmonics. The validity of the proposed method is demonstrated by means of simulations given the use of three different chaotic systems, namely Lorenz, New Lorenz and Sprott. The simulation results show that the proposed method achieves a high detection accuracy irrespective of the chaotic system used or the presence of noise. The proposed method not only achieves higher detection accuracy than existing methods, but also has low computational cost, an improved robustness toward noise, and improved scalability. As a result, it provides an ideal solution for the future development of hand-held power quality analyzers and real-time detection devices.
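    A rough sketch of the chaos-synchronization idea is given below: two identical Lorenz systems are driven by a reference signal and a monitored signal, and the synchronization error becomes nonzero once a disturbance (here a voltage sag) appears. The Lorenz constants and the additive coupling are textbook choices, not the paper's exact detector formulation.

```python
# Rough sketch of a chaos-synchronization (CS) detector: two identical Lorenz
# systems are driven by a reference and a monitored signal, and their
# synchronization error grows once a disturbance (a 40% voltage sag) appears.
import math

def lorenz_step(state, drive, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x) + drive      # monitored signal injected as a drive term
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def sync_error(signal_a, signal_b):
    sa, sb, err = (1.0, 1.0, 1.0), (1.0, 1.0, 1.0), []
    for a, b in zip(signal_a, signal_b):
        sa, sb = lorenz_step(sa, a), lorenz_step(sb, b)
        err.append(abs(sa[0] - sb[0]))
    return err

t = [k * 0.001 for k in range(2000)]                   # 2 s at 1 kHz
clean = [math.sin(2 * math.pi * 50 * tk) for tk in t]  # 50 Hz reference
sag = [v * (0.6 if 0.8 < tk < 1.2 else 1.0) for v, tk in zip(clean, t)]
err = sync_error(clean, sag)
print(max(err[:800]), max(err[800:1200]))  # zero before the sag, nonzero after
```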

  14. Static progressive versus three-point elbow extension splinting: a mathematical analysis.

    Science.gov (United States)

    Chinchalkar, Shrikant J; Pearce, Joshua; Athwal, George S

    2009-01-01

    Elbow joint contractures are often treated by using static progressive, dynamic, turnbuckle, or serial static splinting. These splint designs are effective in regaining functional elbow range of motion due to the high forces applied to the contracted tissues; however, regaining terminal elbow extension remains a challenge. Static progressive splints are commonly used to initiate treatment; however, they are considered less effective in regaining terminal extension. Recently, the concept of converting a static progressive splint into a three-point static progressive splint (TPSPS) to regain terminal extension has been introduced. This paper mathematically analyzes the compressive and rotational forces in static progressive splints and TPSPSs. Our hypothesis was that three-point static progressive splinting is superior to the standard static progressive elbow extension splint in applying rotational forces to the elbow at terminal extension.

  15. Life-cycle cost-benefit analysis of extensive vegetated roof systems.

    Science.gov (United States)

    Carter, Timothy; Keeler, Andrew

    2008-05-01

    The built environment has been a significant cause of environmental degradation in the previously undeveloped landscape. As public and private interest in restoring the environmental integrity of urban areas continues to increase, new construction practices are being developed that explicitly value beneficial environmental characteristics. The use of vegetation on a rooftop--commonly called a green roof--as an alternative to traditional roofing materials is an increasingly utilized example of such practices. The vegetation and growing media perform a number of functions that improve environmental performance, including: absorption of rainfall, reduction of roof temperatures, improvement in ambient air quality, and provision of urban habitat. A better accounting of the green roof's total costs and benefits to society and to the private sector will aid in the design of policy instruments and educational materials that affect individual decisions about green roof construction. This study uses data collected from an experimental green roof plot to develop a benefit-cost analysis (BCA) for the life cycle of extensive (thin-layer) green roof systems in an urban watershed. The results from this analysis are compared with a traditional roofing scenario. The net present value (NPV) of this type of green roof is currently 10% to 14% higher than that of its conventional counterpart. A reduction of 20% in green roof construction cost would make the social NPV of the practice less than the traditional roof NPV. Considering the positive social benefits and relatively novel nature of the practice, incentives encouraging the use of this practice in highly urbanized watersheds are strongly recommended.
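    A toy net-present-value comparison in the spirit of this BCA is sketched below. All cash flows, the discount rate, and the horizon are invented placeholders; the paper's actual accounting, which also monetizes social benefits, is far more detailed.

```python
# Toy net-present-value comparison of a conventional roof and an extensive
# green roof over a shared life cycle. All numbers are invented placeholders.

def npv(cashflows, rate):
    """cashflows[t] is the cost in year t (t = 0 is construction)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

years, rate = 40, 0.04
conventional = [120.0] + [1.0] * years   # $/m^2: install, then annual upkeep
conventional[20] += 100.0                # mid-life membrane replacement
green = [160.0] + [1.5] * years          # higher install, longer-lived membrane

print(round(npv(conventional, rate), 1), round(npv(green, rate), 1))
```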

  16. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    Science.gov (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  17. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  18. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  19. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks, such as, refactoring or code navigation, have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the langua...... can provide for tools that are less powerful in theory, but more practical for use under real-world conditions. We also point out some opportunities for future work in both areas, motivated by our successes and difficulties with the two techniques....

  20. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  1. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  2. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    Science.gov (United States)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user-definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  3. Comparative Analysis of Apicoplast-Targeted Protein Extension Lengths in Apicomplexan Parasites.

    Science.gov (United States)

    Seliverstov, Alexandr V; Zverkov, Oleg A; Istomina, Svetlana N; Pirogov, Sergey A; Kitsis, Philip S

    2015-01-01

    In general, the mechanism of protein translocation through the apicoplast membrane requires a specific extension of a functionally important region of the apicoplast-targeted proteins. The corresponding signal peptides were detected in many apicomplexans but not in the majority of apicoplast-targeted proteins in Toxoplasma gondii. In T. gondii signal peptides are either much diverged or their extension region is processed, which in either case makes the situation different from other studied apicomplexans. We propose a statistical method to compare extensions of the functionally important regions of apicoplast-targeted proteins. More specifically, we provide a comparison of extension lengths of orthologous apicoplast-targeted proteins in apicomplexan parasites. We focus on results obtained for the model species T. gondii, Neospora caninum, and Plasmodium falciparum. With our method, cross-species comparisons demonstrate that, on average, apicoplast-targeted protein extensions in T. gondii are 1.5-fold longer than in N. caninum and 2-fold longer than in P. falciparum. Extensions in P. falciparum less than 87 residues in size are longer than the corresponding extensions in N. caninum and, conversely, are shorter if they exceed 88 residues.

  4. A Geographic and Functional Network Flow Analysis Tool

    Science.gov (United States)

    2014-06-01

    [Abstract not recovered; only table-of-contents and figure-list fragments survive, indicating sections on information systems tools, network models, and a case study of a fiber-optic communications backbone in a notional country called Dystopia (e.g., Figure 8, "A simple fiber-optic backbone network for Dystopia").]

  5. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  6. Non-contact measurement and analysis of machine tool spindles

    OpenAIRE

    Clough, David A; Fletcher, Simon; Longstaff, Andrew P.

    2010-01-01

    Increasing demand on the manufacturing industry to produce tighter tolerance parts means it is necessary to gain a greater understanding of machine tool capabilities and error sources. A significant source of machine tool errors is down to spindle inaccuracies and performance, leading to part scrapping. Catastrophic spindle failure brings production to a standstill until a new spindle can be procured and installed, resulting in lost production time. This project aims to assess the effec...

  7. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    Science.gov (United States)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently providers don't have ready access to or knowledge of reliable, trusted data sets, nor sound, scientifically accepted analysis techniques such that they can provide a rapid response to queries they receive. In order to support National Weather Service (NWS) local office forecasters with information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular, through use of machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center

  8. Confidence analysis of standard deviational ellipse and its extension into higher dimensional euclidean space.

    Science.gov (United States)

    Wang, Bin; Shi, Wenzhong; Miao, Zelang

    2015-01-01

    Standard deviational ellipse (SDE) has long served as a versatile GIS tool for delineating the geographic distribution of concerned features. This paper firstly summarizes two existing models of calculating SDE, and then proposes a novel approach to constructing the same SDE based on spectral decomposition of the sample covariance, by which the SDE concept is naturally generalized into higher dimensional Euclidean space, named standard deviational hyper-ellipsoid (SDHE). Then, rigorous recursion formulas are derived for calculating the confidence levels of scaled SDHE with arbitrary magnification ratios in any dimensional space. Besides, an iterative algorithm based on the inexact Newton method is also proposed for solving the corresponding magnification ratio of a scaled SDHE when the confidence probability and space dimensionality are pre-specified. These results provide an efficient manner to supersede the traditional table lookup of tabulated chi-square distribution. Finally, synthetic data is employed to generate the 1-3 multiple SDEs and SDHEs. Exploratory analyses by means of SDEs and SDHEs are also conducted for measuring the spread concentrations of Hong Kong's H1N1 in 2009.

  9. Confidence analysis of standard deviational ellipse and its extension into higher dimensional euclidean space.

    Directory of Open Access Journals (Sweden)

    Bin Wang

    Full Text Available Standard deviational ellipse (SDE) has long served as a versatile GIS tool for delineating the geographic distribution of concerned features. This paper firstly summarizes two existing models of calculating SDE, and then proposes a novel approach to constructing the same SDE based on spectral decomposition of the sample covariance, by which the SDE concept is naturally generalized into higher dimensional Euclidean space, named standard deviational hyper-ellipsoid (SDHE). Then, rigorous recursion formulas are derived for calculating the confidence levels of scaled SDHE with arbitrary magnification ratios in any dimensional space. Besides, an iterative algorithm based on the inexact Newton method is also proposed for solving the corresponding magnification ratio of a scaled SDHE when the confidence probability and space dimensionality are pre-specified. These results provide an efficient manner to supersede the traditional table lookup of tabulated chi-square distribution. Finally, synthetic data is employed to generate the 1-3 multiple SDEs and SDHEs. Exploratory analyses by means of SDEs and SDHEs are also conducted for measuring the spread concentrations of Hong Kong's H1N1 in 2009.
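    For records 8 and 9 above, the spectral-decomposition construction is straightforward to sketch: the eigenvectors of the sample covariance give the ellipse axis directions, and the square roots of the eigenvalues give the semi-axis lengths; the identical computation on a d-by-d covariance yields the SDHE. The point cloud below is synthetic.

```python
# Sketch of the spectral-decomposition construction of an SDE: eigenvectors
# of the sample covariance give axis directions, square roots of eigenvalues
# give semi-axis lengths. Points are synthetic.
import numpy as np

def sde(points, scale=1.0):
    """points: (n, d) array; returns (center, semi_axes, axis_directions)."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)          # sample covariance (d x d)
    eigvals, eigvecs = np.linalg.eigh(cov)   # spectral decomposition
    return center, scale * np.sqrt(eigvals), eigvecs

rng = np.random.default_rng(0)
pts = rng.multivariate_normal([2.0, 5.0], [[4.0, 1.5], [1.5, 1.0]], size=500)
center, axes, dirs = sde(pts)
print(center.round(2), axes.round(2))  # center near (2, 5); axes from eigvals
```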

  10. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit drives the most frequently used and most important parts, such as the automatic tool changer. The vibration detection system includes the development of hardware and software, such as a vibration meter, a signal acquisition card, a data processing platform, and a machine control program. Meanwhile, because of differences between mechanical configurations and the desired characteristics, it is difficult to assemble a vibration detection system directly from commercially available kits. For this reason, the system was selected as an item for self-development research, along with the exploration of a parametric study significant enough to represent the machine characteristics and states. The functional parts of the system were developed in parallel. Finally, the conditions and parameters generated from the machine states and characteristics were entered into the developed system to verify its feasibility.

  11. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  12. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
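    A schematic of the standard multispecimen ratio is sketched below, taking the Dekkers and Böhnel (2006) ratio in its commonly written form Q_DB = (m1 - m0)/m0 and estimating the paleointensity from the zero crossing of the linear regression against the applied field. The remanence values are invented, and MSP-Tool's corrected ratios, reliability criteria, and bootstrap statistics are not reproduced.

```python
# Schematic of the multispecimen ratio Q_DB = (m1 - m0) / m0, where m0 is the
# NRM and m1 the remanence after the in-field heating step; the paleointensity
# is read off where the regression against the applied field crosses Q = 0.
# All remanence values are invented.
import numpy as np

fields = np.array([10.0, 20.0, 30.0, 40.0, 50.0])  # applied lab fields (uT)
m0 = np.array([9.8, 10.1, 10.0, 9.9, 10.2])        # NRM per specimen
m1 = np.array([8.2, 9.1, 10.1, 10.9, 12.0])        # after in-field heating

q_db = (m1 - m0) / m0
slope, intercept = np.polyfit(fields, q_db, 1)     # least-squares line
print(round(-intercept / slope, 1), "uT")          # zero crossing ~ paleofield
```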

  13. Mass balance re-analysis of Findelengletscher, Switzerland; benefits of extensive snow accumulation measurements

    Directory of Open Access Journals (Sweden)

    Leo eSold

    2016-02-01

    Full Text Available A re-analysis is presented here of a 10-year mass balance series at Findelengletscher, a temperate mountain glacier in Switzerland. Calculating glacier-wide mass balance from the set of glaciological point balance observations using conventional approaches, such as the profile or contour method, resulted in significant deviations from the reference value given by the geodetic mass change over a five-year period. This is attributed to the sparsity of observations at high elevations and to the inability of the evaluation schemes to adequately estimate accumulation in unmeasured areas. However, measurements of winter mass balance were available for large parts of the study period from snow probings and density pits. Complementary surveys by helicopter-borne ground-penetrating radar (GPR) were conducted in three consecutive years. The complete set of seasonal observations was assimilated using a distributed mass balance model. This model-based extrapolation revealed a substantial mass loss at Findelengletscher of -0.43 m w.e. a^-1 between 2004 and 2014, while the loss was less pronounced for its former tributary, Adlergletscher (-0.30 m w.e. a^-1). For both glaciers, the resulting time series were within the uncertainty bounds of the geodetic mass change. We show that the model benefited strongly from the ability to integrate seasonal observations. If no winter mass balance measurements were available and snow cover was represented by a linear precipitation gradient, the geodetic mass balance was not matched. If winter balance measurements by snow probings and snow density pits were taken into account, the model performance was substantially improved but still showed a significant bias relative to the geodetic mass change. Thus the excellent agreement of the model-based extrapolation with the geodetic mass change was owed to an adequate representation of winter accumulation distribution by means of extensive GPR measurements.

  14. Expressed sequence tags as a tool for phylogenetic analysis of placental mammal evolution.

    Directory of Open Access Journals (Sweden)

    Morgan Kullberg

    Full Text Available BACKGROUND: We investigate the usefulness of expressed sequence tags, ESTs, for establishing divergences within the tree of placental mammals. This is done on the example of the established relationships among primates (human), lagomorphs (rabbit), rodents (rat and mouse), artiodactyls (cow), carnivorans (dog) and proboscideans (elephant). METHODOLOGY/PRINCIPAL FINDINGS: We have produced 2000 ESTs (1.2 megabases) from a marsupial mouse and characterized the data for their use in phylogenetic analysis. The sequences were used to identify putative orthologous sequences from whole genome projects. Although most ESTs stem from single sequence reads, the frequency of potential sequencing errors was found to be lower than allelic variation. Most of the sequences represented slowly evolving housekeeping-type genes, with an average amino acid distance of 6.6% between human and mouse. Positive Darwinian selection was identified at only a few single sites. Phylogenetic analyses of the EST data yielded trees that were consistent with those established from whole genome projects. CONCLUSIONS: The general quality of EST sequences and the general absence of positive selection in these sequences make ESTs an attractive tool for phylogenetic analysis. The EST approach allows, at reasonable costs, a fast extension of data sampling from species outside the genome projects.

  15. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.

  16. Multivariate analysis of ultrasound-recorded dorsal strain sequences: Investigation of dynamic neck extensions in women with chronic whiplash associated disorders

    Science.gov (United States)

    Peolsson, Anneli; Peterson, Gunnel; Trygg, Johan; Nilsson, David

    2016-08-01

    Whiplash Associated Disorders (WAD) refers to the multifaceted and chronic burden that is common after a whiplash injury. Tools to assist in the diagnosis of WAD and an increased understanding of neck muscle behaviour are needed. We examined the multilayer dorsal neck muscle behaviour in nine women with chronic WAD versus healthy controls during the entire sequence of a dynamic low-loaded neck extension exercise, which was recorded using real-time ultrasound movies with high frame rates. Principal component analysis and orthogonal partial least squares were used to analyse mechanical muscle strain (deformation in elongation and shortening). The WAD group showed more shortening during the neck extension phase in the trapezius muscle and during both the neck extension and the return to neutral phase in the multifidus muscle. For the first time, a novel non-invasive method is presented that is capable of detecting altered dorsal muscle strain in women with WAD during an entire exercise sequence. This method may be a breakthrough for the future diagnosis and treatment of WAD.
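    A minimal sketch of the dimensionality-reduction step is given below: each row holds one subject's strain-versus-time trace, and projection onto the first principal component summarizes the dominant deformation pattern. The synthetic "WAD" group simply shortens more than the controls; the study's orthogonal partial least squares analysis is not reproduced.

```python
# Minimal PCA sketch for strain-versus-time curves: each row is one subject's
# deformation trace, and projection onto the first principal component
# summarizes the dominant strain pattern. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 100)
template = np.sin(np.pi * t)   # one extension-and-return cycle
controls = np.array([1.0 * template + 0.05 * rng.standard_normal(100)
                     for _ in range(9)])
wad = np.array([1.6 * template + 0.05 * rng.standard_normal(100)
                for _ in range(9)])

X = np.vstack([controls, wad])
Xc = X - X.mean(axis=0)                         # center before PCA
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[0]                             # projection on PC1
print(scores[:9].mean().round(2), scores[9:].mean().round(2))  # groups separate
```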

  17. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper-division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  18. Road safety risk evaluation and target setting using data envelopment analysis and its extensions.

    Science.gov (United States)

    Shen, Yongjun; Hermans, Elke; Brijs, Tom; Wets, Geert; Vanhoof, Koen

    2012-09-01

    Currently, comparison between countries in terms of their road safety performance is widely conducted in order to better understand one's own safety situation and to learn from those best-performing countries by indicating practical targets and formulating action programmes. In this respect, crash data such as the number of road fatalities and casualties are mostly investigated. However, the absolute numbers are not directly comparable between countries. Therefore, the concept of risk, which is defined as the ratio of road safety outcomes and some measure of exposure (e.g., the population size, the number of registered vehicles, or distance travelled), is often used in the context of benchmarking. Nevertheless, these risk indicators are not consistent in most cases. In other words, countries may have different evaluation results or ranking positions using different exposure information. In this study, data envelopment analysis (DEA) as a performance measurement technique is investigated to provide an overall perspective on a country's road safety situation, and further assess whether the road safety outcomes registered in a country correspond to the numbers that can be expected based on the level of exposure. In doing so, three model extensions are considered, which are the DEA based road safety model (DEA-RS), the cross-efficiency method, and the categorical DEA model. Using the measures of exposure to risk as the model's input and the number of road fatalities as output, an overall road safety efficiency score is computed for the 27 European Union (EU) countries based on the DEA-RS model, and the ranking of countries in accordance with their cross-efficiency scores is evaluated. Furthermore, after applying clustering analysis to group countries with inherent similarity in their practices, the categorical DEA-RS model is adopted to identify best-performing and underperforming countries in each cluster, as well as the reference sets or benchmarks for those
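    The envelopment form of the input-oriented CCR model that DEA builds on can be sketched as a small linear program, as below. Exposure measures play the role of inputs; note that the DEA-RS model treats fatalities as an undesirable output with a dedicated formulation, which this generic sketch does not reproduce.

```python
# Input-oriented CCR DEA sketch (envelopment form): minimize theta such that
# a composite peer uses at most theta times unit j0's inputs while producing
# at least its outputs. Data are generic placeholders.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """X: (n, m) inputs, Y: (n, s) outputs; returns theta for unit j0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                 # variables: theta, lambda_1..n
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]     # sum(lambda*x) <= theta * x_j0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]       # sum(lambda*y) >= y_j0
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b_ub,
                  bounds=bounds, method="highs")
    return res.x[0]

X = np.array([[5.0, 3.0], [8.0, 1.0], [6.0, 6.0]])  # e.g. exposure measures
Y = np.array([[2.0], [3.0], [2.0]])                 # e.g. desirable outcomes
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
```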

  19. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint.

    Science.gov (United States)

    Du, Hong-Gen; Liao, Sheng-Hui; Jiang, Zhong; Huang, Huan-Ming; Ning, Xi-Tao; Jiang, Neng-Yi; Pei, Jian-Wei; Huang, Qin; Wei, Hui

    2016-05-01

    This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D FE model, which integrated anatomical structure from spine CT and MRI data. Graphic techniques were then utilized to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. Mechanical parameters collected from actual press-extension experiments were used to set the boundary conditions for the FE analysis. The results demonstrated that the press-extension technique produces a marked effect on the annulus fibrosus, pressing the central nucleus pulposus anteriorly and increasing the pressure in the anterior part of the disc. The study concludes that finite element modelling of the lumbar spine is suitable for analyzing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the mechanism by which the press-extension technique treats intervertebral disc herniation.

  20. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural-mechanics computing codes serve as tools for assessing leak and break probabilities of nuclear piping components. Probabilistic fracture mechanical tools were compared in different benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanical codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue in LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated in order to identify the reasons for differences in the results of different codes. The comparison of the tools is used for further improvements of the codes applied by the partners.

  1. An Evaluation of Visual and Textual Network Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Goodall, John R [ORNL

    2011-01-01

    User testing is an integral component of user-centered design, but has only rarely been applied to visualization for cyber security applications. This article presents the results of a comparative evaluation between a visualization-based application and a more traditional, table-based application for analyzing computer network packet captures. We conducted this evaluation as part of the user-centered design process. Participants performed both structured, well-defined tasks and exploratory, open-ended tasks with both tools. We measured accuracy and efficiency for the well-defined tasks, number of insights was measured for exploratory tasks and user perceptions were recorded for each tool. The results of this evaluation demonstrated that users performed significantly more accurately in the well-defined tasks, discovered a higher number of insights and demonstrated a clear preference for the visualization tool. The study design presented may be useful for future researchers performing user testing on visualization for cyber security applications.

  2. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    Science.gov (United States)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterized using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualization of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
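    The post-processing idea lends itself to a short sketch: divide each adhesive spring's force components by the tributary bond area it represents to recover shear and peel stresses, then sort in descending order. The spring data below are invented; a real tool would parse them from the FE input and output files.

```python
# Hedged sketch of the spring-to-stress post-processing: divide each spring's
# force components by its tributary bond area, then sort by shear stress.
# Spring data are invented placeholders.

springs = [
    # id, tributary area (mm^2), (shear_x, shear_y, peel) forces (N)
    (1, 25.0, (120.0, 40.0, 60.0)),
    (2, 25.0, (90.0, 10.0, -35.0)),
    (3, 12.5, (30.0, 5.0, 80.0)),
]

def adhesive_stresses(springs):
    rows = []
    for sid, area, (fx, fy, fz) in springs:
        shear = (fx ** 2 + fy ** 2) ** 0.5 / area  # resultant in-plane shear (MPa)
        peel = fz / area                           # out-of-plane peel (MPa)
        rows.append((sid, shear, peel))
    return sorted(rows, key=lambda r: r[1], reverse=True)

for sid, shear, peel in adhesive_stresses(springs):
    print(f"spring {sid}: shear {shear:.2f} MPa, peel {peel:.2f} MPa")
```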

  3. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors insist that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave.

  4. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.

    2002-01-01

    Three fractured tool dies used for industrial cold forging have been investigated using light optical microscopy and scanning electron microscopy. Two of the specimens were produced using the traditional Böhler P/M steel grade s790, while the last specimen was a third generation P/M steel produced...... using new technology developed by Böhler. All three steels have the same nominal composition of alloying elements. The failure in both types of material occurs as a crack formation at a notch inside of the tool. Generally the cold forging dies constructed in third generation steels have a longer lifetime...

  5. Extension of Characteristic Equation Method to Stability Analysis of Equilibrium Points for Closed-Loop PWM Power Switching Converters

    Institute of Scientific and Technical Information of China (English)

    Yanfeng CHEN; Shuisheng QIU; et al.

    1999-01-01

    An extension of the characteristic equation analysis method to the stability analysis of equilibrium points for closed-loop PWM power switching converters is introduced, based on the equivalent small parameter method. The basic principle of the method is described in detail. The provided example shows that the method, incorporating the system's state-plane trajectories, offers the advantages of both simplicity and practicality.

  6. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper.

  7. Extensions of indication throughout the drug product lifecycle: a quantitative analysis.

    Science.gov (United States)

    Langedijk, Joris; Whitehead, Christopher J; Slijkerman, Diederick S; Leufkens, Hubert G M; Schutjens, Marie-Hélène D B; Mantel-Teeuwisse, Aukje K

    2016-02-01

    The marketing authorisation of the first generic product version is an important moment in a drug product lifecycle. The subsequently changed intellectual property protection prospects could affect the incentives for further drug development. We assessed the quantity and nature of extensions of indication of small-molecule medicinal products authorised through the European Medicines Agency throughout the drug product lifecycle, with special attention to the impact of the introduction of a first generic competitor. The majority (92.5%) of the extensions of indication were approved during the exclusivity period of the innovator product. Regulatory rethinking might be needed for a sustainable stimulation of extensions of indication in the post-generic period of a drug product lifecycle.

  8. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    Science.gov (United States)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
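    As a small worked example of one of these tools, the sketch below ranks hypothetical radiotherapy failure modes by the FMEA risk priority number, RPN = severity x occurrence x detection, with each factor scored on a 1-10 scale; the failure modes and scores are invented for illustration.

```python
# Small FMEA worked example: rank hypothetical radiotherapy failure modes by
# risk priority number, RPN = severity * occurrence * detection (1-10 each).
# Failure modes and scores are invented for illustration.

failure_modes = [
    ("wrong patient selected at console", 9, 2, 4),
    ("treatment plan transferred incompletely", 8, 3, 3),
    ("daily machine QA skipped", 6, 2, 2),
]

ranked = sorted(((name, s * o * d) for name, s, o, d in failure_modes),
                key=lambda r: r[1], reverse=True)
for name, rpn in ranked:
    print(f"RPN {rpn:3d}  {name}")
```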

  9. Transient Side Load Analysis of Out-of-Round Film-Cooled Nozzle Extensions

    Science.gov (United States)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2012-01-01

    There was interest in understanding the impact of an out-of-round nozzle extension on the nozzle side load during transient startup operations. The out-of-round nozzle extension could be the result of asymmetric internal stresses, deformation induced by previous tests, and asymmetric loads induced by hardware attached to the nozzle. The objective of this study was therefore to computationally investigate the effect of an out-of-round nozzle extension on the nozzle side loads during an engine startup transient. The rocket engine studied encompasses a regeneratively cooled chamber and nozzle, along with a film-cooled nozzle extension. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and transient inlet boundary flow properties derived from an engine system simulation. Six three-dimensional cases were performed with the out-of-roundness achieved by three different degrees of ovalization, elongated on the lateral y and z axes: one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The results show that the separation line jump was the primary source of the peak side loads. Compared with the peak side load of the perfectly round nozzle, the peak side loads increased for the slightly and more ovalized nozzle extensions, and either increased or decreased for the two significantly ovalized nozzle extensions. A theory is presented to explain these observations, based on the counteraction of two effects of ovalization: the flow-destabilizing effect of an exacerbated asymmetrical flow caused by a lower degree of ovalization, and the flow-stabilizing effect of a more symmetrical flow.

  10. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 11: PEP, cost, schedules, and work breakdown structure dictionary

    Science.gov (United States)

    1979-01-01

    Cost scheduling and funding data are presented for the reference design of the power extension package. Major schedule milestones are correlated with current Spacelab flight dates. Funding distributions provide for minimum expenditure during the first year of the project.

  11. ProteinHistorian: tools for the comparative analysis of eukaryote protein origin.

    Directory of Open Access Journals (Sweden)

    John A Capra

    Full Text Available The evolutionary history of a protein reflects the functional history of its ancestors. Recent phylogenetic studies identified distinct evolutionary signatures that characterize proteins involved in cancer, Mendelian disease, and different ontogenic stages. Despite the potential to yield insight into the cellular functions and interactions of proteins, such comparative phylogenetic analyses are rarely performed, because they require custom algorithms. We developed ProteinHistorian to make tools for performing analyses of protein origins widely available. Given a list of proteins of interest, ProteinHistorian estimates the phylogenetic age of each protein, quantifies enrichment for proteins of specific ages, and compares variation in protein age with other protein attributes. ProteinHistorian allows flexibility in the definition of protein age by including several algorithms for estimating ages from different databases of evolutionary relationships. We illustrate the use of ProteinHistorian with three example analyses. First, we demonstrate that proteins with high expression in human, compared to chimpanzee and rhesus macaque, are significantly younger than those with human-specific low expression. Next, we show that human proteins with annotated regulatory functions are significantly younger than proteins with catalytic functions. Finally, we compare protein length and age in many eukaryotic species and, as expected from previous studies, find a positive, though often weak, correlation between protein age and length. ProteinHistorian is available through a web server with an intuitive interface and as a set of command line tools; this allows biologists and bioinformaticians alike to integrate these approaches into their analysis pipelines. ProteinHistorian's modular, extensible design facilitates the integration of new datasets and algorithms. The ProteinHistorian web server, source code, and pre-computed ages for 32 eukaryotic genomes are freely available.

  12. Torque-wrench extension

    Science.gov (United States)

    Peterson, D. H.

    1981-01-01

    Torque-wrench extension makes it easy to install and remove fasteners that are beyond the reach of typical wrenches or are located in narrow spaces that prevent full travel of the wrench handle. At the same time, the tool reads applied torque accurately. The wrench drive system, for torques up to 125 inch-pounds, uses two standard drive-socket extensions in an aluminum frame. The extensions are connected to a bevel gear that turns another bevel gear. The gears produce a 1:1 turn ratio through a 90 degree translation of the axis of rotation. The output bevel has a short extension that is used to attach a 1/4-inch drive socket.

  13. Extensions of indication throughout the drug product lifecycle: a quantitative analysis

    NARCIS (Netherlands)

    Langedijk, Joris; Whitehead, Christopher J; Slijkerman, Diederick S; Leufkens, Hubert G M; Schutjens, Marie-Hélène D B; Mantel-Teeuwisse, Aukje K

    2016-01-01

    The marketing authorisation of the first generic product version is an important moment in a drug product lifecycle. The subsequently changed intellectual property protection prospects could affect the incentives for further drug development. We assessed the quantity and nature of extensions of indication of small molecule medicinal products authorised through the European Medicines Agency throughout the drug product lifecycle.

  14. Using R-Project for Free Statistical Analysis in Extension Research

    Science.gov (United States)

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  15. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams could predict chatter by providing graphical representations of the stable combinations of the axial depth of the cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the bearing preload influence on the system stability with uncertainty taken into account.
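
    As a deliberately simplified illustration of how a Monte Carlo preload study propagates into a stability limit, the sketch below evaluates the classic single-degree-of-freedom chatter limit a_lim(w) = -1/(2*Kf*Re G(iw)) over sampled preloads, in place of the article's Timoshenko shaft and five-degree-of-freedom bearing model. The stiffness-preload relation and every parameter value here are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative 1-DOF spindle parameters (not taken from the article)
        m, zeta = 2.0, 0.03        # modal mass [kg], damping ratio
        Kf = 8e8                   # cutting force coefficient [N/m^2]

        def min_stable_depth(k):
            """Minimum of the 1-DOF stability limit a_lim(w) = -1/(2*Kf*Re G(iw))
            over the chatter frequency range above resonance, where Re G < 0."""
            wn = np.sqrt(k / m)
            w = np.linspace(1.01 * wn, 2.0 * wn, 2000)
            G = 1.0 / (k - m * w**2 + 2j * zeta * np.sqrt(k * m) * w)
            return (-1.0 / (2.0 * Kf * G.real)).min()

        # Monte Carlo over uncertain preload; assume a Hertzian-type k ~ P^(1/3)
        preload = rng.normal(500.0, 50.0, 1000)          # [N], assumed spread
        k = 2e7 * (preload / 500.0) ** (1 / 3)           # [N/m], assumed relation
        depths = np.array([min_stable_depth(ki) for ki in k])
        print(f"critical depth of cut: {depths.mean()*1e3:.2f} "
              f"+/- {depths.std()*1e3:.2f} mm")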

  16. Proteomic tools for the analysis of transient interactions between metalloproteins.

    Science.gov (United States)

    Martínez-Fábregas, Jonathan; Rubio, Silvia; Díaz-Quintana, Antonio; Díaz-Moreno, Irene; De la Rosa, Miguel Á

    2011-05-01

    Metalloproteins play major roles in cell metabolism and signalling pathways. In many cases, they show moonlighting behaviour, acting in different processes, depending on the physiological state of the cell. To understand these multitasking proteins, we need to discover the partners with which they carry out such novel functions. Although many technological and methodological tools have recently been reported for the detection of protein interactions, specific approaches to studying the interactions involving metalloproteins are not yet well developed. The task is even more challenging for metalloproteins, because they often form short-lived complexes that are difficult to detect. In this review, we gather the different proteomic techniques and biointeractomic tools reported in the literature. All of them have shown their applicability to the study of transient and weak protein-protein interactions, and are therefore suitable for metalloprotein interactions.

  17. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  18. Stakeholder Analysis of an Executable Achitecture Systems Engineering (EASE) Tool

    Science.gov (United States)

    2013-06-21

    regression representations of more complex M&S tools. C2WindTunnel is a software test bed developed by George Mason for Command and Control… (Roth, Karen; Barrett, Shelby. 2009 (July). Command and Control Wind Tunnel Integration Technology Project, U.S. Marine Corps Systems Command.)

  19. Design tools for daylighting illumination and energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, S.

    1982-07-01

    The problems and potentials for using daylighting to provide illumination in building interiors are reviewed. This report describes some of the design tools now or soon to be available for incorporating daylighting into the building design process. It also describes state-of-the-art methods for analyzing the impacts daylighting can have on the selection of lighting controls, lighting energy consumption, heating and cooling loads, and peak power demand.

  20. Clinical decision support tools: analysis of online drug information databases

    OpenAIRE

    Seamon Matthew J; Polen Hyla H; Marsh Wallace A; Clauson Kevin A; Ortiz Blanca I

    2007-01-01

    Abstract Background Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information by systematically comparing the most commonly used online drug information datab...

  1. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    Directory of Open Access Journals (Sweden)

    Drechsel Marion

    2009-10-01

    Full Text Available Abstract Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
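
    The kind of basic genetic analysis such an add-in automates can be sketched outside VBA as well. A minimal example, with invented genotype counts, is a Hardy-Weinberg equilibrium chi-square test for one biallelic SNP:

        from scipy.stats import chi2

        def hardy_weinberg_chi2(n_AA, n_Aa, n_aa):
            """Chi-square goodness-of-fit test for Hardy-Weinberg equilibrium
            from observed genotype counts of a biallelic SNP."""
            n = n_AA + n_Aa + n_aa
            p = (2 * n_AA + n_Aa) / (2 * n)      # frequency of allele A
            q = 1 - p
            expected = [p * p * n, 2 * p * q * n, q * q * n]
            observed = [n_AA, n_Aa, n_aa]
            stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
            # df = 3 genotype classes - 1 - 1 estimated allele frequency
            return stat, chi2.sf(stat, df=1)

        stat, p_value = hardy_weinberg_chi2(280, 480, 240)  # illustrative counts
        print(f"chi2 = {stat:.2f}, p = {p_value:.3f}")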

  2. Kinematic Analysis of a New Parallel Machine Tool: the Orthoglide

    CERN Document Server

    Wenger, Philippe

    2007-01-01

    This paper describes a new parallel kinematic architecture for machining applications: the orthoglide. This machine features three fixed parallel linear joints which are mounted orthogonally and a mobile platform which moves in the Cartesian x-y-z space with fixed orientation. The main interest of the orthoglide is that it benefits from the advantages of the popular PPP serial machines (regular Cartesian workspace shape and uniform performance) as well as from the parallel kinematic arrangement of the links (less inertia and better dynamic performance), which makes the orthoglide well suited to high-speed machining applications. A possible extension of the orthoglide to 5-axis machining is also investigated.

  3. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  4. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    Science.gov (United States)

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-04

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also and most importantly scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed using the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.
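
    Ursgal's actual interfaces and its "combined PEP" algorithm are specified in the paper and its documentation; purely to illustrate the kind of calculation involved, the sketch below combines per-engine posterior error probabilities with a textbook naive-Bayes product. The independence assumption and the example values are ours, and this is not Ursgal's implementation.

        import math

        def combine_peps(peps):
            """Naive-Bayes combination of posterior error probabilities (PEPs)
            assigned to the same peptide-spectrum match by several search
            engines, assuming (unrealistically) independent engine scores."""
            log_wrong = sum(math.log(max(p, 1e-12)) for p in peps)
            log_right = sum(math.log(max(1.0 - p, 1e-12)) for p in peps)
            wrong, right = math.exp(log_wrong), math.exp(log_right)
            return wrong / (wrong + right)   # posterior that the match is wrong

        # Illustrative PEPs from three engines for one spectrum
        print(combine_peps([0.05, 0.10, 0.02]))   # agreement -> much smaller PEP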

  5. DFTCalc: reliability centered maintenance via fault tree analysis (tool paper)

    NARCIS (Netherlands)

    Guck, Dennis; Spel, Jip; Stoelinga, Mariëlle; Butler, Michael; Conchon, Sylvain; Zaïdi, Fatiha

    2015-01-01

    Reliability, availability, maintenance and safety (RAMS) analysis is essential in the evaluation of safety-critical systems like nuclear power plants and the railway infrastructure. A widely used methodology within RAMS analysis is fault trees, representing failure propagations throughout a system.

  6. Propositional Analysis: A Tool for Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce

    1989-01-01

    Reviews the use of propositional analysis in library and information science research. Evidence that different analysts produce similar judgments about texts and use the method consistently over time is presented, and it is concluded that propositional analysis is a reliable and valid research method. An example of an analysis is appended.

  7. MAAP: a versatile and universal tool for genome analysis.

    Science.gov (United States)

    Caetano-Anollés, G

    1994-09-01

    Multiple arbitrary amplicon profiling (MAAP) uses one or more oligonucleotide primers (≥5 nt) of arbitrary sequence to initiate DNA amplification and generate characteristic fingerprints from anonymous genomes or DNA templates. MAAP markers can be used in general fingerprinting as well as in mapping applications, either directly or as sequence-characterized amplified regions (SCARs). MAAP profiles can be tailored in the number of monomorphic and/or polymorphic products. For example, multiple endonuclease digestion of template DNA or the use of mini-hairpin primers can enhance detection of polymorphic DNA. Comparison of the expected and actual number of amplification products produced with primers differing in length, sequence and GC content from templates of varying complexity reveals severe departures from theoretical formulations, with interesting implications for primer-template interaction. Extensive primer-template mismatching can occur when using templates of low complexity or long primers. Primer annealing and extension appears directed by an 8 nt 3'-terminal primer domain, requires sites with perfect homology to the first 5-6 nt from the 3' terminus, and involves direct physical interaction between amplicon annealing sites.
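
    The annealing rule reported above, perfect homology to the first 5-6 nt from the 3' terminus, suggests a simple site-scanning sketch. The sequences and the 6 nt cutoff below are illustrative, and complementary-strand matching is deliberately omitted.

        def annealing_sites(template, primer, exact_3prime=6):
            """Return template positions whose sequence matches the primer's
            3'-terminal nucleotides exactly, mimicking the requirement for
            perfect homology at the first 5-6 nt from the 3' terminus."""
            core = primer[-exact_3prime:]
            return [i for i in range(len(template) - len(core) + 1)
                    if template[i:i + len(core)] == core]

        template = "GATCCGATACGGTACCGATACGTTAGCCGATACG"   # invented sequence
        primer = "AGCCGATACG"                             # arbitrary 10-mer
        print(annealing_sites(template, primer))          # -> [5, 16, 28]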

  8. Hyperbolic Error Analysis and Parametric Optimization of Round Body Form Tool

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the merits of easy manufacture, long service life, and the ability to machine internal or external form surfaces, the round body form tool is in extensive use in large-scale production. Its main drawback is the large hyperbolic error caused when machining cones. However, regarding the discussion of hyperbolic error, there are two drawbacks in the current books and documents: (1) the error measuring plane is established on the rake face of the tool, which does not coincide with the actual measuring plane…

  9. Generalized Aliasing as a Basis for Program Analysis Tools

    Science.gov (United States)

    2000-11-01

    (Applications are described in the next chapter, in Section 9.2.2.) For example, the Ladybug specification checker tool [44] has a user interface shell that is independent of any particular implementation of the interface. At run time, Ladybug uses reflection to load the engine class by name and create an object of that class. Tools referenced include software supplied with Sun's JDK 1.1.7; Jess, the Java Expert System Shell version 4.4, from Sandia National Labs [35]; and Ladybug, the specification checker by Craig…

  10. Efficiency of village extension agents in Nigeria: Evidence from a data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Ibrahim Hassan I.

    2016-01-01

    Full Text Available Determining the technical efficiency of extension personnel, especially at the village level, is paramount if farm productivity is to be increased. The present study determined the technical efficiency of Village Extension Agents (VEAs) in North Central Nigeria. Data for the study were collected using a structured questionnaire administered to 81 VEAs. The findings of the study indicated that 32.1% of the VEAs were aged between 38 and 45 years, with a mean age of 41 years, while 50.6% were holders of national diploma certificates. The monthly income of a VEA ranged between N16,000 and N21,000. The average technical efficiency of the VEAs was 42%, with minimum and maximum values of 0.03 and 1 respectively. There was a positive significant association between the age (P<0.10), education (P<0.10) and income (P<0.01) of VEAs and their technical efficiency levels. The results imply that prompt payment of allowances/salary, regular promotions and training are the necessary impetus to improve agricultural extension service delivery in Nigeria, particularly at the village level.
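
    For readers unfamiliar with data envelopment analysis, the study's core computation can be sketched as an input-oriented CCR linear program; the input/output data below are invented and are not the survey data from this study.

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, j0):
            """Input-oriented CCR DEA efficiency of unit j0.
            X: inputs (m x n), Y: outputs (s x n), one column per unit.
            Solves: min theta  s.t.  X@lam <= theta*x0,  Y@lam >= y0,  lam >= 0."""
            m, n = X.shape
            s = Y.shape[0]
            c = np.zeros(n + 1)           # decision vector: [theta, lam_1..lam_n]
            c[0] = 1.0
            A_in = np.hstack([-X[:, [j0]], X])          # X@lam - theta*x0 <= 0
            A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y@lam <= -y0
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.concatenate([np.zeros(m), -Y[:, j0]]),
                          bounds=[(None, None)] + [(0, None)] * n)
            return res.x[0]

        # Invented data: 2 inputs, 1 output, 4 village extension agents
        X = np.array([[4.0, 2.0, 5.0, 3.0],
                      [3.0, 3.0, 6.0, 2.0]])
        Y = np.array([[1.0, 1.0, 1.0, 1.0]])
        for j in range(4):
            print(f"agent {j}: technical efficiency = {ccr_efficiency(X, Y, j):.2f}")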

  11. A Cognitive Comparative Analysis of UP-DOWN Metaphorical Extension in Chinese and English

    Institute of Scientific and Technical Information of China (English)

    张长永

    2011-01-01

    This paper researches spatial metaphor from a cognitive angle. Spatial metaphor is a basic tool that makes us more familiar with our world, and many abstract concepts are based on it. Taking "UP-DOWN" as an example, this paper will mainly…

  12. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  13. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for complex fluid/structure interaction phenomena is increasing as proven numerical and visualization...

  14. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.
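
    The core alignment idea, pairing mass and elution-time features across runs when they agree within tolerances, can be sketched generically. The greedy matcher, tolerance values, and feature lists below are illustrative stand-ins for MultiAlign's actual cluster-based alignment.

        def match_features(run_a, run_b, mass_tol=0.01, rt_tol=0.5):
            """Greedily pair (monoisotopic mass, elution time) features from
            two LC-MS runs when both values agree within tolerance; unmatched
            features stay unpaired."""
            matches, used = [], set()
            for ma, ta in run_a:
                for j, (mb, tb) in enumerate(run_b):
                    if (j not in used and abs(ma - mb) <= mass_tol
                            and abs(ta - tb) <= rt_tol):
                        matches.append(((ma, ta), (mb, tb)))
                        used.add(j)
                        break
            return matches

        run_a = [(500.262, 12.1), (612.300, 15.4)]   # invented features
        run_b = [(500.265, 12.3), (700.100, 20.0)]
        print(match_features(run_a, run_b))          # -> one cross-run match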

  15. Extension of physical component BFC method for the analysis of free-surface flows coupled with moving boundaries

    Science.gov (United States)

    Lu, D.; Takizawa, A.; Kondo, S.

    A newly developed "physical component boundary fitted coordinate (PCBFC) method" is extended for the analysis of free-surface flows coupled with moving boundaries. Extra techniques are employed to deal with the coupled movement of the free surface and moving boundaries. After validation of the extension by several benchmark problems, the method is successfully applied for the first time to the simulation of overflow-induced vibration of a weir coupled with sloshing of the free-surface liquid.

  16. Code Analysis and Refactoring with Clang Tools, Version 0.1

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  17. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external...

  18. Ecotoxicological mechanisms and models in an impact analysis tool for oil spills

    NARCIS (Netherlands)

    Laender, de F.; Olsen, G.H.; Frost, T.; Grosvik, B.E.; Klok, T.C.

    2011-01-01

    In an international collaborative effort, an impact analysis tool is being developed to predict the effect of accidental oil spills on recruitment and production of Atlantic cod (Gadus morhua) in the Barents Sea. The tool consisted of three coupled ecological models that describe (1) plankton biomass…

  19. Open Source Software Tools for Anomaly Detection Analysis

    Science.gov (United States)

    2014-04-01

    Open source tools surveyed include the Environment for Developing KDD-Applications Supported by Index-Structures (ELKI), RapidMiner, the SHOGUN toolbox, the Waikato Environment for Knowledge Analysis (Weka), and Scikit-Learn. (Figure 2 of the report shows RapidMiner output results.)

  20. Cellular barcoding tool for clonal analysis in the hematopoietic system

    NARCIS (Netherlands)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J.; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J. C.; de Haan, Gerald; Bystrykh, Leonid V.

    2010-01-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting…

  1. Teaching the Tools of Pharmaceutical Care Decision-Analysis.

    Science.gov (United States)

    Rittenhouse, Brian E.

    1994-01-01

    A method of decision-analysis in pharmaceutical care that integrates epidemiology and economics is presented, including an example illustrating both the deceptive nature of medical decision making and the power of decision analysis. Principles in determining both general and specific probabilities of interest and use of decision trees for…

  2. Miscue Analysis: A Transformative Tool for Researchers, Teachers, and Readers

    Science.gov (United States)

    Goodman, Yetta M.

    2015-01-01

    When a reader produces a response to a written text (the observed response) that is not expected by the listener, the result is called a miscue. Using psychosociolingustic analyses of miscues in the context of an authentic text, miscue analysis provides evidence to discover how readers read. I present miscue analysis history and development and…

  3. Defining the knee joint flexion-extension axis for purposes of quantitative gait analysis: an evaluation of methods.

    Science.gov (United States)

    Schache, Anthony G; Baker, Richard; Lamoreux, Larry W

    2006-08-01

    Minimising measurement variability associated with hip axial rotation and avoiding knee joint angle cross-talk are two fundamental objectives of any method used to define the knee joint flexion-extension axis for purposes of quantitative gait analysis. The aim of this experiment was to compare three different methods of defining this axis: the knee alignment device (KAD) method, a method based on the transepicondylar axis (TEA) and an alternative numerical method (Dynamic). The former two methods are common approaches that have been applied clinically in many quantitative gait analysis laboratories; the latter is an optimisation procedure. A cohort of 20 subjects performed three different functional tasks (normal gait; squat; non-weight-bearing knee flexion) on repeated occasions. Three-dimensional hip and knee angles were computed using the three alternative methods of defining the knee joint flexion-extension axis. The repeatability of hip axial rotation measurements during normal gait was found to be significantly better for the Dynamic method. The knee varus-valgus kinematic profile and the degree of knee joint angle cross-talk were smallest for the Dynamic method across all functional tasks. The Dynamic method therefore provided superior results in comparison to the KAD and TEA-based methods and thus represents an attractive solution for orientating the knee joint flexion-extension axis for purposes of quantitative gait analysis.

  4. omniSpect: an open MATLAB-based tool for visualization and analysis of matrix-assisted laser desorption/ionization and desorption electrospray ionization mass spectrometry images.

    Science.gov (United States)

    Parry, R Mitchell; Galhena, Asiri S; Gamage, Chaminda M; Bennett, Rachel V; Wang, May D; Fernández, Facundo M

    2013-04-01

    We present omniSpect, an open source web- and MATLAB-based software tool for both desorption electrospray ionization (DESI) and matrix-assisted laser desorption ionization (MALDI) mass spectrometry imaging (MSI) that performs computationally intensive functions on a remote server. These functions include converting data from a variety of file formats into a common format easily manipulated in MATLAB, transforming time-series mass spectra into mass spectrometry images based on a probe spatial raster path, and multivariate analysis. OmniSpect provides an extensible suite of tools to meet the computational requirements needed for visualizing open and proprietary format MSI data.
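
    One of the listed functions, transforming time-series spectra into an image along the probe's spatial raster path, reduces to folding a 1-D series of per-pixel intensities into a 2-D grid. The serpentine-scan assumption and the data below are illustrative and do not reflect omniSpect's own file handling.

        import numpy as np

        def raster_to_image(intensities, n_rows, n_cols, serpentine=True):
            """Fold a 1-D series of per-pixel ion intensities, acquired along a
            probe raster path, into a 2-D image. Assumes a row-major raster;
            serpentine=True reverses every other row, as in back-and-forth scans."""
            img = np.asarray(intensities, dtype=float).reshape(n_rows, n_cols).copy()
            if serpentine:
                img[1::2] = img[1::2, ::-1].copy()
            return img

        tic = np.arange(12.0)   # stand-in for per-spectrum total ion current
        print(raster_to_image(tic, 3, 4))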

  5. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. Programming ArcObjects can reduce the amount of repetitive work, streamline the workflow, and even produce functionality that is not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects; because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users. Our tool visualises data obtained from the analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modeling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area to get an overview of previous hazards in the region, and after performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards; additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a…

  6. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of the users and stakeholders of the software product. In this paper, we review the prominent processes, tools and technologies used in the requirement-gathering phase. The study is useful for perceiving the current state of affairs in requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. Finally, the study highlights the importance of security requirements: although they are part of the non-functional requirements, they are naturally considered fundamental to secure software development.

  7. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays the Internet acts as a huge database in which many websites, information sources and search engines are available. However, because webpage data are unstructured or semi-structured, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because it comprises many discovered patterns and contains a great deal of noise and uncertainty. In this paper, an analysis of web usage mining is carried out on web log data using the web log analyzer tool "Deep Log Analyzer" to extract summary information from a particular server, to characterize user behavior, and to develop an ontology capturing the relations among the components of web usage mining.
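
    The parsing step underlying any such log analyzer can be sketched for the NCSA Common Log Format; Deep Log Analyzer's internals are not described in the abstract, so the regular expression, the top_pages helper, and the sample lines below are generic assumptions.

        import re
        from collections import Counter

        # NCSA Common Log Format, e.g.:
        # 127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
        LOG_RE = re.compile(
            r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
            r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+')

        def top_pages(lines, k=3):
            """Count successfully served paths, a first cut at user behavior."""
            hits = Counter()
            for line in lines:
                m = LOG_RE.match(line)
                if m and m.group("status").startswith("2"):
                    hits[m.group("path")] += 1
            return hits.most_common(k)

        sample = [
            '127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
            '127.0.0.1 - - [10/Oct/2000:13:55:39 -0700] "GET /about.html HTTP/1.0" 200 512',
            '127.0.0.1 - - [10/Oct/2000:13:55:41 -0700] "GET /index.html HTTP/1.0" 200 2326',
        ]
        print(top_pages(sample))   # -> [('/index.html', 2), ('/about.html', 1)]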

  8. National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion

    Science.gov (United States)

    Follen, G.; Naiman, C.; Evans, A.

    1999-01-01

    Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

  9. Integration of management control tools. Analysis of a case study

    Directory of Open Access Journals (Sweden)

    Raúl Comas Rodríguez

    2015-09-01

    Full Text Available The objective of this article is to design and implement a procedure that integrates management control tools, focusing on processes, to improve efficiency and efficacy. An experimental study was carried out in which a procedure, based on the Balanced Scorecard, is defined that integrates process management into strategic planning and its evaluation. As a result of this work, we define the key factors of success associated with the four perspectives of the Balanced Scorecard, linked through cause-effect relations to obtain the strategic map that allows visualizing and communicating the enterprise strategy. The indicators evaluate the key factors of success, integrating the processes with the assistance of software. The implementation of the procedure in a commercialization enterprise helped integrate the process definition into strategic planning. The alignment was evaluated, and the efficiency and efficacy indicators improved the company's performance.

  10. Limits, limits everywhere the tools of mathematical analysis

    CERN Document Server

    Applebaum, David

    2012-01-01

    A quantity can be made smaller and smaller without it ever vanishing. This fact has profound consequences for science, technology, and even the way we think about numbers. In this book, we will explore this idea by moving at an easy pace through an account of elementary real analysis and, in particular, will focus on numbers, sequences, and series.Almost all textbooks on introductory analysis assume some background in calculus. This book doesn't and, instead, the emphasis is on the application of analysis to number theory. The book is split into two parts. Part 1 follows a standard university

  11. ANALYSIS OF ISOKINETIC KNEE EXTENSION / FLEXION IN MALE ELITE ADOLESCENT WRESTLERS

    Directory of Open Access Journals (Sweden)

    Sanli Sadi Kurdak

    2005-12-01

    Full Text Available Wrestling requires strength of the upper and lower body musculature, which is critical for athletic performance. Evaluation of the adolescent's skeletal muscle is important to understand body movement, especially in sports. Strength, power and endurance capacity are defined as parameters of skeletal muscle biomechanical properties. The isokinetic dynamometer is an important tool for making this type of evaluation; however, the load range phase of the range of motion has to be considered to interpret the data correctly. With this in mind, we aimed to investigate the lower body musculature contractile characteristics of adolescent wrestlers together with detailed analyses of the load range phase of motion. Thirteen boys aged 12-14 years participated in this study. Concentric load range torque, work and power of knee extension and flexion were measured by a Cybex Norm dynamometer at angular velocities from 450°/sec to 30°/sec with 30°/sec decrements for each set. None of the wrestlers were able to attain load range for angular velocities above 390°/sec and 420°/sec for extension and flexion respectively. Detailed analyses of the load range resulted in statistically significant differences in the normalized load range peak torque for extension at 270°/sec (1.44 ± 0.28 Nm·kg-1 and 1.14 ± 0.28 Nm·kg-1 for total and load range peak torque respectively, p < 0.05) and for flexion at 300°/sec (1.26 ± 0.28 Nm·kg-1 and 1.03 ± 0.23 Nm·kg-1 for total and load range peak torque respectively, p < 0.05), compared to total peak torque data. Similarly, a significant difference was found for the work values at 90°/sec (1.91 ± 0.23 Nm·kg-1 and 1.59 ± 0.24 Nm·kg-1 for total and load range work respectively for extension, and 1.73 ± 0.21 Nm·kg-1 and 1.49 ± 0.19 Nm·kg-1 for total and load range work respectively for flexion, p < 0.05), and was evident at higher angular velocities (p < 0.001) for both extension and flexion. At…

  12. Electromyographic Analysis of the Hip Extension Pattern in Visually Impaired Athletes.

    Science.gov (United States)

    Halski, Tomasz; Żmijewski, Piotr; Cięszczyk, Paweł; Nowak, Barbara; Ptaszkowski, Kuba; Slupska, Lucyna; Dymarek, Robert; Taradaj, Jakub

    2015-11-22

    The objective of the study was to determine the order of muscle recruitment during the active hip joint extension in particular positions in young visually impaired athletes. The average recruitment time (ART) of the gluteus maximus (GM) and the hamstring muscle group (HMG) was assessed by means of surface electromyography (sEMG). The sequence of muscle recruitment in the female and male group was also taken into consideration. This study followed a prospective, cross-sectional, randomised design, in which 76 visually impaired athletes aged 18-25 years were enrolled and selected on chosen inclusion and exclusion criteria. Finally, 64 young subjects (32 men and 32 women) were included in the study (age: 21.1 ± 1.05 years; body mass: 68.4 ± 12.4 kg; body height: 1.74 ± 0.09 m; BMI: 22.20 ± 2.25 kg/m2). All subjects were analysed for the ART of the GM and HMG during the active hip extension performed in two different positions, as well as resting and functional sEMG activity of each muscle. Between-gender differences were compared, and the correlations between the ART of the GM and HMG and their functional sEMG activity during hip extension in both positions are shown. No significant differences between the ART of the GM and HMG were found (p>0.05). Furthermore, there was no significant difference in ART between the two tested positions, in either male or female subjects (p>0.05).

  13. Electromyographic Analysis of the Hip Extension Pattern in Visually Impaired Athletes

    Directory of Open Access Journals (Sweden)

    Halski Tomasz

    2015-12-01

    Full Text Available The objective of the study was to determine the order of muscle recruitment during the active hip joint extension in particular positions in young visually impaired athletes. The average recruitment time (ART) of the gluteus maximus (GM) and the hamstring muscle group (HMG) was assessed by means of surface electromyography (sEMG). The sequence of muscle recruitment in the female and male group was also taken into consideration. This study followed a prospective, cross-sectional, randomised design, in which 76 visually impaired athletes aged 18-25 years were enrolled and selected on chosen inclusion and exclusion criteria. Finally, 64 young subjects (32 men and 32 women) were included in the study (age: 21.1 ± 1.05 years; body mass: 68.4 ± 12.4 kg; body height: 1.74 ± 0.09 m; BMI: 22.20 ± 2.25 kg/m2). All subjects were analysed for the ART of the GM and HMG during the active hip extension performed in two different positions, as well as resting and functional sEMG activity of each muscle. Between-gender differences were compared, and the correlations between the ART of the GM and HMG and their functional sEMG activity during hip extension in both positions are shown. No significant differences between the ART of the GM and HMG were found (p>0.05). Furthermore, there was no significant difference in ART between the two tested positions, in either male or female subjects (p>0.05).

  14. Design and Analysis Tools for Deployable Solar Array Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  15. Analysis of 3D and multiview extensions of the emerging HEVC standard

    Science.gov (United States)

    Vetro, Anthony; Tian, Dong

    2012-10-01

    Standardization of a new set of 3D formats has been initiated with the goal of improving the coding of stereo and multiview video, and also facilitating the generation of multiview output needed for auto-stereoscopic displays. Part of this effort will develop 3D and multiview extensions of the emerging standard for High Efficiency Video Coding (HEVC). This paper outlines some of the key technologies and architectures being considered for standardization, and analyzes the viability, benefits and drawbacks of different codec designs.

  16. An analysis of a minimal vectorlike extension of the Standard Model

    CERN Document Server

    Beylin, V; Kuksa, V; Volchanskiy, N

    2016-01-01

    We analyze an extension of the Standard Model with an additional SU(2) hypercolor gauge group, keeping the Higgs boson as a fundamental field. Vectorlike interactions of new hyperquarks with the intermediate vector bosons are explicitly constructed. We also consider pseudo-Nambu-Goldstone bosons caused by the symmetry breaking SU(4) → Sp(4). A specific global symmetry of the model with zero hypercharge of the hyperquark doublets ensures the stability of a neutral pseudoscalar field. Some possible manifestations of the lightest states at colliders are also examined.

  17. Analysis of the extensive air showers of ultra-high energy

    Energy Technology Data Exchange (ETDEWEB)

    Mikhailov, Aleksei A. [Yu.G. Shafer Institute of Cosmophysical Research and Aeronomy, 31 Lenin Ave., 677980 Yakutsk (Russian Federation)

    2011-03-15

    When studying extensive air showers (EAS) that correlate with pulsars, we found showers without a muon component. Here we analyze the arrival directions of EAS with a poor muon component or without one. We find that the arrival directions of these showers correlate with some pulsars which are distributed more isotropically. Among these pulsars, those with a short period of rotation around their axis are more prevalent than expected from the catalogue of pulsars. In this connection, data from arrays worldwide are considered.

  18. Integrative genomic analysis by interoperation of bioinformatics tools in GenomeSpace

    Science.gov (United States)

    Thorvaldsdottir, Helga; Liefeld, Ted; Ocana, Marco; Borges-Rivera, Diego; Pochet, Nathalie; Robinson, James T.; Demchak, Barry; Hull, Tim; Ben-Artzi, Gil; Blankenberg, Daniel; Barber, Galt P.; Lee, Brian T.; Kuhn, Robert M.; Nekrutenko, Anton; Segal, Eran; Ideker, Trey; Reich, Michael; Regev, Aviv; Chang, Howard Y.; Mesirov, Jill P.

    2015-01-01

    Integrative analysis of multiple data types to address complex biomedical questions requires the use of multiple software tools in concert and remains an enormous challenge for most of the biomedical research community. Here we introduce GenomeSpace (http://www.genomespace.org), a cloud-based, cooperative community resource. Seeded as a collaboration of six of the most popular genomics analysis tools, GenomeSpace now supports the streamlined interaction of 20 bioinformatics tools and data resources. To facilitate the ability of non-programming users to leverage GenomeSpace in integrative analysis, it offers a growing set of ‘recipes’, short workflows involving a few tools and steps to guide investigators through high-utility analysis tasks. PMID:26780094

  19. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report

    Science.gov (United States)

    2013-01-01

    …a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight to… used to measure the returns of scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME…

  20. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  1. High-Performance Integrated Virtual Environment (HIVE Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  2. Determination of Lateral Extension of Hydrocarbon Concentration Sealing Caprocks by AVO Analysis

    Institute of Scientific and Technical Information of China (English)

    Li Weilian

    2007-01-01

    The caprock is one of the key factors for a reservoir, especially for a gas reservoir. Whether the caprocks can block off the gas is of significance for the accumulation and preservation of the gas reservoir. In this paper, we use the amplitude versus offset (AVO) seismic technique to determine the lateral extension of hydrocarbon concentration sealing caprocks. The essence of this technique is to detect variations in the physical properties of the reservoir bed by monitoring variations in the reflection coefficient of seismic waves at the interfaces between different lithologies. Generally it is used to indicate hydrocarbons directly. For hydrocarbon concentration sealing caprocks, a change in hydrocarbon concentration may cause a change in the physical properties of the caprocks. Therefore it is possible to evaluate the hydrocarbon concentration sealing ability of the caprocks by AVO. This paper presents a case study using AVO to determine the lateral extension of the hydrocarbon concentration sealing caprocks. The result shows that this method is helpful for the exploration of the region.

  3. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method which researchers can use to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis, and the capacity of Principal Component Analysis (PCA) to reduce a number of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step-by-step the process of applying PCA in marketing research when we use a large number of variables that are naturally collinear.
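
    A minimal sketch of the workflow the paper walks through, standardizing collinear survey items and projecting them onto uncorrelated principal components, using simulated responses rather than a real survey:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)

        # Simulated survey: 200 respondents, 6 items; items 0-2 share one latent
        # driver and are therefore collinear, items 3-5 are independent
        latent = rng.normal(size=(200, 1))
        X = np.hstack([latent + 0.3 * rng.normal(size=(200, 3)),
                       rng.normal(size=(200, 3))])

        # Standardize, then rotate onto uncorrelated principal components
        Z = StandardScaler().fit_transform(X)
        pca = PCA()
        scores = pca.fit_transform(Z)

        # The first component absorbs the collinear block of items
        print("explained variance ratio:",
              np.round(pca.explained_variance_ratio_, 2))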

  4. Droplet microfluidics--a tool for single-cell analysis.

    Science.gov (United States)

    Joensson, Haakan N; Andersson Svahn, Helene

    2012-12-03

    Droplet microfluidics allows the isolation of single cells and reagents in monodisperse picoliter liquid capsules and manipulations at a throughput of thousands of droplets per second. These qualities allow many of the challenges in single-cell analysis to be overcome. Monodispersity enables quantitative control of solute concentrations, while encapsulation in droplets provides an isolated compartment for the single cell and its immediate environment. The high throughput allows the processing and analysis of the tens of thousands to millions of cells that must be analyzed to accurately describe a heterogeneous cell population so as to find rare cell types or access sufficient biological space to find hits in a directed evolution experiment. The low volumes of the droplets make very large screens economically viable. This Review gives an overview of the current state of single-cell analysis involving droplet microfluidics and offers examples where droplet microfluidics can further biological understanding.

  5. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    Directory of Open Access Journals (Sweden)

    Maike Kathrin Aurich

    2016-08-01

    Full Text Available Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  6. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models.

    Science.gov (United States)

    Aurich, Maike K; Fleming, Ronan M T; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
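
    MetaboTools itself is Matlab code built on constraint-based modeling. Purely as an illustration of that modeling style, and not of the toolbox's API, the sketch below solves a toy flux balance problem in which a measured exchange flux enters as an uptake bound, which is how extracellular metabolomic data constrain such models.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network A_ext -> A -> B -> biomass; S is the stoichiometric matrix
        # (rows: metabolites A, B; columns: reactions uptake, conversion, biomass)
        S = np.array([[1.0, -1.0, 0.0],
                      [0.0, 1.0, -1.0]])
        bounds = [(0.0, 10.0),    # uptake capped, e.g. by a measured exchange flux
                  (0.0, 100.0),
                  (0.0, 100.0)]

        # Maximize biomass flux v[2] subject to steady state S v = 0 and bounds
        c = np.array([0.0, 0.0, -1.0])        # linprog minimizes, so negate
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal biomass flux:", -res.fun)   # -> 10.0, set by the uptake cap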

  7. Simpler methods do it better: Success of Recurrence Quantification Analysis as a general purpose data analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Charles L., E-mail: cwebber@lumc.edu [Department of Cell and Molecular Physiology, Loyola University Medical Center, Maywood, IL (United States); Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research (PIK), 14412 Potsdam (Germany); Facchini, Angelo, E-mail: a.facchini@unisi.it [Center for the Study of Complex Systems and Department of Information Engineering, University of Siena, 53100 Siena (Italy); Giuliani, Alessandro, E-mail: alessandro.giuliani@iss.it [Environment and Health Department, Istituto Superiore di Sanità, Roma (Italy)

    2009-10-05

    Over the last decade, Recurrence Quantification Analysis (RQA) has become a new standard tool in the toolbox of nonlinear methodologies. In this Letter we trace the history and utility of this powerful tool and cite some common applications. RQA continues to wend its way into numerous and diverse fields of study.
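
    The core computation behind RQA is indeed simple, which is part of the Letter's point. A minimal sketch (not the authors' code; the threshold is illustrative) builds a recurrence matrix from a scalar series and reports the recurrence rate, the most basic RQA measure:

```python
# Minimal RQA sketch: recurrence matrix and recurrence rate for a 1-D series.
import numpy as np

def recurrence_rate(x, eps):
    x = np.asarray(x, dtype=float)
    # Pairwise distances between all points of the (here 1-D) trajectory.
    dist = np.abs(x[:, None] - x[None, :])
    rec = dist < eps          # recurrence matrix
    return rec.mean()         # fraction of recurrent point pairs

x = np.sin(np.linspace(0, 20 * np.pi, 500))
print(recurrence_rate(x, eps=0.1))
```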

  8. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools......, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one...... procedures has been developed to retrieve the data/information stored in the knowledge base....

  9. Measures of radioactivity: a tool for understanding statistical data analysis

    CERN Document Server

    Montalbano, Vera

    2012-01-01

    A learning path on radioactivity in the last class of high school is presented. An introduction to radioactivity and nuclear phenomenology is followed by measurements of natural radioactivity. Background and weak sources are monitored for days or weeks. The data are analyzed in order to understand the importance of statistical analysis in modern physics.
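
    A small simulation, not part of the cited learning path, illustrates the statistical point: radioactive counts are Poisson-distributed, so the uncertainty of a measured background rate shrinks only with long monitoring times. All numbers below are illustrative.

```python
# Sketch: simulated hourly background counts over one week (Poisson statistics).
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(lam=4.2, size=24 * 7)   # hourly counts, one week
mean = counts.mean()
# For Poisson data the standard error of the mean rate is sqrt(mean / n).
print(f"rate = {mean:.2f} +/- {np.sqrt(mean / counts.size):.2f} counts/h")
```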

  10. ProbFAST: Probabilistic Functional Analysis System Tool

    Directory of Open Access Journals (Sweden)

    Oliveira Thiago YK

    2010-03-01

    Full Text Available Abstract Background The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression initially designed for paired analysis. Results We proposed a web platform named ProbFAST for MDE analysis which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method performs better than other approaches, and when applied to public expression data it demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of sets of genes that are turned on and off, together with related functional information, during tumor evolution or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  11. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain, and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
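
    The following sketch illustrates the principle with synthetic fields and low-order 2-D Fourier magnitudes standing in for the paper's Zernike/Fourier descriptors: two full-field maps are reduced to a few dozen descriptors, which can then be compared quantitatively.

```python
# Sketch: compare two strain-like fields via a handful of Fourier descriptors.
import numpy as np

def descriptors(field, k=5):
    F = np.fft.fft2(field)
    return np.abs(F[:k, :k]).ravel()   # k*k low-frequency magnitudes

y, x = np.mgrid[0:128, 0:128] / 128.0
experiment = np.sin(2 * np.pi * x) * y          # stand-in for a measured map
model = experiment + 0.01 * np.random.default_rng(1).normal(size=(128, 128))

d_exp, d_mod = descriptors(experiment), descriptors(model)
rel_err = np.linalg.norm(d_exp - d_mod) / np.linalg.norm(d_exp)
print(f"{d_exp.size} descriptors, relative discrepancy {rel_err:.3%}")
```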

  12. Analysis of extensively washed hair from cocaine users and drug chemists to establish new reporting criteria.

    Science.gov (United States)

    Morris-Kukoski, Cynthia L; Montgomery, Madeline A; Hammer, Rena L

    2014-01-01

    Samples from a self-proclaimed cocaine (COC) user, from 19 drug users (postmortem) and from 27 drug chemists were extensively washed and analyzed for COC, benzoylecgonine, norcocaine (NC), cocaethylene (CE) and aryl hydroxycocaines by liquid chromatography-tandem mass spectrometry. Published wash criteria and cutoffs were applied to the results. Additionally, the data were used to formulate new reporting criteria and interpretation guidelines for forensic casework. Applying the wash and reporting criteria, hair that was externally contaminated with COC was distinguished from hair collected from individuals known to have consumed COC. In addition, CE, NC and hydroxycocaine metabolites were only present in COC users' hair and not in drug chemists' hair. When properly applied, the use of an extended wash, along with the reporting criteria defined here, will exclude false-positive results from environmental contact with COC.

  13. DYNAMICS ANALYSIS OF SPECIAL STRUCTURE OF MILLING-HEAD MACHINE TOOL

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The milling-head machine tool is a sophisticated, high-quality machine tool whose spindle system is built from a special multi-element structure. Two particular mechanical configurations degrade its cutting performance. The first is the milling-head spindle, which is supported on two sets of complex bearings; the dynamic rigidity of the milling-head structure is studied on a digital prototype with finite element analysis (FEA) and modal synthesis analysis (MSA) to identify the weak structures. The second is the ram structure from which the milling head hangs; it is studied to determine its dynamic performance when cutting at different ram extension positions. The analysis results for the spindle and ram are used to improve the mechanical configuration and structure in the design. Rebuilt with the modified structure, the machine tool achieves better dynamic rigidity than before.

  14. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Wenjie Tian

    2014-01-01

    Full Text Available Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy improvement by suitable measures, that is, component tolerancing in design, manufacturing, and assembly processes, and error compensation. The sensitivity analysis method is proposed, and the sensitivities of compensable and uncompensable pose accuracies are analyzed. The analysis results will be used for the precision design of the machine tool.

  15. PHASE ANALYSIS AS A TOOL OF PREESTIMATED ANALYSIS OF THE ACTIVITY OF A MULTIFUNCTIONAL CENTER

    Directory of Open Access Journals (Sweden)

    Kovaleva K. A.

    2015-03-01

    Full Text Available The article is devoted to phase analysis as a tool for the preliminary analysis of the activity of a multifunctional center. It considers the time series of the daily number of requests received, on the basis of the phase portraits of these time series; the series exhibit strong cyclic and periodic properties. Practice has shown that under modern conditions, for example for the Russian economy with its instability and financial crises, classical economic theory and statistics built on linear models have proven unproductive. A review of approaches and economic-mathematical methods for the preliminary analysis of evolutionary economic processes and the corresponding time series allows the following conclusion: no universal method of analysis and forecasting exists that satisfies all requirements and is free of shortcomings. Each approach and each method has its advantages, disadvantages, and limits of use. Most known forecasting methods exploit the cyclic and periodic properties detected in the time series under consideration. Thus, the very presence of pronounced cyclicity at the different levels of the considered hierarchical model of the time series of request counts in a multifunctional center is an important indicator that an adequate predictive model of the number of requests can be constructed.
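
    A phase portrait of the kind described is simply a delay-embedded plot of the series against a lagged copy of itself. A minimal sketch with synthetic daily request counts (the weekly cycle and noise level are illustrative):

```python
# Sketch: delay-embedded phase portrait of a daily-requests time series.
import numpy as np
import matplotlib.pyplot as plt

t = np.arange(365)
rng = np.random.default_rng(2)
requests = 200 + 60 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 10, t.size)

lag = 1
plt.plot(requests[:-lag], requests[lag:], ".-", alpha=0.5)
plt.xlabel("x(t)")
plt.ylabel(f"x(t+{lag})")
plt.title("Phase portrait of daily request counts")
plt.show()
```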

  16. Analysis of spreadable cheese by Raman spectroscopy and chemometric tools.

    Science.gov (United States)

    Oliveira, Kamila de Sá; Callegaro, Layce de Souza; Stephani, Rodrigo; Almeida, Mariana Ramos; de Oliveira, Luiz Fernando Cappa

    2016-03-01

    In this work, FT-Raman spectroscopy was explored to evaluate spreadable cheese samples. A partial least squares discriminant analysis was employed to identify the spreadable cheese samples containing starch. To build the models, two types of samples were used: commercial samples and samples manufactured in local industries. The method of supervised classification PLS-DA was employed to classify the samples as adulterated or without starch. Multivariate regression was performed using the partial least squares method to quantify the starch in the spreadable cheese. The limit of detection obtained for the model was 0.34% (w/w) and the limit of quantification was 1.14% (w/w). The reliability of the models was evaluated by determining the confidence interval, which was calculated using the bootstrap re-sampling technique. The results show that the classification models can be used to complement classical analysis and as screening methods.
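
    PLS-DA as used in such studies is PLS regression onto a class label, with the continuous prediction thresholded for classification. A minimal sketch with synthetic spectra (not the paper's data or model) using scikit-learn:

```python
# Sketch: PLS-DA on synthetic "spectra" with an artificial starch band.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 300))          # 40 spectra, 300 wavenumber channels
y = np.repeat([0, 1], 20)               # 0 = no starch, 1 = starch added
X[y == 1, 100:110] += 1.0               # a fake starch band

pls = PLSRegression(n_components=2).fit(X, y)
pred = (pls.predict(X).ravel() > 0.5).astype(int)   # threshold at 0.5
print(f"training accuracy: {(pred == y).mean():.2f}")
```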

  17. A tool for public analysis of scientific data

    Directory of Open Access Journals (Sweden)

    D Haglin

    2006-01-01

    Full Text Available The scientific method encourages sharing data with other researchers to independently verify conclusions. Currently, technical barriers impede such public scrutiny. A strategy for offering scientific data for public analysis is described. With this strategy, effectively no requirements of software installation (other than a web browser or data manipulation are imposed on other researchers to prepare for perusing the scientific data. A prototype showcasing this strategy is described.

  18. Development of the Expert System Domain Advisor and Analysis Tool

    Science.gov (United States)

    1991-09-01

    analysis. Typical of the current methods in use at this time is the "TAROT metric". This method defines a decision rule whose output is whether to go... The remainder of this record is OCR-damaged; its recoverable fragments describe the ESEM system chart with three risk-based decision points (the first at project initiation) and a table of evaluation factors for expert-system development, rated on a Poor/Fair scale, with the TAROT metric giving overall suitability.

  19. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
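
    The levelized cost at the heart of such a model is the ratio of discounted lifetime costs to discounted lifetime generation. A minimal sketch with illustrative inputs (these are not NETL's values):

```python
# Sketch: levelized cost of electricity = discounted costs / discounted energy.
def lcoe(capital, om_per_yr, fuel_per_yr, mwh_per_yr, rate, years):
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capital + sum((om_per_yr + fuel_per_yr) * d for d in disc)
    energy = sum(mwh_per_yr * d for d in disc)
    return costs / energy  # $/MWh

# Illustrative plant: $1.2B capital, $30M O&M/yr, $50M fuel/yr,
# 3.5 TWh/yr generation, 7% discount rate, 30-year life.
print(f"{lcoe(1.2e9, 3.0e7, 5.0e7, 3.5e6, 0.07, 30):.1f} $/MWh")
```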

  20. The bioimpedance analysis of a parenchyma of a liver in the conditions of its extensive resection in experiment

    Science.gov (United States)

    Agibalov, D. Y.; Panchenkov, D. N.; Chertyuk, V. B.; Leonov, S. D.; Astakhov, D. A.

    2017-01-01

    Liver failure, which results from a mismatch between the functional capacity of the liver and the demands of the organism, is the main reason for unsatisfactory outcomes of extensive liver resection. However, no single effective criterion for grading the degree of liver failure has yet been developed. One method for obtaining data on the morpho-functional state of internal organs is bioimpedance analysis (BIA), based on measuring the impedance (total electrical resistance) of biological tissue. Impedance measurements are used in medicine and biology to characterize the physical properties of living tissue and to study changes related to its functional state and structural features. Under experimental conditions we performed an extensive liver resection on 27 white laboratory rats of the Wistar line. Intraoperative and postoperative bioimpedance data were compared with the main existing methods for assessing the functional state of the liver. Based on the results of this work it can be stated that bioimpedance analysis of the liver, using invasive impedance measurement, makes it possible to assess the morphological features and functional activity of the liver before an extensive resection is performed. The data obtained in this work provide experimental justification for using impedance measurement in the complex assessment of the liver's functional reserves. Preliminary data from clinical trials at the introduction stage of the technique suggest that bioimpedance measurement is quite informative. Further accumulation of clinical data is required for a subsequent analysis of the efficiency of invasive bioimpedance analysis of the liver; even at this stage, however, the method has shown promise for further use in clinical hepatic surgery.

  1. Rapid Prototyping of Hyperspectral Image Analysis Algorithms for Improved Invasive Species Decision Support Tools

    Science.gov (United States)

    Bruce, L. M.; Ball, J. E.; Evangilista, P.; Stohlgren, T. J.

    2006-12-01

    Nonnative invasive species adversely impact ecosystems, causing loss of native plant diversity, species extinction, and impairment of wildlife habitats. As a result, over the past decade federal and state agencies and nongovernmental organizations have begun to work more closely together to address the management of invasive species. In 2005, approximately 500M dollars was budgeted by U.S. Federal Agencies for the management of invasive species. Despite extensive expenditures, most of the methods used to detect and quantify the distribution of these invaders are ad hoc, at best. Likewise, decisions on the type of management techniques to be used or evaluation of the success of these methods are typically non-systematic. More efficient methods to detect or predict the occurrence of these species, as well as the incorporation of this knowledge into decision support systems, are greatly needed. In this project, rapid prototyping capabilities (RPC) are utilized for an invasive species application. More precisely, our recently developed analysis techniques for hyperspectral imagery are being prototyped for inclusion in the national Invasive Species Forecasting System (ISFS). The current ecological forecasting tools in ISFS will be compared to our hyperspectral-based invasives prediction algorithms to determine if/how the newer algorithms enhance the performance of ISFS. The PIs have researched the use of remotely sensed multispectral and hyperspectral reflectance data for the detection of invasive vegetative species. As a result, the PI has designed, implemented, and benchmarked various target detection systems that utilize remotely sensed data. These systems have been designed to make decisions based on a variety of remotely sensed data, including high spectral/spatial resolution hyperspectral signatures (1000's of spectral bands, such as those measured using ASD handheld devices), moderate spectral/spatial resolution hyperspectral images (100's of spectral bands, such

  2. Time-frequency tools of signal processing for EISCAT data analysis

    Directory of Open Access Journals (Sweden)

    J. Lilensten

    Full Text Available We demonstrate the usefulness of some signal-processing tools for EISCAT data analysis. These tools are somewhat less classical than the familiar periodogram (the squared modulus of the Fourier transform), and therefore not as commonly used in our community. The first is a stationary analysis, "Thomson's estimate" of the power spectrum. The other two belong to time-frequency analysis: the short-time Fourier transform with the spectrogram, and the wavelet analysis via the scalogram. Because of the highly non-stationary character of our geophysical signals, the latter two tools are better suited for this analysis. Their results are compared with both a synthetic signal and EISCAT ion-velocity measurements. We show that they help to discriminate patterns such as gravity waves from noise.
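
    Of the three tools, the spectrogram is the most readily reproduced. A minimal sketch with SciPy on a synthetic non-stationary (chirp-like) signal; the sampling rate and window length are illustrative:

```python
# Sketch: short-time Fourier transform / spectrogram of a non-stationary signal.
import numpy as np
from scipy.signal import spectrogram

fs = 100.0                                      # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
sig = np.sin(2 * np.pi * (1 + 0.05 * t) * t)    # frequency drifts with time

f, tau, Sxx = spectrogram(sig, fs=fs, nperseg=256)
print(Sxx.shape)  # (frequencies, time windows); plot with pcolormesh to inspect
```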

  3. Type Safe Extensible Programming

    Science.gov (United States)

    Chae, Wonseok

    2009-10-01

    Software products evolve over time. Sometimes they evolve by adding new features, and sometimes by either fixing bugs or replacing outdated implementations with new ones. When software engineers fail to anticipate such evolution during development, they will eventually be forced to re-architect or re-build from scratch. Therefore, it has been common practice to prepare for changes so that software products are extensible over their lifetimes. However, making software extensible is challenging because it is difficult to anticipate successive changes and to provide adequate abstraction mechanisms over potential changes. Such extensibility mechanisms, furthermore, should not compromise any existing functionality during extension. Software engineers would benefit from a tool that provides a way to add extensions in a reliable way. It is natural to expect programming languages to serve this role. Extensible programming is one effort to address these issues. In this thesis, we present type safe extensible programming using the MLPolyR language. MLPolyR is an ML-like functional language whose type system provides type-safe extensibility mechanisms at several levels. After presenting the language, we will show how these extensibility mechanisms can be put to good use in the context of product line engineering. Product line engineering is an emerging software engineering paradigm that aims to manage variations, which originate from successive changes in software.

  4. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Eric Moyer

    2016-04-01

    Full Text Available Network analysis can make variant analysis better. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image, ami-4510312f.

  5. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  6. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    Directory of Open Access Journals (Sweden)

    Peter J.A. Cock

    2013-09-01

    Full Text Available The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).
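
    Outside Galaxy, the underlying BLAST+ step that these wrappers expose can be scripted directly. A minimal sketch (file names and database name are placeholders; NCBI BLAST+ must be installed and on the PATH, with the database pre-built using makeblastdb):

```python
# Sketch: the kind of BLAST+ command a Galaxy wrapper runs, invoked directly.
import subprocess

subprocess.run(
    [
        "blastp",
        "-query", "candidate_effectors.fasta",   # hypothetical input file
        "-db", "reference_proteins",             # hypothetical database name
        "-evalue", "1e-5",
        "-outfmt", "6",                          # tabular output
        "-out", "hits.tsv",
    ],
    check=True,
)
```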

  7. Could wastewater analysis be a useful tool for China? A review

    Institute of Scientific and Technical Information of China (English)

    Jianfa Gao; Jake O'Brien; Foon Yin Lai; Alexander L.N. van Nuijis; Jun He; Jochen F. Mueller; Jingsha Xu

    2015-01-01

    Analysing wastewater samples is an innovative approach that overcomes many limitations of traditional surveys in identifying and measuring a range of chemicals that people living in a sewer catchment area consumed or were exposed to. First conceptualised in 2001, much progress has been made to make wastewater analysis (WWA) a reliable and robust tool for measuring chemical consumption and/or exposure. At the moment, the most popular application of WWA, sometimes referred to as sewage epidemiology, is to monitor the consumption of illicit drugs in communities around the globe, including China. The approach has been largely adopted by law enforcement agencies as a device to monitor the temporal and geographical patterns of drug consumption. In the future, the methodology can be extended to other chemicals, including biomarkers of population health (e.g. environmental or oxidative stress biomarkers, lifestyle indicators, or medications that are taken by different demographic groups) and pollutants that people are exposed to (e.g. polycyclic aromatic hydrocarbons, perfluorinated chemicals, and toxic pesticides). The extension of WWA to a huge range of chemicals may give rise to a field called sewage chemical-information mining (SCIM) with unexplored potential. China has many densely populated cities with thousands of sewage treatment plants, which are favourable for applying WWA/SCIM in order to help relevant authorities gather information about illicit drug consumption and population health status. However, there are some prerequisites and uncertainties of the methodology that should be addressed for SCIM to reach its full potential in China.

  8. MATING DESIGNS: HELPFUL TOOL FOR QUANTITATIVE PLANT BREEDING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Athanase Nduwumuremyi

    2013-12-01

    Full Text Available Selection of parental materials and good mating designs in conventional plant breeding is the key to a successful plant breeding programme. However, several factors affect the choice of mating design. A mating design is the procedure by which progenies are produced; in plant breeding, breeders and geneticists use different forms of mating designs and arrangements, both theoretically and practically, for targeted purposes. The choice of a mating design for estimating genetic variances should be dictated by the objectives of the study, time, space, cost, and other biological limitations. In all mating designs, individuals are taken at random and crossed to produce progenies that are related to each other as half-sibs or full-sibs. A form of multivariate analysis or the analysis of variance can be adopted to estimate the components of variance, as sketched below. Therefore, this review aims at highlighting the mating designs most used in plant breeding and genetics studies. It provides an easy and quick insight into the different forms of mating designs and some statistical components for successful plant breeding.
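
    As an illustration of the variance-component step mentioned above, here is a minimal sketch of the textbook half-sib estimate, where the additive genetic variance is roughly four times the between-family variance component from a balanced one-way ANOVA. The data are simulated and the effect sizes are illustrative.

```python
# Sketch: half-sib estimate of additive genetic variance, V_A ~ 4 * sigma2_family.
import numpy as np

rng = np.random.default_rng(4)
families, n = 30, 10
fam_effect = rng.normal(0, 1.0, families)                    # sire effects
y = fam_effect[:, None] + rng.normal(0, 2.0, (families, n))  # progeny values

ms_between = n * y.mean(axis=1).var(ddof=1)    # E[MSB] = sigma2_w + n*sigma2_f
ms_within = y.var(axis=1, ddof=1).mean()       # pooled within-family variance
var_family = (ms_between - ms_within) / n
print(f"V_A estimate: {4 * var_family:.2f}")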

  9. Shell directions as a tool in palaeocurrent analysis

    Science.gov (United States)

    Wendt, Jobst

    1995-03-01

    Conical shells (mostly orthoconic nautiloids, locally gastropods and rugose corals) were used to determine current directions in Ludlovian to upper Famennian cephalopod limestones in the eastern Anti-Atlas of Morocco, the Ougarta Aulacogen and the Ahnet Basin (both Algeria). Data plots based on 50,413 measurements from 217 localities document rather consistent current patterns which show only minor variations through subsequent intervals. A conspicuous feature is the currents derived from pelagic platforms and directed towards adjacent basins. Shell accumulations decrease markedly towards platform margins, yielding less distinctive information on current directions, which, due to the lack of shells, cannot be established in the basins proper. Orientation patterns of styliolinids show such a puzzling variation in adjacent samples that their use for current analysis is doubtful. The same is true for the presumed down-stream position of goniatite apertures, which shows a highly variable pattern rarely consistent with that of concomitant orthoconic nautiloids. The direction of orthocones in cephalopod limestones onlapping lower Givetian mud mounds and ridges in the Ahnet Basin of Algeria shows a radial pattern, which is the result of mere gravitational deposition of shells on the steep slopes of these buildups. Apart from this exception, the applicability of conical shells for current analysis is confirmed.

  10. Data analysis tools for 3D dosimetry: the use of CERR as a platform to integrate and compare measurements and treatment planning information

    Energy Technology Data Exchange (ETDEWEB)

    Deasy, Joe; Apte, Aditya, E-mail: jdeasy@radonc.wustl.ed [Department of Radiation Oncology, Washington University in St. Louis, 4921 Parkview Place, St. Louis, MO 63110 (United States)

    2010-11-01

    CERR, the Computational Environment for Radiotherapy Research, is a mature Matlab-based application that allows users to visualize and analyze 3D treatment planning data exported using standard protocols from clinical treatment planning systems. In this presentation we will give an in-depth discussion of the use of CERR as a tool to compare measurements against the predictions of treatment planning systems. Extensions to CERR allow for straightforward import and registration of experimental data with the planning data. These tools allow users to compare the match between measurement and treatment planning calculation in detail, using profile plots and other tools. Custom Matlab scripts can also be developed, providing complete flexibility in analysis methods. In addition, several offshoot tools have been developed by our group to facilitate dosimetric data analysis, including a film QA tool and a Monte Carlo recalculation tool, both developed under a contract for the Radiological Physics Center (RPC). The film QA tool is meant to facilitate the analysis of film that is irradiated in a phantom. The tool provides a simple method for registering pin-marked points on film to corresponding points in a CT-scanned phantom. Similarly, the locations of point dosimeters can be found. Once registered, data can be compared with the expected treatment plan, interpolated from the converted CERR plan. The dose-distance gamma function is available to quantify agreement. We will discuss the ways these tools can be used to support dosimetry research. All the software discussed here is being made available under open-source licensing.
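
    The dose-distance gamma function mentioned above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1-D sketch follows; this is not CERR's Matlab implementation, and the 3%/3 mm criteria and profiles are illustrative.

```python
# Sketch: 1-D gamma index between a measured and a reference dose profile.
import numpy as np

def gamma_1d(x, measured, reference, dd=0.03, dta=3.0):
    # For each measured point, minimise over reference points the combined
    # distance-to-agreement / dose-difference metric.
    dist2 = ((x[:, None] - x[None, :]) / dta) ** 2
    dose2 = ((measured[:, None] - reference[None, :]) / (dd * reference.max())) ** 2
    return np.sqrt(dist2 + dose2).min(axis=1)

x = np.linspace(0, 100, 201)                      # position, mm
ref = np.exp(-((x - 50) / 20) ** 2)               # planned profile
meas = np.exp(-((x - 51) / 20) ** 2) * 1.01       # shifted, scaled measurement
g = gamma_1d(x, meas, ref)
print(f"gamma pass rate (gamma <= 1): {(g <= 1).mean():.1%}")
```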

  11. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  12. Whole Genome Analysis of 132 Clinical Saccharomyces cerevisiae Strains Reveals Extensive Ploidy Variation

    Science.gov (United States)

    Zhu, Yuan O.; Sherlock, Gavin; Petrov, Dmitri A.

    2016-01-01

    Budding yeast has undergone several independent transitions from commercial to clinical lifestyles. The frequency of such transitions suggests that clinical yeast strains are derived from environmentally available yeast populations, including commercial sources. However, despite their important role in adaptive evolution, the prevalence of polyploidy and aneuploidy has not been extensively analyzed in clinical strains. In this study, we have looked for patterns governing the transition to clinical invasion in the largest screen of clinical yeast isolates to date. In particular, we have focused on the hypothesis that ploidy changes have influenced adaptive processes. We sequenced 144 yeast strains, 132 of which are clinical isolates. We found pervasive large-scale genomic variation in both overall ploidy (34% of strains identified as 3n/4n) and individual chromosomal copy numbers (36% of strains identified as aneuploid). We also found evidence for the highly dynamic nature of yeast genomes, with 35 strains showing partial chromosomal copy number changes and eight strains showing multiple independent chromosomal events. Intriguingly, a lineage identified to be baker’s/commercial derived with a unique damaging mutation in NDC80 was particularly prone to polyploidy, with 83% of its members being triploid or tetraploid. Polyploidy was in turn associated with a >2× increase in aneuploidy rates as compared to other lineages. This dataset provides a rich source of information on the genomics of clinical yeast strains and highlights the potential importance of large-scale genomic copy variation in yeast adaptation. PMID:27317778

  13. Extension of monodimensional fuel performance codes to finite strain analysis using a Lagrangian logarithmic strain framework

    Energy Technology Data Exchange (ETDEWEB)

    Helfer, Thomas

    2015-07-15

    Highlights: • A simple extension of standard monodimensional fuel performance codes to finite strain is proposed. • Efficiency and reliability are demonstrated. • The logarithmic strain framework proposed by Miehe et al. is introduced and discussed. - Abstract: This paper shows how the Lagrangian logarithmic strain framework proposed by Miehe et al. can be used to extend monodimensional fuel performance codes, written in the framework of the infinitesimal strain theory, to cope with large deformations of the cladding, such as the ones observed in reactivity initiated accidents (RIA) or loss-of-coolant accidents (LOCA). We demonstrate that the changes only concern the mechanical behaviour integration step, through a straightforward modification of the strains (inputs) and the stress (result). The proposed procedure has been implemented in the open-source MFront code generator developed within the PLEIADES platform to handle mechanical behaviours. Using the Alcyone performance code, we apply this procedure to a simulation case proposed within the framework of a recent benchmark on fuel performance codes by the OECD/NEA.
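
    At the heart of the framework is the Hencky (logarithmic) strain computed from the deformation gradient F, E = 1/2 * log(F^T F). A minimal numerical sketch (the example F, a stretch with shear, is illustrative and unrelated to the paper's cladding simulations):

```python
# Sketch: logarithmic (Hencky) strain tensor from a deformation gradient.
import numpy as np
from scipy.linalg import logm

F = np.array([[1.2, 0.1, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.9]])
C = F.T @ F                 # right Cauchy-Green tensor (symmetric positive definite)
E_log = 0.5 * logm(C)       # logarithmic strain tensor
print(np.round(E_log, 4))
```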

  14. Comparative analysis on arthroscopic sutures of large and extensive rotator cuff injuries in relation to the degree of osteopenia

    Directory of Open Access Journals (Sweden)

    Alexandre Almeida

    2015-02-01

    Full Text Available OBJECTIVE: To analyze the results from arthroscopic suturing of large and extensive rotator cuff injuries, according to the patient's degree of osteopenia. METHOD: 138 patients who underwent arthroscopic suturing of large and extensive rotator cuff injuries between 2003 and 2011 were analyzed. Those operated from October 2008 onwards formed a prospective cohort, while the remainder formed a retrospective cohort. Also from October 2008 onwards, bone densitometry evaluation was requested at the time of the surgical treatment. For the patients operated before this date, densitometry examinations performed up to two years before or after the surgical treatment were investigated. The patients were divided into three groups. Those with osteoporosis formed group 1 (n = 16); those with osteopenia, group 2 (n = 33); and normal individuals, group 3 (n = 55). RESULTS: In analyzing the University of California at Los Angeles (UCLA) scores of group 3 and comparing them with group 2, no statistically significant difference was seen (p = 0.070). Analysis on group 3 in comparison with group 1 showed a statistically significant difference (p = 0.027). CONCLUSION: The results from arthroscopic suturing of large and extensive rotator cuff injuries seem to be influenced by the patient's bone mineral density, as assessed using bone densitometry.

  15. Betweenness as a Tool of Vulnerability Analysis of Power System

    Science.gov (United States)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-12-01

    Complex network theory finds application in the analysis of power grids, as the two share some common characteristics. Using this theory, critical elements in a power network can be identified. As the vulnerabilities of the elements of the network determine the vulnerability of the total network, in this paper the vulnerability of each element is studied using two complex network models: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines and the PTDF matrix are included. The standard IEEE 57-bus system has been studied using the above-mentioned indices, and the resulting conclusions are discussed.
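
    Topological betweenness of the kind used here is readily computed with networkx. A minimal sketch on a toy grid follows; the extended (electrical) betweenness of the paper would additionally fold in line limits and the PTDF matrix.

```python
# Sketch: topological betweenness centrality of buses in a toy grid.
import networkx as nx

grid = nx.Graph()
grid.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 5), (2, 5), (5, 6)])

centrality = nx.betweenness_centrality(grid)
for bus, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"bus {bus}: {score:.3f}")   # higher score = more structurally critical
```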

  16. Towards understanding the lifespan extension by reduced insulin signaling: bioinformatics analysis of DAF-16/FOXO direct targets in Caenorhabditis elegans

    Science.gov (United States)

    Li, Yan-Hui; Zhang, Gai-Gai

    2016-01-01

    DAF-16, the C. elegans FOXO transcription factor, is an important determinant in aging and longevity. In this work, we manually curated FOXODB http://lyh.pkmu.cn/foxodb/, a database of FOXO direct targets. It now covers 208 genes. Bioinformatics analysis on 109 DAF-16 direct targets in C. elegans found interesting results. (i) DAF-16 and transcription factor PQM-1 co-regulate some targets. (ii) Seventeen targets directly regulate lifespan. (iii) Four targets are involved in lifespan extension induced by dietary restriction. And (iv) DAF-16 direct targets might play global roles in lifespan regulation. PMID:27027346

  17. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
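
    The simplest version of such a combination is the product of two independent Gaussian likelihoods, which yields an inverse-variance-weighted estimate more precise than either diagnostic alone; the actual tool uses full Bayesian forward models and Markov Chain Monte Carlo. A sketch with illustrative numbers:

```python
# Sketch: combining two independent Gaussian measurements of the same T_e.
sxr_T, sxr_sig = 310.0, 40.0   # eV, from the soft x-ray analysis (illustrative)
ts_T, ts_sig = 290.0, 25.0     # eV, from Thomson scattering (illustrative)

w1, w2 = 1 / sxr_sig**2, 1 / ts_sig**2          # inverse-variance weights
T = (w1 * sxr_T + w2 * ts_T) / (w1 + w2)        # posterior mean
sig = (w1 + w2) ** -0.5                          # posterior standard deviation
print(f"combined T_e = {T:.0f} +/- {sig:.0f} eV")  # tighter than either input
```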

  18. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    and analysis system. A software to achieve this has been developed. Two developed supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models) have been extended...... rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.......A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...

  19. What reassurances do the community need regarding life extension? Evidence from studies of community attitudes and an analysis of film portrayals.

    Science.gov (United States)

    Underwood, Mair

    2014-04-01

    It is increasingly recognized that community attitudes impact on the research trajectory, entry, and reception of new biotechnologies. Yet biogerontologists have generally been dismissive of public concerns about life extension. There is some evidence that biogerontological research agendas have not been communicated effectively, with studies finding that most community members have little or no knowledge of life extension research. In the absence of knowledge, community members' attitudes may well be shaped by issues raised in popular portrayals of life extension (e.g., in movies). To investigate how popular portrayals of life extension may influence community attitudes, I conducted an analysis of 19 films depicting human life extension across different genres. I focussed on how the pursuit of life extension was depicted, how life extension was achieved, the levels of interest in life extension shown by characters in the films, and the experiences of extended life depicted both at an individual and societal level. This paper compares the results of this analysis with the literature on community attitudes to life extension and makes recommendations about the issues in which the public may require reassurance if they are to support and accept life extension technologies.

  20. Policy Analysis: A Tool for Setting District Computer Use Policy. Paper and Report Series No. 97.

    Science.gov (United States)

    Gray, Peter J.

    This report explores the use of policy analysis as a tool for setting computer use policy in a school district by discussing the steps in the policy formation and implementation processes and outlining how policy analysis methods can contribute to the creation of effective policy. Factors related to the adoption and implementation of innovations…

  1. A structured approach to forensic study of explosions: The TNO Inverse Explosion Analysis tool

    NARCIS (Netherlands)

    Voort, M.M. van der; Wees, R.M.M. van; Brouwer, S.D.; Jagt-Deutekom, M.J. van der; Verreault, J.

    2015-01-01

    Forensic analysis of explosions consists of determining the point of origin, the explosive substance involved, and the charge mass. Within the EU FP7 project Hyperion, TNO developed the Inverse Explosion Analysis (TNO-IEA) tool to estimate the charge mass and point of origin based on observed damage

  2. Computational tool for morphological analysis of cultured neonatal rat cardiomyocytes.

    Science.gov (United States)

    Leite, Maria Ruth C R; Cestari, Idágene A; Cestari, Ismar N

    2015-08-01

    This study describes the development and evaluation of a semiautomatic myocyte edge-detector using digital image processing. The algorithm was developed in Matlab 6.0 using the SDC Morphology Toolbox. Its conceptual basis is the mathematical morphology theory together with the watershed and Euclidean distance transformations. The algorithm enables the user to select cells within an image for automatic detection of their borders and calculation of their surface areas; these areas are determined by adding the pixels within each myocyte's boundaries. The algorithm was applied to images of cultured ventricular myocytes from neonatal rats. The edge-detector allowed the identification and quantification of morphometric alterations in cultured isolated myocytes induced by 72 hours of exposure to a hypertrophic agent (50 μM phenylephrine). There was a significant increase in the mean surface area of the phenylephrine-treated cells compared with the control cells (p < 0.05), corresponding to cellular hypertrophy of approximately 50%. In conclusion, this edge-detector provides a rapid, repeatable and accurate measurement of cell surface areas in a standardized manner. Other possible applications include morphologic measurement of other types of cultured cells and analysis of time-related morphometric changes in adult cardiac myocytes.
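
    The detector's pipeline (threshold, Euclidean distance transform, watershed, per-cell areas) can be sketched in Python with scikit-image. This is not the authors' Matlab code; the input below is a synthetic image standing in for a myocyte micrograph.

```python
# Sketch: watershed-based cell segmentation and per-cell area measurement.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed
from skimage.measure import regionprops

# Synthetic image: two bright blobs standing in for two cells.
y, x = np.ogrid[:128, :128]
img = np.zeros((128, 128))
for cy, cx in [(40, 40), (80, 90)]:
    img += np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / 200.0)

binary = img > threshold_otsu(img)              # segment cells from background
distance = ndi.distance_transform_edt(binary)   # Euclidean distance transform
coords = peak_local_max(distance, min_distance=10, threshold_abs=1.0)
markers = np.zeros(binary.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
labels = watershed(-distance, markers, mask=binary)  # split touching cells

for region in regionprops(labels):              # per-cell surface area in pixels
    print(f"cell {region.label}: {region.area} px")
```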

  3. Job analysis and student assessment tool: perfusion education clinical preceptor.

    Science.gov (United States)

    Riley, Jeffrey B

    2007-09-01

    The perfusion education system centers on the cardiac surgery operating room and the perfusionist teacher who serves as a preceptor for the perfusion student. One method to improve the quality of perfusion education is to create a valid method for perfusion students to give feedback to clinical teachers. The preceptor job analysis consisted of a literature review and interviews with preceptors to list their critical tasks, critical incidents, and cognitive and behavioral competencies. Behaviorally anchored rating traits associated with the preceptors' tasks were identified. Students voted to validate the instrument items. The perfusion instructor rating instrument with a 0-4, "very weak" to "very strong" Likert rating scale was used. The five preceptor traits for student evaluation of clinical instruction (SECI) are as follows: The clinical instructor (1) encourages self-learning, (2) encourages clinical reasoning, (3) meets student's learning needs, (4) gives continuous feedback, and (5) represents a good role model. Scores from 430 student-preceptor relationships for 28 students rotating at 24 affiliate institutions with 134 clinical instructors were evaluated. The mean overall good preceptor average (GPA) was 3.45 +/- 0.76 and was skewed to the left, ranging from 0.0 to 4.0 (median = 3.8). Only 21 of the SECI relationships earned a GPA education program.

  4. IMPORTANCE-PERFORMANCE ANALYSIS AS A TOOL IN DESTINATION MARKETING

    Directory of Open Access Journals (Sweden)

    Eleina QIRICI

    2011-06-01

    Full Text Available The Korça Region is located in the Southeast of Albania and borders Greece and Macedonia to the South and the East. It is a mountainous region with two major lakes: Lake Ohrid, the oldest lake in Europe, which is shared with Macedonia, and Lake Prespa, which is shared with Greece and Macedonia (100 km2 in Albania). Over the last years there has been an increasing effort to improve tourist facilities and to attract the tourist market interested in open-air activities and relaxation in fresh, pure air. These demands can be met very well in the Korça destination, which is characterized by suitable climatic conditions and tourist services. A combination of town tourism and tourist-village development has helped the sustainability of Korça as a tourist destination in general. The main purpose of this paper is to present the use of importance-performance analysis in destination marketing for the development of tourism. Highlights: (1) the paper considers the multifarious goals of destination management; (2) a computer booking system is used by hotels and guest houses in the region; (3) the relationship between what a tourist wants to find in a destination and what is actually found there.
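
    The output of an importance-performance analysis is a quadrant plot of attribute importance against performance, split at the grand means. A minimal sketch with illustrative survey scores (the attributes and values are hypothetical, not the paper's data):

```python
# Sketch: importance-performance quadrant plot from mean survey scores.
import matplotlib.pyplot as plt

attrs = {"accommodation": (4.5, 3.8), "food": (4.2, 4.3),
         "access roads": (4.0, 2.9), "nightlife": (2.8, 3.1)}
imp, perf = zip(*attrs.values())     # (importance, performance) pairs

fig, ax = plt.subplots()
ax.scatter(perf, imp)
for name, (i, p) in attrs.items():
    ax.annotate(name, (p, i))
ax.axhline(sum(imp) / len(imp), ls="--")    # importance grand mean
ax.axvline(sum(perf) / len(perf), ls="--")  # performance grand mean
ax.set_xlabel("performance")
ax.set_ylabel("importance")
plt.show()
```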

  5. Funtools: Fits Users Need Tools for Quick, Quantitative Analysis

    Science.gov (United States)

    Mandel, Eric; Brederkamp, Joe (Technical Monitor)

    2001-01-01

    The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

  6. ANALYSIS OF USING EFFICIENT LOGGING TOOLS AT PT. PURWA PERMAI IN CENTRAL KALIMANTAN

    Directory of Open Access Journals (Sweden)

    Sona Suhartana

    2008-06-01

    Full Text Available A high log demand that often exceeds supply capability should be met by using appropriate logging tools. The numerous kinds and types of logging tools require careful planning of their utilization; a number of tools greater or fewer than what is actually needed can be disadvantageous for a company. In relation to these aspects, a study was carried out at a timber estate in Central Kalimantan in 2007. The aim of the study was to find an efficient number of tools for logging in a timber estate. The analysis was based on the target and realization of the company's log production. The results revealed that: (1) the optimum number of logging tools depended on the production target, i.e. 41 chainsaws for felling, 42 farm tractors for skidding, 9 loaders for loading and unloading, and 36 trucks for transportation; (2) the number of logging tools observed across all activities in the field was fewer than that obtained from the analysis based on production target and realization. This condition indicated that the number of logging tools used in the company was not yet efficient.

  7. Designing budgeting tool and sensitivity analysis for a start-up. : Case: Witrafi Oy

    OpenAIRE

    Arafath, Muhammad

    2014-01-01

    This study presents a thesis on the topic of designing a budgeting tool and sensitivity analysis for the commissioning company. The commissioning company is a Finnish start-up currently focusing on developing Intelligent Transport Systems using a network-based parking system. The aim of this thesis is to provide a ready-made budgeting tool that the commissioning company can use for its own purposes. This is a product-oriented thesis and it includes five project tasks. Pro...

  8. The analysis of the functionality of modern systems, methods and scheduling tools

    Directory of Open Access Journals (Sweden)

    Abramov Ivan

    2016-01-01

    Full Text Available Calendar planning is a key tool for efficient management applied in many industries: power, oil & gas, metallurgy, and construction. As a result of the growing complexity of projects and the arising need to improve their efficiency, a large number of software tools for high-quality calendar planning have appeared. Construction companies face the challenge of optimally selecting such tools (programs) for the distribution of limited resources over time. The article provides an analysis of the main software packages and the capabilities they offer for improving project implementation efficiency.

  9. An Intelligent Tool to support Requirements Analysis and Conceptual Design of Database Design

    Institute of Scientific and Technical Information of China (English)

    王能斌; 刘海青

    1991-01-01

    As an application of artificial intelligence and expert system technology to database design, this paper presents an intelligent design tool NITDT, which comprises a requirements specification language NITSL, a knowledge representation language NITKL, and an inference engine with uncertainty reasoning capability. NITDT now covers the requirements analysis and conceptual design stages of database design. However, it can be integrated with another database design tool, NITDBA, also developed at NIT, to become an integrated design tool supporting the whole process of database design.

  10. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  11. The Fuzzy Cluster Analysis in Identification of Key Temperatures in Machine Tool

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The thermal-induced error is a very important source of machining errors of machine tools. To compensate for the thermal-induced machining errors, a relationship model between the thermal field and the deformations is needed. The relationship can be deduced by virtue of FEM (Finite Element Method), ANN (Artificial Neural Network) or MRA (Multiple Regression Analysis). MRA is based on a thorough understanding of the temperature distribution of the machine tool. Although the more the temperatures measu...

  12. SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale

    Science.gov (United States)

    2014-06-01

    SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale. Mark Thomas, Leigh Metcalf, Jonathan Spring, Paul Krystosek, Katherine Prevost. netsa... Sampling can make the problem manageable, but sampling unacceptably reduces the fidelity of analytic conclusions. In this paper we discuss SiLK, a tool suite created to analyze this high-volume data source without sampling. SiLK implementation and architectural design are optimized to manage this Big Data

  13. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole;

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations (ACE) algorithm.

  14. TAPPS Release 1: Plugin-Extensible Platform for Technical Analysis and Applied Statistics

    Directory of Open Access Journals (Sweden)

    Justin Sam Chew

    2016-01-01

    Full Text Available We present the first release of TAPPS (Technical Analysis and Applied Statistics System), a Python implementation of a thin software platform aimed at technical analyses and applied statistics. The core of TAPPS is a container for 2-dimensional data frame objects and a TAPPS command language. The TAPPS language is not meant to be a programming language for script and plugin development but is intended for operational purposes. In this aspect, the TAPPS language takes on the flavor of SQL rather than R, resulting in a shallower learning curve. All analytical functions are implemented as plugins. This results in a well-defined plugin system, which enables rapid development and incorporation of analysis functions. TAPPS Release 1 is released under the GNU General Public License 3 for academic and non-commercial use. The TAPPS code repository can be found at http://github.com/mauriceling/tapps.

  15. Synchrotron radiation micro-X-ray fluorescence analysis: A tool to increase accuracy in microscopic analysis

    CERN Document Server

    Adams, F

    2003-01-01

    Microscopic X-ray fluorescence (XRF) analysis has potential for development as a certification method and as a calibration tool for other microanalytical techniques. The interaction of X-rays with matter is well understood, and modelling studies show excellent agreement between experimental data and calculations using Monte Carlo simulation. The method can be used for a direct iterative calculation of concentrations using available high-accuracy physical constants. Average accuracy is in the range of 3-5% for micron-sized objects at concentration levels of less than 1 ppm with focused radiation from SR sources. The end-station ID18F of the ESRF is dedicated to accurate quantitative micro-XRF analysis, including fast 2D scanning with collection of full X-ray spectra. Important aspects of the beamline are the precise monitoring of the intensity of the polarized, variable-energy beam and the high reproducibility of the set-up (measurement geometry and instrumental parameters) together with its long-term stability.

  16. Phylogenomic Analysis Reveals Extensive Phylogenetic Mosaicism in the Human GPCR Superfamily

    Directory of Open Access Journals (Sweden)

    Mathew Woodwark

    2007-01-01

    Full Text Available A novel high-throughput phylogenomic analysis (HTP) was applied to the rhodopsin G-protein coupled receptor (GPCR) family. Instances of phylogenetic mosaicism between receptors were found to be frequent, often as instances of correlated mosaicism and repeated mosaicism. A null data set was constructed with the same phylogenetic topology as the rhodopsin GPCRs. Comparison of the two data sets revealed that mosaicism was found in GPCRs at a higher frequency than would be expected from homoplasy or the effects of topology alone. Various evolutionary models of differential conservation, recombination and homoplasy are explored which could result in the patterns observed in this analysis. We find that the results are most consistent with frequent recombination events. A complex evolutionary history is illustrated in which frequent recombination has likely endowed GPCRs with new functions. The pattern of mosaicism is shown to be informative for functional prediction for orphan receptors. HTP analysis is complementary to conventional phylogenomic analyses, revealing mosaicism that would not otherwise have been detectable through conventional phylogenetics.

  17. The Digital Shoreline Analysis System (DSAS) Version 4.0 - An ArcGIS Extension for Calculating Shoreline Change

    Science.gov (United States)

    Thieler, E. Robert; Himmelstoss, Emily A.; Zichichi, Jessica L.; Ergul, Ayhan

    2009-01-01

    The Digital Shoreline Analysis System (DSAS) version 4.0 is a software extension to ESRI ArcGIS v.9.2 and above that enables a user to calculate shoreline rate-of-change statistics from multiple historic shoreline positions. A user-friendly interface of simple buttons and menus guides the user through the major steps of shoreline change analysis. Components of the extension and user guide include (1) instruction on the proper way to define a reference baseline for measurements, (2) automated and manual generation of measurement transects and metadata based on user-specified parameters, and (3) output of calculated rates of shoreline change and other statistical information. DSAS computes shoreline rates of change using four different methods: (1) endpoint rate, (2) simple linear regression, (3) weighted linear regression, and (4) least median of squares. The standard error, correlation coefficient, and confidence interval are also computed for the simple and weighted linear-regression methods. The results of all rate calculations are output to a table that can be linked to the transect file by a common attribute field. DSAS is intended to facilitate the shoreline change-calculation process and to provide rate-of-change information and the statistical data necessary to establish the reliability of the calculated results. The software is also suitable for any generic application that calculates positional change over time, such as assessing rates of change of glacier limits in sequential aerial photos, river edge boundaries, land-cover changes, and so on.

  18. Extension Analysis for Object-oriented Programming Technology

    Institute of Scientific and Technical Information of China (English)

    刘汉龙

    2001-01-01

    This article introduces extension analysis for object-oriented programming technology, discusses the “multi-value” nature of the object matter-element, and presents the transformation conditions of the object matter-element together with the rules for its decomposition and combination. The compatible and incompatible problems in computer development are examined through the DVI of the man-machine interface, and the concept of “replacement price” is presented. Taking word processing software as an example, the article shows that extension analysis applied to the optimization of object-oriented programming produces good results.

  19. ThermoData Engine: Extension to Solvent Design and Multi-component Process Stream Property Calculations with Uncertainty Analysis

    DEFF Research Database (Denmark)

    Diky, Vladimir; Chirico, Robert D.; Muzny, Chris

    ThermoData Engine (TDE, NIST Standard Reference Databases 103a and 103b) is the first product that implements the concept of Dynamic Data Evaluation in the fields of thermophysics and thermochemistry. It maintains a comprehensive and up-to-date database of experimentally measured property values together with an expert system for data analysis and for generation, on demand, of recommended property values at specified conditions along with their uncertainties. The most recent extension of TDE covers solvent design and multi-component process stream property calculations with uncertainty analysis. Solvent selection is made by best efficiency (depending on the task: solubility, selectivity, distribution coefficient, etc.) and by matching other requirements requested by the user. At the user’s request, efficiency criteria are evaluated based on experimental data for binary mixtures or on predictive models (UNIFAC...

  1. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  2. MEL-IRIS: An Online Tool for Audio Analysis and Music Indexing

    Directory of Open Access Journals (Sweden)

    Dimitrios Margounakis

    2009-01-01

    Full Text Available Chroma is an important attribute of music and sound, although it has not yet been adequately defined in the literature. As such, it can be used for further analysis of sound, resulting in interesting colorful representations that can be used in many tasks: indexing, classification, and retrieval. Especially in Music Information Retrieval (MIR), the visualization of chromatic analysis can be used for comparison, pattern recognition, melodic sequence prediction, and color-based searching. MEL-IRIS is the tool that has been developed to analyze audio files and characterize music based on chroma. The tool implements specially designed algorithms and a unique way of visualizing the results. The tool is network-oriented and can be installed on audio servers in order to handle large music collections. Several samples of world music have been tested and processed in order to demonstrate the possible uses of such an analysis.

  3. AN ANALYSIS OF THE CAUSES OF PRODUCT DEFECTS USING QUALITY MANAGEMENT TOOLS

    Directory of Open Access Journals (Sweden)

    Katarzyna MIDOR

    2014-10-01

    Full Text Available To maintain or strengthen its position on the market, a modern business needs to follow the principles of quality control in its actions. Especially important is the Zero Defects concept developed by Philip Crosby, which means flawless production. The concept consists in preventing the occurrence of defects and flaws in all production stages. To achieve that, we must, among other things, make use of quality management tools. This article presents an analysis of the causes of returns of damaged or faulty goods in the automotive industry by means of quality management tools such as the Ishikawa diagram and Pareto analysis, which allow the causes of product defectiveness to be identified. Based on the results, preventive measures have been proposed. The actions presented in this article and the results of the analysis demonstrate the effectiveness of these quality management tools.
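
    The Pareto step mentioned above is easy to illustrate. The sketch below ranks defect causes by frequency and flags the "vital few" that account for roughly 80% of returns; the cause names and counts are invented for the example, not taken from the article.

        # Pareto analysis: rank defect causes and flag the "vital few"
        # that account for ~80% of returns (all figures are hypothetical).
        defect_counts = {
            "paint scratches": 187,
            "loose connector": 102,
            "misaligned trim": 64,
            "missing fastener": 31,
            "wrong label": 16,
        }

        total = sum(defect_counts.values())
        cumulative = 0.0
        for cause, count in sorted(defect_counts.items(), key=lambda kv: -kv[1]):
            cumulative += count / total
            marker = "<- vital few" if cumulative <= 0.80 else ""
            print(f"{cause:18s} {count:4d} cum={cumulative:6.1%} {marker}")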

  4. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross-platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN, and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In the CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting points and side constraints for continuous as well as discrete design variables. Structural analysis modules, such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds, have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  5. Discriminant Analysis of a Spatially Extensive Landsliding Inventory for the Haida Gwaii, British Columbia, Canada

    Science.gov (United States)

    Sjogren, D.; Martin, Y. E.; Jagielko, L.

    2010-12-01

    Gimbarzevsky (1988) collected an exceptional landsliding inventory for Haida Gwaii, British Columbia (formerly called the Queen Charlotte Islands). This database includes more than 8 000 landsliding vectors, with an areal coverage of about 10 000 km2. Unfortunately, this landsliding inventory was never published in the refereed literature, despite its regional significance. The data collection occurred prior to the widespread use of GIS technologies in landsliding analysis, thus restricting the types of analyses that were undertaken at the time relative to what is possible today. Gimbarzevsky identified the landsliding events from 1:50 000 aerial photographs, and then transferred the landslide vectors to NTS map sheets. In this study, we digitized the landslide vectors from these original map sheets and connected each vector to a digital elevation model. Lengths of landslide vectors were then compared to the results of Rood (1984), whose landsliding inventory for Haida Gwaii relied on larger-scale aerial photographs (~1:13 000). A comparison of the two databases shows that Rood’s inventory contains a more complete record of smaller landslides, whereas Gimbarzevsky’s inventory provides a much better statistical representation of less frequently occurring, medium to large landslide events. We then apply discriminant analysis to the Gimbarzevsky database to assess which of a set of ten predictor variables, selected on the basis of mechanical theory, best predict failed vs. unfailed locations in the landscape (the grouping variable in discriminant analysis). Certain predictor variables may be cross-correlated, and any one particular variable may be related to several aspects of mechanical theory (for example, a particular variable may affect various components of shear stress and/or shear strength); it is important to recognize that the significance of particular groupings may reflect this information. Eight of the original variables were found...
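
    A minimal sketch of the kind of discriminant analysis described here, using scikit-learn on synthetic data: the two predictors and the failed/unfailed labels below are fabricated stand-ins for the study's ten mechanically motivated terrain variables.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        n = 500

        # Fabricated stand-ins for two of the ten terrain predictors, drawn
        # from class-conditional distributions for failed/unfailed sites.
        failed = (rng.random(n) < 0.4).astype(int)      # grouping variable
        slope = np.where(failed == 1, rng.normal(35, 6, n), rng.normal(24, 6, n))
        wetness = np.where(failed == 1, rng.normal(9, 2, n), rng.normal(7, 2, n))

        X = np.column_stack([slope, wetness])
        lda = LinearDiscriminantAnalysis().fit(X, failed)

        print("discriminant coefficients:", lda.coef_)  # relative predictor weight
        print("resubstitution accuracy:", lda.score(X, failed))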

  6. Reducing incomparability in multicriteria decision analysis: an extension of the ZAPROS method

    Directory of Open Access Journals (Sweden)

    Isabelle Tamanini

    2011-08-01

    Full Text Available The ZAPROS method belongs to the Verbal Decision Analysis framework, and it aims at solving decision-making problems in a more realistic way from the decision maker’s point of view. Quantitative methods can lead to loss of information when attempting to assign accurate measurements to verbal values. However, the verbal character of the method can decrease its comparison power, making cases of incomparability between alternatives unavoidable and leading to an unsatisfactory outcome for the problem. Considering this limitation, this work presents a methodological approach structured on the ZAPROS method to assist decision-making in Verbal Decision Analysis. The aim is to produce a complete result that is satisfactory to the decision maker through the reduction of the cases of incomparability between alternatives. The modifications were mainly applied to the process of comparing alternatives and did not change the computational complexity of the method.

  7. A study and extension of second-order blind source separation to operational modal analysis

    Science.gov (United States)

    Antoni, J.; Chauhan, S.

    2013-02-01

    Second-order blind source separation (SOBSS) has gained recent interest in operational modal analysis (OMA), since it is able to separate a set of system responses into modal coordinates from which the system poles can be extracted by single-degree-of-freedom techniques. In addition, SOBSS returns a mixing matrix whose columns are the estimates of the system mode shapes. The objective of this paper is threefold. First, a theoretical analysis of current SOBSS methods is conducted within the OMA framework and its precise conditions of applicability are established. Second, a new separation method is proposed that fixes current limitations of SOBSS: It returns estimate of complex mode shapes, it can deal with more active modes than the number of available sensors, and it shows superior performance in the case of heavily damped and/or strongly coupled modes. Third, a theoretical connection is drawn between SOBSS and stochastic subspace identification (SSI), which stands as one of the points of reference in OMA. All approaches are finally compared by means of numerical simulations.
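
    To make the second-order idea concrete, here is a minimal numpy sketch of AMUSE, the simplest SOBSS algorithm (whiten the responses, then diagonalize one symmetrized time-lagged covariance). It is a textbook variant given for illustration on toy data, not the new method proposed in the paper.

        import numpy as np

        def amuse(x, lag=1):
            """Second-order blind source separation (AMUSE): whiten the
            responses, then diagonalize one symmetrized lagged covariance.
            x: (n_channels, n_samples). Returns (unmixing W, sources W @ x)."""
            x = x - x.mean(axis=1, keepdims=True)
            d, e = np.linalg.eigh(np.cov(x))           # zero-lag covariance
            whiten = e @ np.diag(d ** -0.5) @ e.T
            z = whiten @ x
            c = z[:, lag:] @ z[:, :-lag].T / (z.shape[1] - lag)
            _, v = np.linalg.eigh((c + c.T) / 2.0)     # rotation completing W
            w = v.T @ whiten
            return w, w @ x

        # Toy demo: two decaying sinusoids ("modal coordinates"), mixed
        # linearly into two sensor responses, then separated again.
        t = np.linspace(0.0, 10.0, 4000)
        s = np.vstack([np.exp(-0.1 * t) * np.sin(5.0 * t),
                       np.exp(-0.3 * t) * np.sin(12.0 * t)])
        mixed = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s
        w, recovered = amuse(mixed)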

  8. Metabolomics as an extension of proteomic analysis: study of acute kidney injury.

    Science.gov (United States)

    Portilla, Didier; Schnackenberg, Laura; Beger, Richard D

    2007-11-01

    Although proteomics studies the global expression of proteins, metabolomics characterizes and quantifies their end products: the metabolites produced by an organism under a certain set of conditions. From this perspective it is apparent that proteomics and metabolomics are complementary and, when joined, allow a fuller appreciation of an organism's phenotype. Our studies using (1)H-nuclear magnetic resonance spectroscopic analysis showed the presence of glucose, amino acids, and tricarboxylic acid (TCA) cycle metabolites in the urine 48 hours after cisplatin administration. These metabolic alterations precede changes in serum creatinine. Biochemical studies confirmed the presence of glucosuria, but also showed the accumulation of nonesterified fatty acids and triglycerides in serum, urine, and kidney tissue, despite increased levels of plasma insulin. These metabolic alterations were ameliorated by the use of fibrates. We propose that the injury-induced metabolic profile may be used as a biomarker of cisplatin-induced nephrotoxicity. These studies serve to illustrate that metabolomic studies add insight into pathophysiology not provided by proteomic analysis alone.

  9. Analysis of global gene expression in Brachypodium distachyon reveals extensive network plasticity in response to abiotic stress.

    Directory of Open Access Journals (Sweden)

    Henry D Priest

    Full Text Available Brachypodium distachyon is a close relative of many important cereal crops. Abiotic stress tolerance has a significant impact on the productivity of agriculturally important food and feedstock crops. Analysis of the transcriptome of Brachypodium after chilling, high-salinity, drought, and heat stresses revealed diverse differential expression of many transcripts. Weighted Gene Co-Expression Network Analysis revealed 22 distinct gene modules with specific profiles of expression under each stress. Promoter analysis implicated short DNA sequences directly upstream of module members in the regulation of 21 of the 22 modules. Functional analysis of module members revealed enrichment in functional terms for 10 of the 22 network modules. Analysis of condition-specific correlations between differentially expressed gene pairs revealed extensive plasticity in the expression relationships of gene pairs. Photosynthesis, cell cycle, and cell wall expression modules were down-regulated by all abiotic stresses. Modules that were up-regulated by each abiotic stress fell into diverse and unique Gene Ontology (GO) categories. This study provides genomics resources and improves our understanding of abiotic stress responses of Brachypodium.
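
    The module-detection step can be sketched compactly. The snippet below follows the WGCNA-style recipe (correlation, soft-threshold adjacency, hierarchical clustering) on placeholder random data; the soft-threshold power of 6 is a conventional default rather than a value reported in this study, and real WGCNA additionally uses a topological overlap measure.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import squareform

        rng = np.random.default_rng(1)
        expr = rng.normal(size=(200, 12))   # 200 genes x 12 samples (placeholder)

        # 1) Co-expression similarity and soft-threshold adjacency (beta = 6).
        adjacency = np.abs(np.corrcoef(expr)) ** 6

        # 2) Convert to a dissimilarity and cluster genes hierarchically.
        dissim = 1.0 - adjacency
        np.fill_diagonal(dissim, 0.0)
        tree = linkage(squareform(dissim, checks=False), method="average")

        # 3) Cut the tree into candidate co-expression modules.
        modules = fcluster(tree, t=0.98, criterion="distance")
        print("candidate modules:", len(np.unique(modules)))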

  10. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    Science.gov (United States)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports high-level system trade studies on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to build in redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
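
    The correlation caveat in the last sentence can be made concrete with a short calculation: for two redundant units that each fail with probability p, and whose failure indicators have correlation rho, the probability that both fail is p^2 + rho*p*(1-p). The numbers below are hypothetical and are not drawn from SMART.

        def both_fail(p: float, rho: float) -> float:
            """P(both redundant units fail) for identically distributed
            Bernoulli failures with probability p and correlation rho."""
            return p * p + rho * p * (1.0 - p)

        p = 0.10                     # hypothetical per-unit failure probability
        for rho in (0.0, 0.3, 0.7, 1.0):
            print(f"rho={rho:3.1f}: system failure = {both_fail(p, rho):.3f}")
        # rho=0 gives 0.010 (full redundancy benefit);
        # rho=1 gives 0.100 (redundancy buys nothing).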

  11. Video Analysis and Modeling Tool for Physics Education: A workshop for Redesigning Pedagogy

    CERN Document Server

    Wee, Loo Kang

    2012-01-01

    This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables, and empowers teachers to be learners, so that we can be leaders in our teaching practice. In this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experiences with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.
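
    As a flavor of what such video analysis yields, the sketch below fits tracked ball positions to the constant-acceleration model y(t) = y0 + v0*t - (1/2)*g*t^2 and recovers g; the position data are synthesized for the example rather than exported from an actual Tracker session.

        import numpy as np

        # Synthetic "tracked" positions of a falling ball (seconds, metres),
        # generated from y = y0 - 0.5*g*t**2 (released from rest) plus noise.
        rng = np.random.default_rng(2)
        t = np.linspace(0.0, 0.45, 25)
        y = 1.2 - 0.5 * 9.81 * t**2 + rng.normal(0.0, 0.005, t.size)

        # Least-squares quadratic fit; the t**2 coefficient equals -g/2.
        c2, c1, c0 = np.polyfit(t, y, 2)
        print(f"estimated g = {-2 * c2:.2f} m/s^2 (true value 9.81)")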

  12. SDA-Based Diagnostic and Analysis Tools for Collider Run II

    CERN Document Server

    Papadimitriou, Vaia; Lebrun, Paul; Panacek, S; Slaughter, Anna Jean; Xiao, Aimin

    2005-01-01

    Operating and improving the understanding of the Fermilab Accelerator Complex for the colliding beam experiments requires advanced software methods and tools. The Shot Data Acquisition and Analysis (SDA) system has been developed to fulfill this need. Data are stored in a relational database and served to programs and users via Web-based tools. Summary tables are systematically generated during and after a store. These tables, the Supertable and the Recomputed Emittances and Recomputed Intensity tables, are discussed here. This information is also accessible in JAS3 (Java Analysis Studio version 3).

  13. Comprehensive analysis of RNA-Seq data reveals extensive RNA editing in a human transcriptome

    DEFF Research Database (Denmark)

    Peng, Zhiyu; Cheng, Yanbing; Tan, Bertrand Chin-Ming

    2012-01-01

    RNA editing is a post-transcriptional event that recodes hereditary information. Here we describe a comprehensive profile of the RNA editome of a male Han Chinese individual based on analysis of ∼767 million sequencing reads from poly(A)(+), poly(A)(-) and small RNA samples. We developed a computational pipeline that carefully controls for false positives while calling RNA editing events from genome and whole-transcriptome data of the same individual. We identified 22,688 RNA editing events in noncoding genes and introns, untranslated regions and coding sequences of protein-coding genes. Most changes (∼93%) converted A to I(G), consistent with known editing mechanisms based on adenosine deaminase acting on RNA (ADAR). We also found evidence of other types of nucleotide changes; however, these were validated at lower rates. We found 44 editing sites in microRNAs (miRNAs), suggesting a potential...

  14. Analysis and Extension of the PCA Method, Estimating a Noise Curve from a Single Image

    Directory of Open Access Journals (Sweden)

    Miguel Colom

    2016-12-01

    Full Text Available In the article 'Image Noise Level Estimation by Principal Component Analysis', S. Pyatykh, J. Hesser, and L. Zheng propose a new method to estimate the variance of the noise in an image from the eigenvalues of the covariance matrix of the overlapping blocks of the noisy image. Instead of using all the patches of the noisy image, the authors propose an iterative strategy to adaptively choose the optimal set containing the patches with the lowest variance. Although the method measures uniform Gaussian noise, it can easily be adapted to deal with signal-dependent noise, which is realistic for the Poisson noise model of a CMOS or CCD device in a digital camera.
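
    A minimal numpy sketch of the core idea, without the authors' iterative patch selection: collect overlapping blocks, form their covariance matrix, and read the noise variance off the smallest eigenvalue, which is dominated by noise rather than image structure.

        import numpy as np

        def estimate_noise_std(img, block=5):
            """Estimate additive Gaussian sigma from the smallest eigenvalue
            of the covariance of overlapping blocks (core idea only; the
            published method adds iterative selection of low-variance patches)."""
            patches = np.lib.stride_tricks.sliding_window_view(img, (block, block))
            x = patches.reshape(-1, block * block)
            cov = np.cov(x, rowvar=False)
            eig = np.linalg.eigvalsh(cov)          # ascending order
            return float(np.sqrt(max(eig[0], 0.0)))

        # Demo: a smooth ramp image plus noise of known standard deviation.
        rng = np.random.default_rng(3)
        clean = 100.0 * np.outer(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
        noisy = clean + rng.normal(0.0, 4.0, clean.shape)
        print(f"estimated sigma = {estimate_noise_std(noisy):.2f} (true 4.0)")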

  15. Extensions to Regret-based Decision Curve Analysis: An application to hospice referral for terminal patients

    Directory of Open Access Journals (Sweden)

    Tsalatsanis Athanasios

    2011-12-01

    Full Text Available Abstract Background Despite the well-documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or delayed. Decision systems to improve the hospice referral process are sorely needed. Methods We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. To enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. Results The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. Conclusions We present a theoretical...
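
    The threshold-probability step can be sketched numerically. If B is the (hypothetical) benefit of referring a patient whose prognosis is truly terminal and H is the harm of referring one who would have survived, indifference occurs at the probability p_t where p_t*B = (1-p_t)*H. This toy calculation illustrates the threshold concept only, not the authors' regret DCA implementation.

        def threshold_probability(benefit: float, harm: float) -> float:
            """Probability of the adverse outcome at which the decision maker
            is indifferent between acting and not acting: p*B = (1-p)*H."""
            return harm / (benefit + harm)

        # Hypothetical elicited values: referral is judged to help four times
        # more when the prognosis is truly terminal than it hurts otherwise.
        p_t = threshold_probability(benefit=4.0, harm=1.0)
        print(f"refer to hospice when P(death) exceeds {p_t:.2f}")  # 0.20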

  16. Tools for analysis and scenario-development for climate impacts applications from large multi-model ensembles data distributions

    Science.gov (United States)

    Ammann, C. M.; Vigh, J. L.; Brown, B.; Gilleland, E.; Fowler, T.; Kaatz, L.; Buja, L.; Rood, R. B.; Barsugli, J. J.; Guentchev, G.

    2013-12-01

    For climate change impact applications to properly incorporate the current state of science and its associated broad range of data, good intentions are commonly confronted by the realities of interoperability issues, by the lack of transparency of the assumptions behind complex modeling and experimental-design frameworks, and even by ordinary data handling and management hurdles due to the enormous volumes of data. Yet well-informed choices in data selection and insightful uncertainty exploration need to be able to draw from all this information to develop suitable products and indices for the applications. Because climate impacts are often complex and require careful integration into a multi-stressor environment, simple one-way delivery of climate data will often miss the specific needs on the ground. Product development and knowledge integration are inherently iterative processes, requiring direct collaboration between the scientists and the application specialists. To be successful in efficiently and effectively extracting useful information from the climate change information archives, efficient tools for analysis and scenario development are necessary. Standardized intercomparison of climate data is a necessary foundation for identifying strategies to investigate probabilistic outcomes and to explore process-driven outcomes that are likely, plausible, or maybe just possible. Based on the extensive evaluation tools for downscaled climate data provided by the National Climate Predictions and Projections (NCPP) platform, we illustrate analysis and scenario-development tools that help explore different sources of uncertainty and allow for sampling of realistic time series to assess potential impacts for targeted applications. This work combines rigorous evaluation and validation of climate projection data and facilitates assessments of scenarios in the context of observed variability and change.

  17. Mathematical support for automated geometry analysis of lathe machining of oblique peakless round-nose tools

    Science.gov (United States)

    Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.

    2017-01-01

    Automation of engineering processes requires developing relevant mathematical support and computer software. Analysis of metal-cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper is focused on developing a procedure for determining the geometry of lathe machining with oblique peakless round-nose tools through vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in distinction to the traditional analytic description, and is therefore very promising for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study. The effect of tool-blade inclination and curvature on the geometry-dependent process parameters was evaluated.
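
    A minimal illustration of the vector/matrix approach: represent the cutting-edge direction as a vector in the tool frame and apply rotation matrices for assumed setup angles to obtain its orientation in the machine coordinate system. The frames and angles below are generic assumptions for illustration, not the paper's specific formulation.

        import numpy as np

        def rot_x(a: float) -> np.ndarray:
            """Rotation matrix about the x-axis by angle a (radians)."""
            c, s = np.cos(a), np.sin(a)
            return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

        def rot_z(a: float) -> np.ndarray:
            """Rotation matrix about the z-axis by angle a (radians)."""
            c, s = np.cos(a), np.sin(a)
            return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

        # Cutting-edge direction in the tool's own frame (assumed along y).
        edge_tool = np.array([0.0, 1.0, 0.0])

        # Hypothetical setup: 10 deg inclination about x, then 15 deg about z.
        lam, kappa = np.radians(10.0), np.radians(15.0)
        edge_machine = rot_z(kappa) @ rot_x(lam) @ edge_tool
        print("edge direction in machine frame:", np.round(edge_machine, 4))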

  18. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  19. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included basic shapes such as a flat plate, cylinder, dished head, and sphere. The data were from several KSC tests already available in the public literature, as well as data from NIST and other highly reputable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and the tool. This document reviews the user instructions for the operation of this system.
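
    For context on the kind of calculation such a tool performs, below are the standard one-dimensional conduction heat-leak formulas for two of the shapes mentioned (flat plate and cylindrical shell), evaluated in Python. The conductivity and geometry values are invented for the example; TISTool's actual models (e.g., temperature-dependent conductivity integrals) are more involved.

        import math

        def heat_leak_flat_plate(k, area, thickness, t_hot, t_cold):
            """1-D conduction through a flat slab: Q = k*A*(Th - Tc)/L  [W]."""
            return k * area * (t_hot - t_cold) / thickness

        def heat_leak_cylinder(k, length, r_in, r_out, t_hot, t_cold):
            """Radial conduction through a cylindrical shell:
            Q = 2*pi*k*L*(Th - Tc)/ln(r_out/r_in)  [W]."""
            return 2 * math.pi * k * length * (t_hot - t_cold) / math.log(r_out / r_in)

        # Hypothetical cryogenic insulation: k in W/(m*K), temperatures in K.
        print(f"plate:    {heat_leak_flat_plate(0.02, 2.0, 0.05, 300, 77):.1f} W")
        print(f"cylinder: {heat_leak_cylinder(0.02, 3.0, 0.5, 0.55, 300, 77):.1f} W")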

  20. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    Science.gov (United States)

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers.
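
    As a rough illustration of the quantification step, the sketch below estimates the cell-free fraction of an image using simple local-variance thresholding as a stand-in for TScratch's curvelet-based detection; the synthetic frame and the threshold value are assumptions made for the example.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def open_area_fraction(img, win=15, thresh=25.0):
            """Fraction of the image classified as cell-free: low local variance
            is taken as empty substrate (a crude stand-in for the curvelet-based
            detection used by TScratch)."""
            mean = uniform_filter(img.astype(float), win)
            sqmean = uniform_filter(img.astype(float) ** 2, win)
            local_var = sqmean - mean ** 2
            return float((local_var < thresh).mean())

        # Synthetic frame: textured "cells" flanking a smooth central scratch.
        rng = np.random.default_rng(4)
        frame = rng.normal(100.0, 20.0, (200, 200))
        frame[:, 80:120] = 100.0           # the wound: flat, low-variance band
        print(f"open area: {open_area_fraction(frame):.1%}")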