WorldWideScience

Sample records for analysis tool extensions

  1. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
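
    As a pointer to how such benchmarks are used, the closed-form plane Poiseuille and Couette profiles can be tabulated directly and compared against a solver's output. The following minimal Python sketch does just that; the viscosity, channel height, pressure gradient and wall speed are illustrative assumptions, not values from the paper.

        import numpy as np

        # Analytic benchmark profiles for plane Poiseuille and Couette flow.
        # All parameter values are illustrative assumptions.
        mu = 1.0e-3   # dynamic viscosity [Pa.s]
        h = 0.01      # channel height [m]
        G = 10.0      # negative pressure gradient -dp/dx [Pa/m]
        U = 0.5       # moving-wall speed for Couette flow [m/s]

        y = np.linspace(0.0, h, 21)                  # transverse coordinate
        u_poiseuille = G / (2.0 * mu) * y * (h - y)  # parabolic, no-slip walls
        u_couette = U * y / h                        # linear, one wall moving

        # An extended network-flow solver could be verified by comparing its
        # computed velocities against these closed-form profiles.
        print(u_poiseuille.max())  # peak at mid-channel: G*h**2/(8*mu) = 0.125
        print(u_couette[-1])       # equals U at the moving wall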

  2. An extensive (co-)expression analysis tool for the cytochrome P450 superfamily in Arabidopsis thaliana

    Directory of Open Access Journals (Sweden)

    Provart Nicholas J

    2008-04-01

    Background: Sequencing of the first plant genomes has revealed that cytochromes P450 have evolved to become the largest family of enzymes in secondary metabolism. The proportion of P450 enzymes with characterized biochemical function(s) is however very small. If P450 diversification mirrors the evolution of chemical diversity, this points to an unexpectedly poor understanding of plant metabolism. We assumed that extensive analysis of gene expression might guide towards the function of P450 enzymes and highlight overlooked aspects of plant metabolism. Results: We have created a comprehensive database, 'CYPedia', describing P450 gene expression in four data sets: organs and tissues, stress response, hormone response, and mutants of Arabidopsis thaliana, based on public Affymetrix ATH1 microarray expression data. P450 expression was then combined with the expression of 4,130 re-annotated genes, predicted to act in plant metabolism, for co-expression analyses. Based on the annotation of co-expressed genes from diverse pathway annotation databases, co-expressed pathways were identified. Predictions were validated for most P450s with known functions. As examples, co-expression results for P450s related to plastidial functions/photosynthesis, and to phenylpropanoid, triterpenoid and jasmonate metabolism are highlighted here. Conclusion: The large-scale hypothesis generation tools presented here provide leads to new pathways, unexpected functions, and regulatory networks for many P450s in plant metabolism. These can now be exploited by the community to validate the proposed functions experimentally using reverse genetics, biochemistry, and metabolic profiling.
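
    The co-expression screening described above amounts to correlating one P450's expression profile against thousands of metabolic genes and keeping the high-scoring partners. A minimal Python sketch of that computation follows; the data are synthetic and the 0.8 cutoff is an illustrative assumption, not CYPedia's actual criterion.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples = 60                                   # e.g. organ/tissue arrays
        p450 = rng.normal(size=n_samples)                # one P450 expression profile
        metabolic = rng.normal(size=(4130, n_samples))   # candidate metabolic genes
        metabolic[7] = p450 + 0.1 * rng.normal(size=n_samples)  # plant a partner

        # Pearson correlation of the P450 against every candidate gene.
        centered = metabolic - metabolic.mean(axis=1, keepdims=True)
        p450_c = p450 - p450.mean()
        r = centered @ p450_c / (np.linalg.norm(centered, axis=1)
                                 * np.linalg.norm(p450_c))

        print(np.flatnonzero(r > 0.8))   # indices of co-expressed genes -> [7]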

  3. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    OpenAIRE

    Taverner, Tom; Karpievitch, Yuliya V.; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-01-01

    Motivation: The size and complex nature of mass spectrometry-based proteomics datasets motivate development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab.

  4. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    Energy Technology Data Exchange (ETDEWEB)

    Taverner, Thomas; Karpievitch, Yuliya; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-09-15

    Motivation: The size and complex nature of LC-MS proteomics data sets motivates development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab. Availability: DanteR and its associated user guide are available for download at http://omics.pnl.gov/software/. For Windows, a single click automatically installs DanteR along with the R programming environment. For Linux and Mac OS X, users must first install R and then follow instructions on the DanteR web site for package installation.
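
    Of the operations listed above, peptide-to-protein rollup is the least self-explanatory: peptide intensity profiles belonging to one protein are placed on a common level and combined into a single protein profile. The sketch below uses a simple median-based scheme; this simplification is an assumption and is not DanteR's exact rollup algorithm.

        import numpy as np
        import pandas as pd

        peptides = pd.DataFrame(
            {"protein": ["P1", "P1", "P1", "P2", "P2"],
             "s1": [10.1, 12.0, 11.2, 20.5, 19.9],    # log-intensities, sample 1
             "s2": [10.6, 12.4, 11.8, 21.0, 20.3]})   # log-intensities, sample 2

        def rollup(group: pd.DataFrame) -> pd.Series:
            x = group.to_numpy(float)
            x -= np.median(x, axis=1, keepdims=True)  # center each peptide profile
            return pd.Series(np.median(x, axis=0), index=group.columns)

        # One relative abundance profile per protein.
        proteins = peptides.groupby("protein")[["s1", "s2"]].apply(rollup)
        print(proteins)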

  5. Tools for Creating Mobile Applications for Extension

    Science.gov (United States)

    Drill, Sabrina L.

    2012-01-01

    Considerations and tools for developing mobile applications for Extension include evaluating the topic, purpose, and audience. Different computing platforms may be used, and apps designed as modified Web pages or implicitly programmed for a particular platform. User privacy is another important consideration, especially for data collection apps.…

  6. Versatile and Extensible, Continuous-Thrust Trajectory Optimization Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop an innovative, versatile and extensible, continuous-thrust trajectory optimization tool for planetary mission design and optimization of...

  7. Physics analysis tools

    International Nuclear Information System (INIS)

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level tools such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages. These stages are also classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool will be analyzed

  8. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
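
    A compact way to read the claim language above is as four cooperating pieces: a component library, a modeling tool that assembles components into a model, an analysis engine that scores baseline and measure-applied variants, and a recommendation step. The toy Python sketch below mirrors that structure; the class names and energy figures are invented for illustration.

        from dataclasses import dataclass, field

        @dataclass
        class Component:
            name: str
            annual_kwh: float        # energy use attributed to the component

        @dataclass
        class BuildingModel:
            floor_area_m2: float
            components: list = field(default_factory=list)

            def annual_energy(self) -> float:
                return sum(c.annual_kwh for c in self.components)

        library = {"old_hvac": Component("old_hvac", 50_000),
                   "efficient_hvac": Component("efficient_hvac", 35_000),
                   "led_lighting": Component("led_lighting", 8_000)}

        baseline = BuildingModel(2_000, [library["old_hvac"], library["led_lighting"]])
        # Energy conservation measure: substitute a library component.
        optimized = BuildingModel(2_000, [library["efficient_hvac"], library["led_lighting"]])

        savings = baseline.annual_energy() - optimized.annual_energy()
        if savings > 0:
            print(f"recommend efficient_hvac: saves {savings} kWh/yr")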

  9. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
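
    The hybrid simulation loop described above can be caricatured in a few lines: solve the system, trip anything past its limit, and repeat until no further protection action occurs. The Python toy below illustrates only the steady-state half of that loop; the two-line "grid", its limits, and the load-redistribution stand-in are invented, and the real tool drives PSS®E solutions instead.

        limits = {"L1": 100.0, "L2": 80.0}       # line thermal limits [MW]

        def solve_flows(in_service):
            """Hypothetical power-flow stand-in: load splits over surviving lines."""
            total_load = 170.0
            return {name: total_load / len(in_service) for name in in_service}

        in_service = set(limits)
        cascade = []                             # record of each outage stage
        while in_service:
            flows = solve_flows(in_service)
            tripped = {l for l, f in flows.items() if f > limits[l]}
            if not tripped:
                break
            cascade.append(sorted(tripped))
            in_service -= tripped

        print(cascade)   # [['L2'], ['L1']]: one overload trip triggers the next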

  10. Extensive analysis of hydrogen costs

    Energy Technology Data Exchange (ETDEWEB)

    Guinea, D.M.; Martin, D.; Garcia-Alegre, M.C.; Guinea, D. [Consejo Superior de Investigaciones Cientificas, Arganda, Madrid (Spain). Inst. de Automatica Industrial; Agila, W.E. [Acciona Infraestructuras, Alcobendas, Madrid (Spain). Dept. I+D+i

    2010-07-01

    Cost is a key issue in the spreading of any technology. In this work, the final cost of hydrogen obtained by electrolysis is analyzed and determined. Different contributing partial costs, such as energy and electrolyzer costs and taxes, are taken into account to calculate the final cost of hydrogen. Energy cost data are taken from official URLs, while electrolyzer costs are obtained from commercial companies. The analysis is accomplished under different hypotheses, and for different countries: Germany, France, Austria, Switzerland, Spain and the Canadian region of Ontario. Finally, the obtained costs are compared to those of the most used fossil fuels, both in the automotive industry (gasoline and diesel) and in the residential sector (butane, coal, town gas and wood), and the possibilities of hydrogen competing against these fuels are discussed. According to this work, in the automotive industry, even neglecting subsidies, hydrogen can compete with fossil fuels. Hydrogen can also compete with gaseous domestic fuels. Electrolyzer prices were found to have the highest influence on hydrogen prices. (orig.)
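
    The cost structure analyzed above reduces, in its simplest form, to an energy term plus an amortized electrolyzer term, with taxes applied to the energy input. The back-of-envelope Python sketch below shows the arithmetic; every number is an illustrative assumption and none is a figure from the paper.

        energy_price = 0.12          # EUR per kWh of grid electricity (assumed)
        specific_energy = 53.0       # kWh of electricity per kg of H2, incl. losses
        tax_rate = 0.21              # tax on the energy input (assumed)
        electrolyzer_capex = 1500.0  # EUR per kW of electrolyzer capacity (assumed)
        stack_life_hours = 60_000.0  # operating hours before stack replacement

        energy_cost = energy_price * specific_energy * (1.0 + tax_rate)       # EUR/kg
        capex_cost = electrolyzer_capex / stack_life_hours * specific_energy  # EUR/kg
        print(f"hydrogen cost: {energy_cost + capex_cost:.2f} EUR/kg")        # ~9 EUR/kg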

  11. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and a balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
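
    The core quantity FRAT computes is the frequency response measure: the size of the generation loss divided by the interconnection's frequency deviation, expressed in MW per 0.1 Hz. The Python sketch below shows that calculation on an idealized event; the averaging windows and all numbers are simplified assumptions, and BAL-003-1 defines the exact Value A / Value B windows.

        import numpy as np

        t = np.arange(0.0, 60.0, 0.1)                # time [s]
        f = np.where(t < 10.0, 60.0, 59.95)          # idealized 50 mHz frequency dip
        mw_lost = 1000.0                             # size of the generation loss [MW]

        value_a = f[(t > 2.0) & (t < 10.0)].mean()   # pre-event average frequency
        value_b = f[(t > 30.0) & (t < 52.0)].mean()  # settled post-event average

        frm = mw_lost / ((value_a - value_b) * 10.0)  # MW per 0.1 Hz
        print(f"FRM = {frm:.0f} MW/0.1 Hz")           # -> 2000 MW/0.1 Hz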

  12. Neutron multiplicity analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Scott L [Los Alamos National Laboratory

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel-based COincidence and Multiplicity) calculation tool, which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample, and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses: passive calibration curve, known alpha, and multiplicity analysis. The latter is done with both the point model and the weighted point model. In the current application, EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass, a calculational tool, EXCOM, has been produced using VBA within Excel.
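
    Two of the preprocessing steps mentioned above, deadtime correction and background subtraction, can be sketched compactly. The exponential correction form below is one commonly used in coincidence counting (with the doubles coefficient often taken as roughly four times the singles coefficient); the form and all rates here are illustrative assumptions, not EXCOM's actual coefficients.

        import math

        delta = 1.0e-7           # effective deadtime parameter [s] (assumed)
        singles_meas = 5.0e4     # measured singles rate [1/s] (assumed)
        doubles_meas = 1.2e3     # measured doubles rate [1/s] (assumed)
        singles_bkg = 2.0e2      # ambient background singles rate [1/s] (assumed)

        # Exponential deadtime correction, then background subtraction.
        singles_corr = singles_meas * math.exp(delta * singles_meas)
        doubles_corr = doubles_meas * math.exp(4.0 * delta * singles_meas)
        singles_net = singles_corr - singles_bkg

        print(f"net singles: {singles_net:.0f}/s, corrected doubles: {doubles_corr:.0f}/s")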

  13. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures) from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high-temporal-resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  14. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  15. Producing Organic Cotton: A Toolkit - Crop Guide, Project Guide, Extension Tools

    OpenAIRE

    Eyhorn, Frank

    2005-01-01

    The CD compiles the following extension tools on organic cotton: Organic Cotton Crop Guide, Organic Cotton Training Manual, Soil Fertility Training Manual, Organic Cotton Project Guide, Record keeping tools, Video "Organic agriculture in the Nimar region", Photos for illustration.

  16. Failure Environment Analysis Tool (FEAT)

    Science.gov (United States)

    Lawler, D. G.

    1991-01-01

    Information is given in viewgraph form on the Failure Environment Analysis Tool (FEAT), a tool designed to demonstrate advanced modeling and analysis techniques to better understand and capture the flow of failures within and between elements of the Space Station Freedom (SSF) and other large complex systems. Topics covered include objectives, development background, the technical approach, SSF baseline integration, and FEAT growth and evolution.

  17. Information and Communication technologies as agricultural extension tools

    OpenAIRE

    Anatoli Marantidou; Anastasios Michailidis; Afroditi Papadaki-Klavdianou

    2011-01-01

    Knowledge and the innovation society are becoming priorities for the welfare and quality of life of the rural population. This is based substantially on scientific and technological progress. Information and Communication Technologies (ICTs) accelerate rural development by contributing to more efficient management and rapid knowledge dissemination. ICTs are defined as a diverse set of technological tools and resources used for communication and for the creation, processing, dissemination, storage...

  18. Reliability Centered Maintenance as a tool for plant life extension

    International Nuclear Information System (INIS)

    Currently in the nuclear industry there is a growing interest in lowering the cost and complexity of maintenance activities while at the same time improving plant reliability and safety, in an effort to prepare for the technical and regulatory challenges of life extension. This seemingly difficult task is being aided by the introduction of a maintenance philosophy developed originally by the airline industry and subsequently applied with great success both in that industry and the U.S. military services. Reliability Centered Maintenance (RCM), in its basic form, may be described as a consideration of reliability and maintenance problems from a systems-level approach, allowing a focus on preservation of system function as the aim of a maintenance program optimized for both safety and economics. It is this systematic view of plant maintenance, with the emphasis on overall functions rather than individual parts and components, which sets RCM apart from past nuclear plant maintenance philosophies. It is also the factor which makes application of RCM an ideal first step in the development of strategies for life extension, both for aging plants and for plants just beginning their first license term. (J.P.N.)

  19. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel;

    2014-01-01

    As governments, citizens and organizations have moved online, there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks...

  20. AvoPlot: An extensible scientific plotting tool based on matplotlib

    Directory of Open Access Journals (Sweden)

    Nial Peters

    2014-02-01

    AvoPlot is a simple-to-use graphical plotting program written in Python, making extensive use of the matplotlib plotting library. It can be found at http://code.google.com/p/avoplot/. In addition to providing a user-friendly interface to the powerful capabilities of the matplotlib library, it also offers users the possibility of extending its functionality by creating plug-ins. These can import specific types of data into the interface and also provide new tools for manipulating them. In this respect, AvoPlot is a convenient platform for researchers to build their own data analysis tools on top of, as well as being a useful standalone program.
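
    The plug-in mechanism described above boils down to a registry the host application consults when importing data or adding tools. The Python sketch below shows the general pattern only; the decorator and class names are generic illustrations and not AvoPlot's actual plug-in API.

        import csv

        PLUGINS = {}

        def register_plugin(name):
            """Register a plug-in class under a name the host can look up."""
            def decorator(cls):
                PLUGINS[name] = cls
                return cls
            return decorator

        @register_plugin("csv_importer")
        class CsvImporter:
            """Example plug-in: imports two-column CSV data for plotting."""
            def load(self, path):
                with open(path, newline="") as fh:
                    return [(float(x), float(y)) for x, y in csv.reader(fh)]

        importer = PLUGINS["csv_importer"]()
        # points = importer.load("spectrum.csv")   # hypothetical data file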

  1. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting...

  2. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  3. Space debris mitigation: extension of the SDM tool - executive summary

    OpenAIRE

    Anselmo, Luciano; Cordelli, Alessandro; Pardini, Carmen; Rossi, Alessandro

    2000-01-01

    The Space Debris Mitigation long-term analysis program (SDM, Version 2.0) has been developed to study the long-term evolution of orbital debris and to evaluate the effectiveness of mitigation measures

  4. Atlas Distributed Analysis Tools

    Science.gov (United States)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  5. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  6. VCAT: Visual Crosswalk Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  7. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  8. Using iPads as a Data Collection Tool in Extension Programming Evaluation

    Science.gov (United States)

    Rowntree, J. E.; Witman, R. R.; Lindquist, G. L.; Raven, M. R.

    2013-01-01

    Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested…

  9. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegian...

  10. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following ground: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  11. Tools for income mobility analysis

    OpenAIRE

    Philippe Kerm

    2002-01-01

    A set of Stata routines to help analyze 'income mobility' is presented and illustrated. Income mobility is taken here as the pattern of income change from one time period to another within an income distribution. Multiple approaches have been advocated to assess the magnitude of income mobility. The macros presented provide tools for estimating several measures of income mobility, e.g. the Shorrocks (JET 1978) or King (Econometrica 1983) indices or summary statistics for transition matri...
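
    Of the measures named above, the Shorrocks (1978) index is the easiest to show: given a transition matrix P, where P[i, j] is the probability of moving from income class i to class j, mobility is measured by how little probability mass stays on the diagonal. A Python sketch follows; the example matrix is invented.

        import numpy as np

        P = np.array([[0.7, 0.2, 0.1],
                      [0.2, 0.6, 0.2],
                      [0.1, 0.3, 0.6]])           # rows sum to 1

        n = P.shape[0]
        shorrocks = (n - np.trace(P)) / (n - 1)     # 0 = immobile, near 1 = mobile
        print(f"Shorrocks index: {shorrocks:.2f}")  # -> 0.55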

  12. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  13. ANALYSIS OF THE PROTECTED EXTENSIBLE AUTHENTICATION PROTOCOL

    Directory of Open Access Journals (Sweden)

    Amit Rana

    2012-09-01

    The Internet Engineering Task Force (IETF) has proposed new protocols for highly secured wireless networking. The purpose of this paper is to implement one such proposed security protocol: PEAP (Protected Extensible Authentication Protocol) [1]. PEAP was jointly developed by Microsoft, Cisco and RSA Security. The protocol implementation is done on the server end of a client/server network model on a RADIUS server (Remote Authentication Dial-In User Service). The proposed protocol, PEAP, provides client identity protection and key generation, thus preventing unauthorized user access and protecting or encrypting the data against malicious activities.

  14. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of the reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable HTML format. Using this set of tools for reload safety analysis simplifies

  15. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  16. Hierarchical task analysis: Developments, applications and extensions

    OpenAIRE

    Stanton, Neville A.

    2006-01-01

    Hierarchical task analysis (HTA) is a core ergonomics approach with a pedigree of over 30 years continuous use. At its heart, HTA is based upon a theory of performance and has only three governing principles. Originally developed as a means of determining training requirements, there was no way the initial pioneers of HTA could have foreseen the extent of its success. HTA has endured as a way of representing a system sub-goal hierarchy for extended analysis. It has been used for a range of ap...

  17. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  18. A Meta-Analysis of Extensive Reading Research

    Science.gov (United States)

    Nakanishi, Takayuki

    2015-01-01

    The purposes of this study were to investigate the overall effectiveness of extensive reading, whether learners' age impacts learning, and whether the length of time second language learners engage in extensive reading influences test scores. The author conducted a meta-analysis to answer research questions and to identify future research…

  19. Exploration tools in formal concept analysis

    OpenAIRE

    Stumme, Gerd

    1996-01-01

    The development of conceptual knowledge systems specifically requests knowledge acquisition tools within the framework of formal concept analysis. In this paper, the existing tools are presented, and further developments are discussed.

  20. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar;

    2002-01-01

    This paper describes the application of temporal maximum autocorrelation factor analysis to global monthly mean values of 1996-1997 sea surface temperature (SST) and sea surface height (SSH) data. This type of analysis can be considered as an extension of traditional empirical orthogonal function (EOF) analysis, which provides a non-temporal analysis of one variable over time. The temporal extension proves its strength in separating the signals at different periods in an analysis of relevant oceanographic properties related to one of the largest El Niño events ever recorded.
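
    For reference, the traditional EOF analysis that the temporal extension builds on is a singular value decomposition of the centered space-time data matrix. The Python sketch below shows that baseline step on synthetic data standing in for gridded SST anomalies; the maximum autocorrelation factor extension itself is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)
        n_time, n_space = 24, 500               # e.g. 24 monthly maps of 500 cells
        X = rng.normal(size=(n_time, n_space))  # synthetic anomaly field
        X -= X.mean(axis=0)                     # remove the temporal mean per cell

        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        eofs = Vt                               # spatial patterns (EOFs)
        pcs = U * s                             # principal-component time series
        var_frac = s**2 / np.sum(s**2)          # variance explained per mode
        print(var_frac[:3])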

  1. Syntax analysis of an Algol extension for a vector machine

    International Nuclear Information System (INIS)

    This research thesis addresses the definition of vector extensions to the Algol 60 programming language. This syntax analysis is the first of the three main components of a future extended-Algol compiler which is to operate on a vector machine, the other components being semantic analysis and code generation. The author presents the characteristics of the vector machine and the most interesting statements, defines the syntax of Algol and of the vector extensions, and presents some examples. He presents the general philosophy adopted for this analysis, and some options adopted for the definition of algorithms within the frame of lexical analysis and of syntax analysis

  2. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar;

    2002-01-01

    with 24 variables. This type of analysis can be considered as an extension of traditional empirical orthogonal function (EOF) analysis, which provides a marginal analysis of one variable over time. The motivation for using a bivariate extension stems from the fact that the two fields are interrelated as...

  3. Multi-mission telecom analysis tool

    Science.gov (United States)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  4. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    /steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled "Inclusion Analysis to Predict Casting Behavior" was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual-use instrument). In summary, the ASCAT will significantly advance the tools of the industry and address an urgent and broadly recognized need of the steel industry. Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

  5. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled "Inclusion Analysis to Predict Casting Behavior" was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual-use instrument). In summary, the ASCAT will significantly advance the tools of the industry and address an urgent and broadly recognized need of the steel industry. Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

  6. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data

    Directory of Open Access Journals (Sweden)

    Brinkley James F

    2007-10-01

    Background: Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality, to general purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results: We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion: MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine.

  7. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, which is a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottlenecks of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it also supports certain extensions.

  8. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  9. A New Web-based Tool for Aerosol Data Analysis: the AERONET Data Synergy Tool

    Science.gov (United States)

    Giles, D. M.; Holben, B. N.; Slutsker, I.; Welton, E. J.; Chin, M.; Schmaltz, J.; Kucsera, T.; Diehl, T.

    2006-12-01

    The Aerosol Robotic Network (AERONET) provides important aerosol microphysical and optical properties via an extensive distribution of continental sites and sparsely-distributed coastal and oceanic sites among the major oceans and inland seas. These data provide only spatial point measurements while supplemental data are needed for a complete aerosol analysis. Ancillary data sets (e.g., MODIS true color imagery and back trajectory analyses) are available by navigating to several web data sources. In an effort to streamline aerosol data discovery and analysis, a new web data tool called the "AERONET Data Synergy Tool" was launched from the AERONET web site. This tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. The Data Synergy Tool user can access these data sources to obtain properties such as the optical depth, composition, absorption, size, spatial and vertical distribution, and source region of aerosols. AERONET Ascension Island and COVE platform site data will be presented to highlight the Data Synergy Tool capabilities in analyzing urban haze, smoke, and dust aerosol events over the ocean. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.

  10. ADAPT-A Drainage Analysis Planning Tool

    OpenAIRE

    Boelee, Leonore; Kellagher, Richard

    2015-01-01

    HR Wallingford are a partner in the EU-funded TRUST project, involved in Work Package 4.3, Wastewater and stormwater systems, to produce a model and report on system sustainability analysis and potential for improvements for stormwater systems as Deliverable 4.3.2. This report is that deliverable. It details the development of the tool ADAPT (A Drainage Analysis and Planning Tool). The objective of the tool is to evaluate the improvement requirements to a stormwat...

  11. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.
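
    The key idea above, abstracting each analyzed sub-block to a single delay that the parent level reuses, can be shown with a toy critical-path computation. In the Python sketch below the netlist, the abstracted block delays, and the longest-path recursion are all invented for illustration.

        import functools

        block_delays = {"adder": 3.2, "mux": 1.1}   # abstracted sub-block delays [ns]

        # Parent-level DAG: node -> list of (successor, delay through a sub-block).
        graph = {"in": [("n1", block_delays["adder"]), ("n2", block_delays["mux"])],
                 "n1": [("out", block_delays["mux"])],
                 "n2": [("out", block_delays["adder"])],
                 "out": []}

        @functools.lru_cache(maxsize=None)
        def arrival(node):
            """Longest accumulated delay from 'in' to node."""
            preds = [(u, d) for u, edges in graph.items() for v, d in edges if v == node]
            return max((arrival(u) + d for u, d in preds), default=0.0)

        print(f"critical path delay: {arrival('out'):.1f} ns")   # -> 4.3 ns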

  12. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis of...

  13. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  14. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis (ICA) in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic indexing framework. ... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.

  15. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
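
    Step 4 of the procedure above, merging the per-cube solutions into one global frame, is a matter of chaining rigid transforms: once cube 2's pose is known in cube 1's frame and cube 3's pose in cube 2's frame, their product places cube 3 in the global frame. The Python sketch below illustrates this with 4x4 homogeneous transforms; the poses are invented.

        import numpy as np

        def pose(rz_deg, t):
            """Homogeneous transform: rotation about z plus translation."""
            a = np.radians(rz_deg)
            T = np.eye(4)
            T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
            T[:3, 3] = t
            return T

        T_1_2 = pose(30.0, (2.0, 0.5, 0.0))    # cube 2 expressed in cube 1's frame
        T_2_3 = pose(-10.0, (1.5, 1.0, 0.2))   # cube 3 expressed in cube 2's frame
        T_1_3 = T_1_2 @ T_2_3                  # cube 3 in the global (cube 1) frame

        p_local = np.array([0.1, 0.0, 0.0, 1.0])  # a point measured near cube 3
        print(T_1_3 @ p_local)                 # the same point in global coordinates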

  16. Tools & Strategies for Social Data Analysis

    OpenAIRE

    Willett, Wesley Jay

    2012-01-01

    Data analysis is often a complex, iterative process that involves a variety of stakeholders and requires a range of technical and professional competencies. However, in practice, tools for visualizing, analyzing, and communicating insights from data have primarily been designed to support individual users. In the past decade a handful of research systems like sense.us and Many Eyes have begun to explore how web-based visualization tools can allow larger groups of users to participate in analyse...

  17. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  18. Performance analysis of GYRO: a tool evaluation

    International Nuclear Information System (INIS)

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  19. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  20. Teaching Methods and Tools Used In Food Safety Extension Education Programs in the North Central Region of the United States

    Directory of Open Access Journals (Sweden)

    Robert A. Martin

    2011-09-01

    Full Text Available One of the ways to ensure food safety is to educate the public. Of the organizations providing food safety education in the United States (U.S.), the Cooperative Extension System (CES) is one of the most reliable. The effectiveness of CES programs depends not only on what is being taught but also on how it is taught. Both a needs-based curriculum and how that curriculum is delivered are equally important. This descriptive cross-sectional study using a disproportional stratified random sample identified the teaching methods and tools being used by food safety extension educators of the CES of the North Central Region (NCR). A Likert-type scale administered to extension educators revealed that they were adopting a balanced use of teaching methods and tools, and using learner-centered teaching methods in their programs. However, distance education, case studies and podcasts, which are commonly used in education programs, were not being used extensively. We recommend that food safety extension educators of the NCR should increase the use of these teaching methods and tools while continuing to use the current ones. This study has implications for improving food safety education delivery to clients in the NCR and for designing in-service education for food safety extension educators

  1. DISCO analysis: A nonparametric extension of analysis of variance

    OpenAIRE

    RIZZO, MARIA L.; Székely, Gábor J.

    2010-01-01

    In classical analysis of variance, dispersion is measured by considering squared distances of sample elements from the sample mean. We consider a measure of dispersion for univariate or multivariate response based on all pairwise distances between sample elements, and derive an analogous distance components (DISCO) decomposition for powers of distance in $(0,2]$. The ANOVA F statistic is obtained when the index (exponent) is 2. For each index in $(0,2)$, this decomposition determines a nonpar...
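
    A numerical sketch of the decomposition just described, for univariate samples and an index alpha in (0, 2]; the split into total, within and between components follows the abstract, though the paper's exact estimator may differ in detail.

        import numpy as np

        def mean_pairwise(x, y, alpha):
            """Mean of |xi - yj|^alpha over all pairs drawn from x and y."""
            x = np.asarray(x, float)[:, None]
            y = np.asarray(y, float)[None, :]
            return (np.abs(x - y) ** alpha).mean()

        def disco(groups, alpha=1.0):
            """Return (between, within) distance components for a list of samples."""
            pooled = np.concatenate(groups)
            total = (len(pooled) / 2.0) * mean_pairwise(pooled, pooled, alpha)
            within = sum((len(g) / 2.0) * mean_pairwise(g, g, alpha) for g in groups)
            return total - within, within

        a, b = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
        print(disco([a, b], alpha=1.0))  # between-sample dispersion dominates here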

  2. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  3. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    Directory of Open Access Journals (Sweden)

    Aitman T

    2008-11-01

    Full Text Available Abstract Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data

  4. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  5. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.
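
    One simple, hedged illustration of scoring the similarity of two digitized toolmark profiles (not the project's actual statistics) is the peak normalized cross-correlation over all lags:

        import numpy as np

        def association_score(p1, p2):
            """Peak normalized cross-correlation between two 1-D profiles."""
            a = (p1 - p1.mean()) / p1.std()
            b = (p2 - p2.mean()) / p2.std()
            return (np.correlate(a, b, mode="full") / len(a)).max()

        rng = np.random.default_rng(1)
        mark = rng.normal(size=500)                      # reference profile
        shifted = np.roll(mark, 30) + 0.1 * rng.normal(size=500)
        print(association_score(mark, shifted))          # high despite shift/noise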

  6. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  7. From sensor networks to connected analysis tools

    Science.gov (United States)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  8. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of uncontrolled spacecraft may be broken into many pieces of debris at an altitude in the range of 75-85 km. The surviving fragments could pose great hazard and risk to ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and ground risk assessment have been studied and developed in the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on the topic of debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and parameters affecting survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. The past object-oriented tools include objects of only simple shapes. For more realistic simulation, here we present an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types, as well as 51 predefined motions and relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.

  9. Tools for Search, Analysis and Management of Patent Portfolios

    Directory of Open Access Journals (Sweden)

    Muqbil Burhan

    2012-05-01

    Full Text Available Patents have been acknowledged worldwide as rich sources of information for technology forecasting, competitive analysis and management of patent portfolios. Because of the high potential of patents as an important indicator of various technology measurements and as an econometric measure, patent analysis has become vital for the corporate world and of interest to academic research. Retrieving relevant prior art concerning the technology of interest has been vital for managers and consultants dealing with intellectual property rights. Tremendous recent progress in the field of electronic search tools has led to specialised and less time-consuming search capabilities, even in fields where search is mostly based on formulas, drawings and flowcharts. Online patent databases and various other analytical tools have given patent analysis an important edge, which otherwise required extensive and time-consuming data collection and calculations. Patents provide valuable information which could be used for various purposes by industry, academia, and policy analysts. This article explores the various options and tools available for patent search, analysis and management of patent portfolios, for efficiently identifying the relevant prior art, managing organisations' own patent clusters and/or competitive intelligence.

  10. Extensive use of sown meadows – a tool for restoration of botanical diversity

    OpenAIRE

    Sendžikaitė, Jūratė; Pakalnis, Romas

    2006-01-01

    The state of sown meadow communities under different intensities of management (intensive and extensive use) over a period of 10–14 years was evaluated at the Graisupis Experimental Field Station, Lithuania. Comparison of study data on intensively and extensively used sown meadows made it possible to ascertain that the intensity of sown meadow succession depends on the character of grassland management. The positive correlation between the number of vascular plant species and sown meadow age reve...

  11. A Divergence Statistics Extension to VTK for Performance Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
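
    The report gives the engine's exact divergence formulation; as a generic illustration of the concept, the Kullback-Leibler divergence of an observed empirical distribution from a theoretical "ideal" one can be computed as follows (bin counts invented for the example):

        import numpy as np

        def kl_divergence(observed_counts, theoretical_probs):
            p = np.asarray(observed_counts, float)
            p /= p.sum()                         # empirical probabilities
            q = np.asarray(theoretical_probs, float)
            mask = p > 0                         # 0 * log(0/q) contributes 0
            return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

        observed = [48, 52, 55, 45]              # event counts per bin
        uniform = [0.25, 0.25, 0.25, 0.25]       # theoretical reference
        print(kl_divergence(observed, uniform))  # near 0 => close to ideal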

  12. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest are specified using the temporal logic Probabilistic Computation Tree Logic (PCTL) and we employ stochastic model checking, by means of the model checker PRISM, to compute their exact values. We present a simplified example of a distributed stochastic system where we determine a reachability property...
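
    A sketch of what stochastic model checking computes for a simple reachability property: the probability of eventually reaching a target state in a discrete-time Markov chain, here obtained with a linear solve. The chain and its numbers are invented for illustration; PRISM performs this kind of computation at scale.

        import numpy as np

        # Transition matrix of a 3-state chain; state 2 is the absorbing target.
        P = np.array([[0.5, 0.3, 0.2],
                      [0.0, 0.6, 0.4],
                      [0.0, 0.0, 1.0]])
        target, others = 2, [0, 1]
        A = np.eye(len(others)) - P[np.ix_(others, others)]
        b = P[np.ix_(others, [target])].ravel()
        x = np.linalg.solve(A, b)      # x[i] = Pr(eventually reach target | i)
        print(dict(zip(others, x)))    # both probabilities equal 1 here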

  13. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
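
    Two of the listed evaluations can be reproduced numerically on a toy state-space system (numpy only; MATRIXx is a commercial package, and the matrices below are assumptions):

        import numpy as np

        A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # plant dynamics x' = Ax + Bu
        B = np.array([[0.0], [1.0]])
        K = np.array([[3.0, 1.0]])                # state feedback u = -Kx
        Acl = A - B @ K
        print("closed-loop eigenvalues:", np.linalg.eigvals(Acl))  # -2 +/- 1j

        C = np.array([[1.0, 0.0]])
        for w in (0.1, 1.0, 10.0):                # frequency, rad/s
            G = C @ np.linalg.inv(1j * w * np.eye(2) - Acl) @ B
            print(f"max singular value at {w} rad/s:",
                  np.linalg.svd(G, compute_uv=False)[0])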

  14. A tool for subjective analysis of TTOs

    OpenAIRE

    Resende, David Nunes; Gibson, David V.; Jarrett, James

    2011-01-01

    The objective of this article is to present a proposal (working paper) for a quantitative analysis tool to help technology transfer offices (TTOs) improve their structures, processes and procedures. Our research started from the study of internal practices and structures that facilitate the interaction between R&D institutions, their TTOs and regional surroundings. We wanted to identify “bottlenecks” in those processes, procedures, and structures. We mapped the bottlenecks in a set of “...

  15. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even to systems with cyclic dependencies. Despite the virtues of the current tool, improvements and extensions still remain, which are in focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models, which both depend on the possibility of accounting for propagation delay. Since the current version...

  16. SWOT Analysis of Extension Systems in Southern African Countries

    Directory of Open Access Journals (Sweden)

    Oladimeji Idowu Oladele

    2011-11-01

    Full Text Available This paper examined the strengths, weaknesses, opportunities and threats to extension systems in the selected southern African countries of Malawi, Zambia, Swaziland, Mozambique, Lesotho and Botswana. This is predicated on the need for improved performance and reinvigoration of extension systems for better services. Some of the strengths are development work to improve rural areas, extensive grassroots coverage, use of committees for research and extension linkages, involvement of NGOs and the private sector, and effective setting of extension administration units. Opportunities that can be explored are donors' willingness to fund well-designed programmes, expansion in the use of ICT, high involvement of farmers in extension planning, and the potential for effective programme implementation. The threats to the extension systems are attempts to privatize extension services, weak feedback to research, and donor fatigue. The paper recommends that extension administrators and policy makers should pay proper attention to the strengths, weaknesses, opportunities and threats to extension systems with a view to making extension services truly more responsive to local concerns and policy.

  17. Animal Agriculture in a Changing Climate Online Course: An Effective Tool for Creating Extension Competency

    Science.gov (United States)

    Whitefield, Elizabeth; Schmidt, David; Witt-Swanson, Lindsay; Smith, David; Pronto, Jennifer; Knox, Pam; Powers, Crystal

    2016-01-01

    There is a need to create competency among Extension professionals on the topic of climate change adaptation and mitigation in animal agriculture. The Animal Agriculture in a Changing Climate online course provides an easily accessible, user-friendly, free, and interactive experience for learning science-based information on a national and…

  18. Extensive exometabolome analysis reveals extended overflow metabolism in various microorganisms

    Directory of Open Access Journals (Sweden)

    Paczia Nicole

    2012-09-01

    Full Text Available Abstract Overflow metabolism is well known for yeast, bacteria and mammalian cells. It typically occurs under glucose excess conditions and is characterized by excretions of by-products such as ethanol, acetate or lactate. This phenomenon, also denoted the short-term Crabtree effect, has been extensively studied over the past few decades, however, its basic regulatory mechanism and functional role in metabolism is still unknown. Here we present a comprehensive quantitative and time-dependent analysis of the exometabolome of Escherichia coli, Corynebacterium glutamicum, Bacillus licheniformis, and Saccharomyces cerevisiae during well-controlled bioreactor cultivations. Most surprisingly, in all cases a great diversity of central metabolic intermediates and amino acids is found in the culture medium with extracellular concentrations varying in the micromolar range. Different hypotheses for these observations are formulated and experimentally tested. As a result, the intermediates in the culture medium during batch growth must originate from passive or active transportation due to a new phenomenon termed “extended” overflow metabolism. Moreover, we provide broad evidence that this could be a common feature of all microorganism species when cultivated under conditions of carbon excess and non-inhibited carbon uptake. In turn, this finding has consequences for metabolite balancing and, particularly, for intracellular metabolite quantification and 13C-metabolic flux analysis.

  19. Extension of an Object Oriented Multidisciplinary Analysis Optimization (MDAO) Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Multidisciplinary design, analysis, and optimization (MDAO) tools today support only a limited number of disciplines, with little high-fidelity modeling capability. These tools are...

  20. AXARM: An Extensible Remote Assistance and Monitoring Tool for ND Telerehabilitation

    Science.gov (United States)

    Bueno, Antonio; Marzo, Jose L.; Vallejo, Xavier

    AXARM is a multimedia tool for rehabilitation specialists that allows remote assistance and monitoring of patients' activities. This tool is the evolution of the work done in 2005-06 between the BCDS research group of UdG and the Multiple Sclerosis Foundation (FEM in Spanish) in Girona under the TRiEM project. Multiple Sclerosis (MS) is a neurodegenerative disease (ND) that can cause significant exhaustion in patients, even from simply travelling to the medical centre for rehabilitation or regular check-up visits. The tool presented in this paper allows the medical staff to remotely carry out patient consultations and activities while patients remain at home, minimizing trips to the medical centre. AXARM has a hybrid P2P architecture and consists essentially of a cross-platform videoconference system with audio/video recording capabilities. The system can easily be extended to include new capabilities such as, among others, asynchronous activities whose results can later be analyzed by the medical personnel.

  1. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques should be applied, while processing multiple simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).

  2. Microfracturing and new tools improve formation analysis

    Energy Technology Data Exchange (ETDEWEB)

    McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))

    1992-12-07

    This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wireline logs, and temperature logging in air-filled holes, which are new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are usually created at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open hole parameters for designing the main fracture treatment.

  3. DEVELOPING NEW TOOLS FOR POLICY ANALYSIS

    International Nuclear Information System (INIS)

    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U. S. Department of Energy (DOE) regulations and directives related to safeguards and security (S and S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire (S and S) directives set will be conducted using software tools to analyze the present directives with a view toward (1) identifying areas of positive synergism among topical areas, (2) identifying areas of unnecessary duplication within and among topical areas, and (3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

  4. Extension of ship accident analysis to multiple-package shipments

    International Nuclear Information System (INIS)

    Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously (Spring, 1995). Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO containers may be placed in one of several holds of a cargo ship. In such cases, the size of a radioactive release resulting from a severe collision with another ship is determined not by the likelihood of compromising a single, robust package but by the probability that a certain fraction of tens or hundreds of individual packagings is compromised. The previous analysis (Spring, 1995) involved a statistical estimation of the frequency of accidents which would result in damage to a cask located in one of seven cargo holds in a collision with another ship. The results were obtained in the form of probabilities (frequencies) of accidents of increasing severity and of release fractions for each level of severity. This paper describes an extension of the same general method in which the multiple packages are assumed to be compacted by an intruding ship's bow until there is no free space in the hold. At such a point, the remaining energy of the colliding ship is assumed to be dissipated by progressively crushing the RAM packagings, and the probability of a particular fraction of package failures is estimated by adaptation of the statistical method used previously. The parameters of a common, well-characterized packaging, the 6M with 2R inner containment vessel, were employed as an illustrative example of this analysis method. However, the method is readily applicable to other packagings for which crush strengths have been measured or can be estimated with satisfactory confidence. (authors)

  5. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises which are not yet convinced of the value of setup improvement. The methodology was developed after research that delineated the problem: companies still have difficulties with long setup times, and many of them do nothing to decrease them; a long setup alone is not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The proposed methodology can encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; here the goal is to convince management to begin actions concerning setup improvement. The last three steps relate to a particular setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others, were used.

  6. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make an optimal medical decision?' To help clinicians make proper judgements, several decision making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision making tools such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concept of the ORAD program
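
    The Bayesian step can be made concrete with a worked example: converting a test's sensitivity and specificity into a post-test probability for a given pre-test prevalence (the figures are illustrative only):

        def post_test_probability(prevalence, sensitivity, specificity):
            """Pr(disease | positive test) via Bayes' theorem."""
            true_pos = sensitivity * prevalence
            false_pos = (1.0 - specificity) * (1.0 - prevalence)
            return true_pos / (true_pos + false_pos)

        # A test with 90% sensitivity and 80% specificity, applied where
        # only 10% of patients actually have the lesion:
        print(post_test_probability(0.10, 0.90, 0.80))   # ~0.33, not 0.90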

  7. A Discrete Event Simulator for Extensive Defense Mechanism for Denial of Service Attacks Analysis

    Directory of Open Access Journals (Sweden)

    Maryam Tanha

    2012-01-01

    Full Text Available Problem statement: The search for defense mechanisms against low rate Denial of Service (DoS) attacks, a new generation of DoS attacks, has received special attention during recent years. As a decisive factor, evaluating the performance of the offered mitigation techniques against different metrics, to determine the viability and ability of these countermeasures, requires more research. Approach: The development of a new generalized discrete event simulator is deliberated in detail. The research conducted places high emphasis on the benefits of creating a customized discrete event simulator for the analysis of security and, in particular, DoS attacks. The simulator occupies a niche in terms of its small scale, low execution time, portability and ease of use. The attributes and mechanism of the developed simulator are complemented by the proposed framework. Results: The simulator has been extensively evaluated and has proven to provide an ideal tool for the analysis and exploration of DoS attacks. It enables in-depth analysis for creating multitudes of defense mechanisms against HTTP low rate DoS attacks. The acquired results from the simulation tool have been compared against a simulator from the same domain, enabling the validation of the developed simulator using selected performance metrics, including mean in-system time, average delay and average buffer size. Conclusion: The proposed simulator serves as an efficient and scalable performance analysis tool for the analysis of HTTP low rate DoS attack defense mechanisms. Future work can encompass the development of discrete event simulators for the analysis of other security issues such as Intrusion Detection Systems.
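
    The core of such a tool is a time-ordered event queue. A minimal sketch of an event-driven single-server queue that yields the mean in-system time metric mentioned above (arrival and service rates are arbitrary; the actual simulator is far more elaborate):

        import heapq, random

        random.seed(0)
        events = [(random.expovariate(1.0), "arrival")]   # (time, kind)
        busy_until, delays, t_end = 0.0, [], 1000.0
        while events:
            t, kind = heapq.heappop(events)
            if t > t_end:
                break
            if kind == "arrival":
                start = max(t, busy_until)                # wait if server busy
                busy_until = start + random.expovariate(1.25)
                delays.append(busy_until - t)             # in-system time
                heapq.heappush(events, (t + random.expovariate(1.0), "arrival"))
        print("mean in-system time:", sum(delays) / len(delays))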

  8. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of the industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is discussed, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  9. Availability analysis and design of storage extension based on CWDM

    Science.gov (United States)

    Qin, Leihua; Yu, Yan

    2007-11-01

    As Fibre Channel becomes the key storage protocol of the SAN (Storage Area Network), enterprises are increasingly deploying FC SANs in their data centers. Meanwhile, organizations increasingly face an enormous influx of data that must be stored, protected, backed up and replicated to mitigate the risk of losing data. One of the best ways to achieve this goal is to deploy SAN extension based on CWDM (Coarse Wavelength Division Multiplexing). Availability is one of the key performance metrics for business continuity and disaster recovery and has to be well understood by IT departments when deploying SAN extension based on CWDM, for it determines accessibility to remotely located data sites. In this paper, several architectures of storage extension over CWDM are analyzed and the availability of these different storage extension architectures is calculated. Furthermore, two kinds of high-availability storage extension architecture with 1:1 or 1:N protection are designed, and the availability of the protected storage extension schemes based on CWDM is calculated as well.
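
    The availability arithmetic behind such an analysis can be sketched with per-component availability derived from MTBF/MTTR, composed in series along a link and in parallel for a 1:1 protected pair (all figures are hypothetical):

        def availability(mtbf_hours, mttr_hours):
            return mtbf_hours / (mtbf_hours + mttr_hours)

        def series(*a):                  # all components must be up
            out = 1.0
            for x in a:
                out *= x
            return out

        def parallel(*a):                # up if any redundant path is up
            out = 1.0
            for x in a:
                out *= 1.0 - x
            return 1.0 - out

        # Hypothetical: transceiver, mux/demux and fibre span in series
        path = series(availability(2e5, 8), availability(5e5, 8),
                      availability(1e5, 24))
        print("unprotected path:", path)
        print("1:1 protected:  ", parallel(path, path))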

  10. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today's markets. The authors address traditional machining topics such as single and multiple point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  11. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    The applications of ultrasonic vibrations for material removal processes exist predominantly in the area of vertical processing of hard and brittle materials, because vertically vibrating oscillators produce the greatest direct penetration, allowing grains to remove material from the workpiece. For milling processes, however, vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to make use of ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for the design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to generate the best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  12. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  13. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  14. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    From 1983 - 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed to report in this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers taking into account the economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP)

  15. ISHM Decision Analysis Tool: Operations Concept

    Science.gov (United States)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  16. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
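
    As a hedged sketch of the displacement damage approach, a commonly used empirical fit expresses remaining power as P/P0 = 1 - C*log10(1 + D/Dx), where D is the displacement damage dose (fluence times NIEL) and C, Dx are cell-specific fit constants; the constants below are invented for illustration and are not SAVANT outputs.

        import math

        def remaining_power(fluence_cm2, niel_mev_cm2_per_g, C, Dx):
            """P/P0 from the empirical displacement-damage-dose fit."""
            D = fluence_cm2 * niel_mev_cm2_per_g      # damage dose, MeV/g
            return 1.0 - C * math.log10(1.0 + D / Dx)

        # Hypothetical GaAs-like constants, 1 MeV electron-equivalent exposure
        for fluence in (1e13, 1e14, 1e15):            # electrons/cm^2
            print(fluence, remaining_power(fluence, 3e-5, 0.3, 1e9))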

  17. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  18. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  19. A review of ADM1 extensions, applications, and analysis 2002-2005

    DEFF Research Database (Denmark)

    Batstone, Damien J.; Keller, J.; Steyer, J.-P.

    2006-01-01

    Since publication of the Scientific and Technical Report (STR) describing the ADM1, the model has been extensively used, and analysed in both academic and practical applications. Adoption of the ADM1 in popular systems analysis tools such as the new wastewater benchmark (BSM2), and its use as a...... for wastewater sludge digestion. One criticism of note is that the ADM1 stoichiometry focuses on catabolism rather than anabolism. This means that inorganic carbon can be used unrealistically as a carbon source during some anabolic reactions. Advances and novel applications have also been made in the...... present issue, which focuses on the ADM1. These papers also explore a number of novel areas not originally envisaged in this review....

  20. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system's properties. Measured and controlled quantities in the system are related to variables through functional relations, which need only be stated as names; their explicit composition need not be described to the tool. The user enters a list of these relations that together describe the entirety of the system ... of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.
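
    The flavor of this kind of graph-based structural analysis (not SaTool's own implementation) can be sketched with a bipartite graph of relations and variables, where a maximum matching exposes redundant relations usable for fault diagnosis:

        import networkx as nx
        from networkx.algorithms import bipartite

        G = nx.Graph()
        relations = ["r1", "r2", "r3"]   # constraints stated only by name
        variables = ["x", "y"]           # unknown system variables
        G.add_nodes_from(relations, bipartite=0)
        G.add_nodes_from(variables, bipartite=1)
        G.add_edges_from([("r1", "x"), ("r2", "x"), ("r2", "y"), ("r3", "y")])

        matching = bipartite.maximum_matching(G, top_nodes=relations)
        unmatched = [r for r in relations if r not in matching]
        print("redundant relations (analytical redundancy):", unmatched)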

  1. Condition analysis and operating lifetime extension concepts for wind turbines

    International Nuclear Information System (INIS)

    In Germany the basis for the expansion of wind energy was already laid at the beginning of the 1990s; hence, the first wind turbines have started to reach the end of their permitted lifetime. Then, as today, wind turbine types were engineered for an operational lifetime of 20 years. As reliable wind turbine types were already available in the 1990s, it is technically and commercially reasonable to consider the extension of their operational lifetime. Of particular interest is the lifetime extension of wind turbine types installed at the beginning of the 2000s, a period in which many wind turbine types were launched that still correspond to state-of-the-art technology.

  2. Spatial analysis of extension fracture systems: A process modeling approach

    Science.gov (United States)

    Ferguson, C.C.

    1985-01-01

    Little consensus exists on how best to analyze natural fracture spacings and their sequences. Field measurements and analyses published in the geotechnical literature imply fracture processes radically different from those assumed by theoretical structural geologists. The approach adopted in this paper recognizes that disruption of rock layers by layer-parallel extension results in two spacing distributions, one representing layer-fragment lengths and the other separation distances between fragments. These two distributions and their sequences reflect the mechanics and history of fracture and separation. Such distributions and sequences, represented by a 2 × n matrix of lengths L, can be analyzed using a method that is history sensitive and which also yields a scalar estimate of bulk extension, e(L). The method is illustrated by a series of Monte Carlo experiments representing a variety of fracture-and-separation processes, each with distinct implications for extension history. Resulting distributions of e(L) are process-specific, suggesting that the inverse problem of deducing fracture-and-separation history from final structure may be tractable. © 1985 Plenum Publishing Corporation.
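
    A toy version of the process-modeling idea, assuming bulk extension is estimated as total separation divided by total fragment length (the paper's exact estimator e(L) may differ):

        import random

        def simulate_extension(n_fragments, mean_frag=1.0, mean_gap=0.2, seed=0):
            """Simulate fragment/gap lengths and return an extension estimate."""
            rng = random.Random(seed)
            frags = [rng.expovariate(1.0 / mean_frag) for _ in range(n_fragments)]
            gaps = [rng.expovariate(1.0 / mean_gap) for _ in range(n_fragments - 1)]
            return sum(gaps) / sum(frags)    # engineering-strain style estimate

        print(simulate_extension(50))        # scatters around 0.2, as expected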

  3. Tool Gear: Infrastructure for Parallel Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  4. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    We present an "Integrated Protein Analysis Tool" (IPAT) that is able to perform the following tasks in segregating and annotating genomic data. Protein Editor: enables the entry of nucleotide/amino-acid sequences. Utilities: IPAT enables conversion of a given nucleotide sequence to the equivalent amino acid sequence. Secondary Structure Prediction: possible using three algorithms (GOR-I, the Gibrat method and DPM (Double Prediction Method)) with graphical display. Profiles and Properties: allows calculation of eight physico-chemical profiles and properties, viz. hydrophobicity, hydrophilicity, antigenicity, transmembranous regions, solvent accessibility, molecular weight, absorption factor and amino acid content. IPAT has a provision for viewing a helical-wheel projection of a selected region of a given protein sequence and a 2D representation of the alpha carbons. IPAT was developed using the UML (Unified Modeling Language) for modeling the project elements, coded in Java, and subjected to unit testing, path testing and integration testing. This project mainly concentrates on butyrylcholinesterase, to predict its secondary structure and its physico-chemical profiles and properties.
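
    IPAT itself is written in Java and its source is not reproduced in the abstract; the following Python sketch illustrates the nucleotide-to-amino-acid conversion utility described above, assuming the standard genetic code and reading frame 1.

        # Standard genetic code built from the canonical T, C, A, G codon ordering.
        bases = "TCAG"
        codons = [a + b + c for a in bases for b in bases for c in bases]
        amino_acids = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
        CODON_TABLE = dict(zip(codons, amino_acids))

        def translate(nucleotides: str) -> str:
            """Translate a DNA sequence (5'->3', frame 1) to a one-letter protein string."""
            seq = nucleotides.upper().replace("U", "T")       # accept RNA input too
            return "".join(CODON_TABLE.get(seq[i:i + 3], "X") # 'X' marks an unknown codon
                           for i in range(0, len(seq) - len(seq) % 3, 3))

        print(translate("ATGGCCATTGTAATG"))  # -> "MAIVM"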

  5. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  6. Weeds: a CLASS extension for the analysis of millimeter and sub-millimeter spectral surveys

    CERN Document Server

    Maret, S; Pety, J; Bardeau, S; Reynier, E

    2010-01-01

    The advent of large instantaneous bandwidth receivers and high spectral resolution spectrometers on (sub-)millimeter telescopes has opened up the possibilities for unbiased spectral surveys. Because of the large amount of data they contain, any analysis of these surveys requires dedicated software tools. Here we present an extension of the widely used CLASS software that we developed to that purpose. This extension, named Weeds, allows for searches in atomic and molecular line databases (e.g. JPL or CDMS) that may be accessed over the internet using a virtual observatory (VO) compliant protocol. The package permits a quick navigation across a spectral survey to search for lines of a given species. Weeds is also capable of modeling a spectrum, as often needed for line identification. We expect that Weeds will be useful for analyzing and interpreting the spectral surveys that will be done with the HIFI instrument on board Herschel, but also observations carried out with ground-based millimeter and sub-millimeter telescopes.

  7. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection and stresses.

  8. Extensions of positive definite functions applications and their harmonic analysis

    CERN Document Server

    Jorgensen, Palle; Tian, Feng

    2016-01-01

    This monograph deals with the mathematics of extending given partial data-sets obtained from experiments. Experimentalists frequently gather spectral data when the observed data are limited, e.g., by the precision of instruments or by other limiting external factors. Here the limited information is a restriction, and the extensions take the form of full positive definite functions on some prescribed group. It is therefore both an art and a science to produce solid conclusions from restricted or limited data. While the theory of positive definite functions is important in many areas of pure and applied mathematics, it is difficult for students and for the novice to the field to find accessible presentations which cover all relevant points of view, as well as stressing common ideas and interconnections. We have aimed at filling this gap, and we have stressed hands-on examples.

  9. Detection and analysis of radio pulses from extensive air showers

    International Nuclear Information System (INIS)

    Radio pulses from extensive air showers (EAS) at 30, 44 and 60 MHz frequencies have been studied, using wide-band broadside arrays of half-wave dipole antenna systems. The experimental results support the theoretical prediction that the field strength of radio emission depends on the shower size. An asymmetry has been noticed in the pulse-height distributions of radio pulses detected by North-South and East-West directed arrays. These observations are in agreement with the theory that the charge separation mechanism is predominant in generating radio pulses from EAS and that the radio emission is polarised in the East-West direction. Experimental data are compared with those of earlier workers. (author)

  10. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  11. Multidimensional Analysis: A Management Tool for Monitoring HIPAA Compliance and Departmental Performance

    OpenAIRE

    Coleman, Robert M.; Ralston, Matthew D.; Szafran, Alexander; Beaulieu, David M.

    2004-01-01

    Most RIS and PACS systems include extensive auditing capabilities as part of their security model, but inspecting those audit logs to obtain useful information can be a daunting task. Manual analysis of audit trails, though cumbersome, is often resorted to because of the difficulty of constructing queries to extract complex information from the audit logs. The approach proposed by the authors uses standard off-the-shelf multidimensional analysis software tools to assist the PACS/RIS administrato...

  12. Extensible Data Set Architecture for Systems Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The process of aircraft design requires the integration of data from individual analysis of aerodynamic, structural, thermal, and behavioral properties of a flight...

  13. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  14. Interactive Graphics Tools for Analysis of MOLA and Other Data

    Science.gov (United States)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

  15. Link between extension, dyking and subsidence as the reconstruction tool of intraplate rifting mechanism (backstripping data, modelling and geochronology)

    Science.gov (United States)

    Polyansky, Oleg P.; Reverdatto, Vladimir V.; Babichev, Alexey V.

    2014-05-01

    Correlation between subsidence and extension-related magmatism is key to determining the mechanism of intracratonic sedimentary basin formation. The total volume of basic sheet intrusions and volcanics within the sedimentary rock mass indirectly characterizes the degree of depletion and thinning of the rifted mantle lithosphere. At present the documented features of real-world intracontinental basins show a wide range of parameters characterizing the duration and rate of subsidence, the degree of extension/thinning of the lithosphere, and the age and extent of dyking. To create a general model of continental rifting it is important to reconstruct the evolution of basins that finished at the continental stage and did not enter an oceanic spreading phase. One example of such a structure is the Vilyui sedimentary basin in the eastern Siberian Platform, which includes the massive emplacements (~10^5 km^3) of extrusive and intrusive rocks of the Vilyui large igneous province. We combine backstripping reconstructions of sedimentation and of the thermal regime during subsidence with numerical modelling based on deformable-solid mechanics. This is the first time that the evolution of sedimentation and subsidence, which is nonuniform over the basin area, has been analyzed for the Vilyui basin. The rift origin of the basin is proved. We estimate the spatial distribution of the parameters of crustal and mantle-lithosphere extension as well as expansion due to dike intrusions. According to the reconstructions, the type of subsidence curve for the sedimentary rocks of the basin depends on the tectonic regime of sedimentation in individual subbasins. The backstripping analysis revealed two stages of extension (sediments 4-5 km thick) and a foreland stage (sediments >2 km thick). With the two-layered lithosphere model, we concluded that the subcrustal layer underwent predominant extension (by a factor of 1.2-2.0 vs. 1.1-1.4 in the crust). In each section, dyke-related extension due to basic intrusion is
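
    The abstract does not spell out the backstripping formulas; the sketch below shows the standard one-dimensional Airy-isostasy relation such reconstructions build on, with hypothetical densities and thicknesses rather than the paper's actual inputs.

        # One-dimensional Airy backstripping: remove the isostatic effect of the
        # sediment load to recover water-loaded tectonic subsidence.
        RHO_MANTLE, RHO_WATER = 3300.0, 1000.0   # kg/m^3

        def tectonic_subsidence(s, rho_sed, water_depth=0.0, delta_sl=0.0):
            """s: decompacted sediment thickness (m); rho_sed: mean sediment density."""
            airy = (RHO_MANTLE - rho_sed) / (RHO_MANTLE - RHO_WATER)
            sl_term = delta_sl * RHO_MANTLE / (RHO_MANTLE - RHO_WATER)
            return s * airy + water_depth - sl_term

        # Hypothetical column: 5 km of sediment with mean density 2450 kg/m^3.
        print(f"{tectonic_subsidence(5000.0, 2450.0):.0f} m of tectonic subsidence")  # ~1848 m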

  16. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    An example of a prototype for a digital conceptual design tool with integrated real time structural analysis is presented and compared with a more common Building Information Modelling (BIM) approach. It is concluded that a digital conceptual design tool with embedded real time structural analysis...

  17. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri;

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  18. A Multidimensional Analysis Tool for Visualizing Online Interactions

    Science.gov (United States)

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  19. The physics analysis tools project for the ATLAS experiment

    International Nuclear Information System (INIS)

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≅1 PB per year) will be used in searches for the Higgs boson and for physics beyond the standard model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, covering a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis tool-kits (frameworks) whose goal is to aid the physicist in performing an analysis while hiding the details of the ATHENA framework. (authors)

  20. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
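
    The authors' likelihood-ratio construction is more elaborate than can be shown here; as a minimal sketch of the idea, the following compares two samples of similarity scores (lab-vs-field against lab-vs-lab) with a normal-theory likelihood ratio, using invented score values.

        import numpy as np
        from scipy import stats

        def two_sample_lr(x, y):
            """Normal-theory LR statistic for H0: common mean vs H1: separate means
            (shared variance); -2 log LR is ~ chi-square(1) under H0."""
            n, m = len(x), len(y)
            pooled = np.concatenate([x, y])
            var0 = pooled.var()                           # MLE variance about grand mean
            var1 = (n * x.var() + m * y.var()) / (n + m)  # MLE variance about group means
            return (n + m) * np.log(var0 / var1)

        rng = np.random.default_rng(1)
        lab_vs_field = rng.normal(0.62, 0.08, 25)   # hypothetical similarity scores
        lab_vs_lab   = rng.normal(0.71, 0.08, 25)
        stat = two_sample_lr(lab_vs_field, lab_vs_lab)
        print(f"-2 log LR = {stat:.2f}, p = {stats.chi2.sf(stat, df=1):.4f}")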

  1. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  2. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    OpenAIRE

    Christiane Moser; Alina Krischkowsky; Katja Neureiter; Manfred Tscheligi

    2015-01-01

    Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adul...

  3. Preliminary analysis of knee stress in Full Extension Landing

    Directory of Open Access Journals (Sweden)

    Majid Davoodi Makinejad

    2013-09-01

    OBJECTIVE: This study provides an experimental and finite element analysis of knee-joint structure during extended-knee landing based on the extracted impact force, and it numerically identifies the contact pressure, stress distribution and possibility of bone-to-bone contact when a subject lands from a safe height. METHODS: The impact time and loads were measured via inverse dynamic analysis of free landing without knee flexion from three different heights (25, 50 and 75 cm), using five subjects with an average body mass index of 18.8. Three-dimensional data were developed from computed tomography scans and were reprocessed with modeling software before being imported and analyzed by finite element analysis software. The whole leg was considered to be a fixed middle-hinged structure, while impact loads were applied to the femur in an upward direction. RESULTS: Straight landing exerted an enormous amount of pressure on the knee joint as a result of the body's inability to utilize the lower extremity muscles, thereby maximizing the threat of injury when the load exceeds the height-safety threshold. CONCLUSIONS: The researchers conclude that extended-knee landing results in serious deformation of the meniscus and cartilage and increases the risk of bone-to-bone contact and serious knee injury when the load exceeds the threshold safety height. This risk is considerably greater than the risk of injury associated with walking downhill or flexion landing activities.

  4. The Application and Extension of Backward Software Analysis

    CERN Document Server

    Perisic, Aleksandar

    2010-01-01

    Backward software analysis is a method that emanates from executing a program backwards: instead of taking input data and following the execution path, we start from output data and, by executing the program backwards command by command, analyze the data that could lead to the current output. The changed perspective forces a developer to think in a new way about the program. It can be applied as a thorough procedure or as a casual method. With this method, we gain many advantages in testing and in algorithm and system analysis. For example, in testing the advantage is obvious if the set of output data is smaller than the set of possible inputs. For some programs or algorithms, we know the output data more precisely, so this backward analysis can help in reducing the number of test cases or even in strict verification of an algorithm. The difficulty lies in the fact that we need types of data that no programming language currently supports, so we need additional effort to understand how this method works, or what effort we need to ...

  5. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
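
    FEAT's digraph propagation can be pictured with a small reachability sketch: forward reachability gives failure effects, reverse reachability gives candidate causes. The graph below is a hypothetical illustration, not an actual FEAT model.

        # Hypothetical digraph: an edge u -> v means "failure of u propagates to v".
        digraph = {"pump_A": ["coolant_flow"], "valve_3": ["coolant_flow"],
                   "coolant_flow": ["core_temp"], "core_temp": ["trip_signal"],
                   "trip_signal": []}

        def reach(adj, start):
            """Depth-first reachability from a set of starting nodes."""
            seen, stack = set(start), list(start)
            while stack:
                for nxt in adj.get(stack.pop(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            return seen

        # Reverse the edges once to answer "what could have caused this?" queries.
        reverse = {}
        for u, vs in digraph.items():
            for v in vs:
                reverse.setdefault(v, []).append(u)

        print(reach(digraph, {"pump_A"}))     # effects of a pump failure
        print(reach(reverse, {"core_temp"}))  # candidate causes of high core temperature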

  6. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L.; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-01

    Summary Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. PMID:26766446

  7. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
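
    The patented tool's impedance model is not given in the abstract; as a rough sketch of the kind of calculation involved, the snippet below evaluates the textbook single-degree-of-freedom liner impedance (facesheet resistance and mass reactance over a hard-walled cavity whose normalized reactance is -cot(kd)). All parameter values are hypothetical.

        import numpy as np

        C0 = 343.0  # speed of sound in air, m/s

        def sdof_liner_impedance(freq_hz, cavity_depth, resistance=1.0, eff_neck_len=0.002):
            """Normalized (rho*c) surface impedance of a single-degree-of-freedom liner:
            z = R + j*(k*l_eff - cot(k*d)), with k the free-space wavenumber."""
            k = 2.0 * np.pi * freq_hz / C0
            return resistance + 1j * (k * eff_neck_len - 1.0 / np.tan(k * cavity_depth))

        for f in (1000.0, 2000.0, 3000.0):
            z = sdof_liner_impedance(f, cavity_depth=0.04)
            print(f"{f:5.0f} Hz: z = {z.real:.2f} {z.imag:+.2f}j")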

  8. Using Visual Tools for Analysis and Learning

    OpenAIRE

    Burton, Rob; Barlow, Nichola; Barker, Caroline

    2010-01-01

    This pack is intended as a resource for lecturers and students to facilitate the further development of their learning and teaching strategies. Visual tools were initially introduced within a module of the Year 3 nursing curriculum within the University of Huddersfield by Dr Rob Burton. Throughout the period of 2007-2008 a small team of lecturers with a keen interest in this teaching and learning strategy engaged in exploring and reviewing the literature. They also attended a series of loc...

  9. Tools for Physics Analysis in CMS

    International Nuclear Information System (INIS)

    The CMS Physics Analysis Toolkit (PAT) is presented. The PAT is a high-level analysis layer enabling the development of common analysis efforts across and within physics analysis groups. It aims at fulfilling the needs of most CMS analyses, providing both ease-of-use for the beginner and flexibility for the advanced user. The main PAT concepts are described in detail and some examples from realistic physics analyses are given.

  10. KIT multi-physics tools for the analysis of design and beyond design basis accidents of light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Victor Hugo; Miassoedov, Alexei; Steinbrueck, M.; Tromm, W. [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany)

    2016-05-15

    This paper describes the KIT numerical simulation tools under extension and validation for the analysis of design and beyond-design-basis accidents (DBA) of Light Water Reactors (LWR). The description of the complex thermal-hydraulic, neutron kinetics and chemo-physical phenomena going on during off-normal conditions requires the development of multi-physics and multi-scale simulation tools, which are fostered by the rapid increase in computer power nowadays. The KIT numerical tools for DBA and beyond-DBA are validated using experimental data from KIT or from abroad. The developments, extensions, coupling approaches and validation work performed at KIT are shortly outlined and discussed in this paper.

  11. KIT multi-physics tools for the analysis of design and beyond design basis accidents of light water reactors

    International Nuclear Information System (INIS)

    This paper describes the KIT numerical simulation tools under extension and validation for the analysis of design and beyond-design-basis accidents (DBA) of Light Water Reactors (LWR). The description of the complex thermal-hydraulic, neutron kinetics and chemo-physical phenomena going on during off-normal conditions requires the development of multi-physics and multi-scale simulation tools, which are fostered by the rapid increase in computer power nowadays. The KIT numerical tools for DBA and beyond-DBA are validated using experimental data from KIT or from abroad. The developments, extensions, coupling approaches and validation work performed at KIT are shortly outlined and discussed in this paper.

  12. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults’ collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We will describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  13. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved

  14. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    Science.gov (United States)

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. PMID:23892044

  15. Total life cycle management - assessment tool an exploratory analysis

    OpenAIRE

    Young, Brad de

    2008-01-01

    It is essential for the Marine Corps to ensure the successful supply, movement and maintenance of an armed force in peacetime and combat. Integral to an effective, long-term logistics plan is the ability to accurately forecast future requirements to sustain materiel readiness. The Total Life Cycle Management Assessment Tool (TLCM-AT) is a simulation tool combining operations, maintenance, and logistics. This exploratory analysis gives insight into the factors used by TLCM-AT beyond the tool's emb...

  16. A Tool Set for the Genome-Wide Analysis of Neurospora crassa by RT-PCR

    OpenAIRE

    HURLEY, JENNIFER M.; Dasgupta, Arko; Andrews, Peter; Crowell, Alexander M.; Ringelberg, Carol; Loros, Jennifer J.; Dunlap, Jay C

    2015-01-01

    Neurospora crassa is an important model organism for filamentous fungi as well as for circadian biology and photobiology. Although the community-accumulated tool set for the molecular analysis of Neurospora is extensive, two components are missing: (1) dependable reference genes whose levels of expression are relatively constant across light/dark cycles and as a function of time of day and (2) a catalog of primers specifically designed for real-time PCR (RT-PCR). To address the first of these ...
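
    Stable reference genes matter because of how relative expression is computed from RT-PCR cycle thresholds; a minimal sketch of the standard 2^(-ΔΔCt) calculation is shown below, with invented Ct values.

        def ddct_fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
            """Relative expression by the 2^(-ΔΔCt) method, normalizing the target
            gene to a stable reference gene (the kind this tool set provides)."""
            d_sample = ct_target_sample - ct_ref_sample
            d_control = ct_target_control - ct_ref_control
            return 2.0 ** (-(d_sample - d_control))

        # Hypothetical Ct values: target gene vs. a time-of-day-stable reference gene.
        print(f"fold change = {ddct_fold_change(22.1, 18.0, 24.3, 18.1):.2f}")  # ~4.29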

  17. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system

    International Nuclear Information System (INIS)

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. -- Highlights: • Urban extensive green-roof systems have a synergetic effect in mitigating urban runoff. • These systems improve runoff mitigation and decentralize urban water management. • They have a high water-retaining capacity in response to rainfall of less than 20 mm/h. • The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52. -- Extensive green roofs are an effective storm-water best-management practice and the proposed parameters can be applied to mitigate urban runoff

  18. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  19. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  20. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab(R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, analyze single and multiple fault scenarios and automatically generate parity relations for diagnosis for the system in normal and impaired conditions. User interface and algorithmic details are presented.

  1. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented, together with one invited paper and two short tool descriptions, were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient model checking, model-checking tools, symbolic model checking, visual tools, and verification of critical systems.

  2. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  3. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a database for storing system designs and results of analysis.

  4. Multivariate and 2D Extensions of Singular Spectrum Analysis with the Rssa Package

    Directory of Open Access Journals (Sweden)

    Nina Golyandina

    2015-10-01

    Implementation of multivariate and 2D extensions of singular spectrum analysis (SSA) by means of the R package Rssa is considered. The extensions include MSSA for simultaneous analysis and forecasting of several time series and 2D-SSA for analysis of digital images. A new extension of 2D-SSA called shaped 2D-SSA is introduced for the analysis of images of arbitrary shape, not necessarily rectangular. It is shown that the implementation of shaped 2D-SSA can serve as a basis for the implementation of MSSA and other generalizations. Efficient implementation of operations with Hankel and Hankel-block-Hankel matrices through the fast Fourier transform is suggested. Examples with code fragments in R, which explain the methodology and demonstrate the proper use of Rssa, are presented.
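
    The package itself is in R; the Python sketch below illustrates the FFT idea mentioned above, multiplying the Hankel trajectory matrix of a series by a vector without ever forming the matrix, assuming the usual definition H[i, j] = x[i + j].

        import numpy as np
        from scipy.signal import fftconvolve

        def hankel_matvec(x, v):
            """Multiply the L x K trajectory (Hankel) matrix H of series x by v
            without forming H: (H v)_i = sum_j x[i+j] v_j, a cross-correlation
            computed here via FFT convolution."""
            return fftconvolve(x, v[::-1], mode="valid")

        rng = np.random.default_rng(0)
        x = rng.standard_normal(10_000)        # time series of length N
        K = 4_000                              # window gives an L x K Hankel matrix
        v = rng.standard_normal(K)

        fast = hankel_matvec(x, v)
        # Check against the explicit matrix: rows of H are sliding windows of x.
        H = np.lib.stride_tricks.sliding_window_view(x, K)   # shape (L, K)
        assert np.allclose(fast, H @ v)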

  5. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  6. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    The purpose of this paper is to develop a simplified ship collision analysis tool in order to rapidly estimate the structural damage and energy absorption of both striking and struck ships, as well as to predict rupture of cargo oil tanks of struck tankers. The present tool calculates external and internal dynamics independently. The two-dimensional motions of both ships in the horizontal plane are taken into account. Structural deformation of both the striking and the struck ship is evaluated independently using a rigid-plastic simplified analysis procedure. The developed tool...

  7. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    The purpose of this paper is to develop a simplified ship collision analysis tool in order to rapidly estimate the structural damage and energy absorption of both striking and struck ships, as well as to predict rupture of cargo oil tanks of struck tankers. The present tool calculates external and internal dynamics independently. The two-dimensional motions of both ships in the horizontal plane are taken into account. Structural deformation of both the striking and the struck ship is evaluated independently using a rigid-plastic simplified analysis procedure. The developed tool...

  8. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of

  9. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
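
    H2FAST produces full pro-forma financial statements; as a toy illustration of the underlying discounted-cash-flow arithmetic, the sketch below computes a net present value for a hypothetical station (all figures invented, not NREL defaults).

        def npv(rate, cash_flows):
            """Net present value of yearly cash flows; cash_flows[0] falls at year 0."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        # Hypothetical station: $2.8M capital at year 0, then net revenue starting
        # at $150k/yr and growing 3%/yr over a 10-year period, discounted at 8%.
        flows = [-2_800_000] + [150_000 * 1.03 ** t for t in range(10)]
        print(f"NPV = ${npv(0.08, flows):,.0f}")   # deeply negative for these inputs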

  10. Complete analysis of extensions of D(n)1 permutation orbifolds

    International Nuclear Information System (INIS)

    We give the full set of S matrices for extensions of D(n)1 permutation orbifolds, extending our previous work to the yet unknown case of integer-spin spinor currents. The main tool is triality of SO(8). We also provide fixed point resolution matrices for spinor currents of D(n)1 permutation orbifolds with n even and not a multiple of four, where the spinor currents have half-integer spin.

  11. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks, such as refactoring or code navigation, have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for the development of these. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete...
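
    The thesis targets full JavaScript semantics, which is far beyond a snippet; as a minimal sketch of what a static points-to analysis computes, the following runs a flow-insensitive Andersen-style fixed point over a toy language with allocation, copy, field load and field store.

        def andersen_points_to(statements):
            """Flow-insensitive Andersen-style points-to analysis over four statement
            kinds: ('new', x, obj), ('copy', x, y) for x = y, ('load', x, y, f) for
            x = y.f, and ('store', x, f, y) for x.f = y."""
            pts = {}          # variable -> set of abstract objects
            heap = {}         # (object, field) -> set of abstract objects
            changed = True
            while changed:    # iterate to a fixed point
                changed = False
                def add(target, objs, store):
                    nonlocal changed
                    cur = store.setdefault(target, set())
                    if not objs <= cur:
                        cur |= objs
                        changed = True
                for st in statements:
                    kind = st[0]
                    if kind == "new":
                        add(st[1], {st[2]}, pts)
                    elif kind == "copy":
                        add(st[1], pts.get(st[2], set()), pts)
                    elif kind == "load":
                        for o in pts.get(st[2], set()):
                            add(st[1], heap.get((o, st[3]), set()), pts)
                    elif kind == "store":
                        for o in pts.get(st[1], set()):
                            add((o, st[3]), pts.get(st[2], set()), heap)
            return pts

        prog = [("new", "a", "o1"), ("copy", "b", "a"),
                ("store", "b", "f", "a"), ("load", "c", "a", "f")]
        print(andersen_points_to(prog))  # a, b and c all point to {o1}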

  12. Ethical objections against including life-extension costs in cost-effectiveness analysis: a consistent approach.

    Science.gov (United States)

    Gandjour, Afschin; Müller, Dirk

    2014-10-01

    One of the major ethical concerns regarding cost-effectiveness analysis in health care has been the inclusion of life-extension costs ("it is cheaper to let people die"). For this reason, many analysts have opted to rule out life-extension costs from the analysis. However, surprisingly little has been written in the health economics literature regarding this ethical concern and the resulting practice. The purpose of this work was to present a framework and potential solution for ethical objections against life-extension costs. This work found three levels of ethical concern: (i) with respect to all life-extension costs (disease-related and -unrelated); (ii) with respect to disease-unrelated costs only; and (iii) regarding disease-unrelated costs plus disease-related costs not influenced by the intervention. Excluding all life-extension costs for ethical reasons would require, for reasons of consistency, a simultaneous exclusion of savings from reducing morbidity. At the other extreme, excluding only disease-unrelated life-extension costs for ethical reasons would require, again for reasons of consistency, the exclusion of health gains due to treatment of unrelated diseases. Therefore, addressing ethical concerns regarding the inclusion of life-extension costs necessitates fundamental changes in the calculation of cost effectiveness. PMID:25027546

  13. A 3D image analysis tool for SPECT imaging

    Science.gov (United States)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
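
    As a minimal sketch of the intensity-based thresholding step (not the fuzzy-connectedness method, and with a synthetic phantom rather than SPECT data), the following delineates the largest bright structure in a 3D volume and reports its volume.

        import numpy as np
        from scipy import ndimage

        def segment_and_measure(volume, threshold, voxel_volume_ml):
            """Threshold a 3D volume, keep the largest connected component,
            and return its mask and volume in ml."""
            binary = volume >= threshold
            labels, n = ndimage.label(binary)          # 3D connected components
            if n == 0:
                return np.zeros_like(binary), 0.0
            sizes = ndimage.sum(binary, labels, index=range(1, n + 1))
            largest = labels == (np.argmax(sizes) + 1)
            return largest, largest.sum() * voxel_volume_ml

        # Hypothetical phantom: a bright 10-voxel-radius sphere in a noisy volume.
        rng = np.random.default_rng(0)
        vol = rng.normal(10, 2, (64, 64, 64))
        zz, yy, xx = np.ogrid[:64, :64, :64]
        vol[(zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2] += 50
        mask, ml = segment_and_measure(vol, threshold=40.0, voxel_volume_ml=0.064)
        print(f"segmented volume: {ml:.1f} ml")  # ~ (4/3)*pi*10^3 voxels * 0.064 ml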

  14. Risk analysis tools for force protection and infrastructure/asset protection

    International Nuclear Information System (INIS)

    The Security Systems and Technology Center at Sandia National Laboratories has for many years been involved in the development and use of vulnerability assessment and risk analysis tools. In particular, two of these tools, ASSESS and JTS, have been used extensively for Department of Energy facilities. Increasingly, Sandia has been called upon to evaluate critical assets and infrastructures, support DoD force protection activities and assist in the protection of facilities from terrorist attacks using weapons of mass destruction. Sandia is involved in many different activities related to security and force protection and is expanding its capabilities by developing new risk analysis tools to support a variety of users. One tool, in the very early stages of development, is EnSURE, Engineered Surety Using the Risk Equation. EnSURE addresses all of the risk equation and integrates the many components into a single, tool-supported process to help determine the most cost-effective ways to reduce risk. This paper will briefly discuss some of these risk analysis tools within the EnSURE framework
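
    The abstract does not expand the "risk equation"; the sketch below uses the widely cited security-risk form R = P_A x (1 - P_E) x C that vulnerability assessment tools of this kind are typically built around, with invented numbers.

        def security_risk(p_attack, p_effectiveness, consequence):
            """Classic security risk equation: R = P_A * (1 - P_E) * C, where P_A is
            the likelihood of attack, P_E the probability the protection system is
            effective, and C the consequence value (normalized to 0-1 here)."""
            return p_attack * (1.0 - p_effectiveness) * consequence

        # Hypothetical upgrade trade: raising system effectiveness from 0.60 to 0.90.
        baseline = security_risk(0.3, 0.60, 1.0)
        upgraded = security_risk(0.3, 0.90, 1.0)
        print(f"risk {baseline:.3f} -> {upgraded:.3f} ({1 - upgraded/baseline:.0%} reduction)")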

  15. Creating an anthropomorphic digital MR phantom—an extensible tool for comparing and evaluating quantitative imaging algorithms

    Science.gov (United States)

    Bosca, Ryan J.; Jackson, Edward F.

    2016-01-01

    Assessing and mitigating the various sources of bias and variance associated with image quantification algorithms is essential to the use of such algorithms in clinical research and practice. Assessment is usually accomplished with grid-based digital reference objects (DRO) or, more recently, digital anthropomorphic phantoms based on normal human anatomy. Publicly available digital anthropomorphic phantoms can provide a basis for generating realistic model-based DROs that incorporate the heterogeneity commonly found in pathology. Using a publicly available vascular input function (VIF) and digital anthropomorphic phantom of a normal human brain, a methodology was developed to generate a DRO based on the general kinetic model (GKM) that represented realistic and heterogeneously enhancing pathology. GKM parameters were estimated from a deidentified clinical dynamic contrast-enhanced (DCE) MRI exam. This clinical imaging volume was co-registered with a discrete tissue model, and model parameters estimated from clinical images were used to synthesize a DCE-MRI exam that consisted of normal brain tissues and a heterogeneously enhancing brain tumor. An example application of spatial smoothing was used to illustrate potential applications in assessing quantitative imaging algorithms. A voxel-wise Bland-Altman analysis demonstrated negligible differences between the parameters estimated with and without spatial smoothing (using a small radius Gaussian kernel). In this work, we reported an extensible methodology for generating model-based anthropomorphic DROs containing normal and pathological tissue that can be used to assess quantitative imaging algorithms.
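
    As a minimal sketch of the forward model named above, the snippet below synthesizes a tissue concentration curve from a vascular input function with the general kinetic (Tofts) model via discrete convolution; the VIF shape and parameter values are invented, not those of the cited phantom.

        import numpy as np

        def gkm_tissue_curve(t, vif, k_trans, k_ep):
            """Tofts general kinetic model: Ct(t) = Ktrans * integral of
            Cp(tau) * exp(-kep * (t - tau)) dtau, via discrete convolution."""
            dt = t[1] - t[0]
            kernel = np.exp(-k_ep * t)
            return k_trans * np.convolve(vif, kernel)[: len(t)] * dt

        t = np.arange(0, 300, 1.0)                      # seconds
        vif = 5.0 * (t / 30.0) * np.exp(1 - t / 30.0)   # hypothetical gamma-variate VIF
        ct = gkm_tissue_curve(t, vif, k_trans=0.25 / 60, k_ep=0.60 / 60)  # per-second rates
        print(f"peak tissue concentration: {ct.max():.3f} mM at t = {t[ct.argmax()]:.0f} s")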

  16. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  17. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, judo competition is an intriguing complex nonlinear system, with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed beforehand, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis could be seen as the master key in all situation sports (dual or team), like judo, to support in a useful way the difficult task of the coach, or better of national or Olympic coaching teams. In this paper a short summary of the most important methodological achievements in judo match analysis is presented. Also presented, in light of the latest technological improvements, is the first systematization toward new fiel...

  18. Development of microfluidic tools for cell analysis

    Czech Academy of Sciences Publication Activity Database

    Václavek, Tomáš; Křenková, Jana; Foret, František

    Brno: Ústav analytické chemie AV ČR, v. v. i, 2015 - (Foret, F.; Křenková, J.; Drobníková, I.; Klepárník, K.), s. 209-211 ISBN 978-80-904959-3-7. [CECE 2015. International Interdisciplinary Meeting on Bioanalysis /12./. Brno (CZ), 21.09.2015-23.09.2015] R&D Projects: GA ČR(CZ) GBP206/12/G014; GA ČR(CZ) GA14-06319S Institutional support: RVO:68081715 Keywords: microfluidic device * 3D printing * single cell analysis Subject RIV: CB - Analytical Chemistry, Separation http://www.ce-ce.org/CECE2015/CECE%202015%20proceedings_full.pdf

  19. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work — not only with textual representations of Petri nets — but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area. It describes some of the requirements which these tools must fulfil, in order to support the user in a natural and effective way. Finally some references are given to papers which describe examples of existing Petri net tools.
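
    As a small illustration of the kind of object such tools construct and analyze, the sketch below implements a minimal place/transition net with marking update on firing; the producer/consumer net is hypothetical.

        class PetriNet:
            """Minimal place/transition net: the marking maps place -> token count."""
            def __init__(self, transitions, marking):
                self.transitions = transitions      # name -> (consumed, produced) dicts
                self.marking = dict(marking)

            def enabled(self, name):
                consumed, _ = self.transitions[name]
                return all(self.marking.get(p, 0) >= n for p, n in consumed.items())

            def fire(self, name):
                if not self.enabled(name):
                    raise ValueError(f"transition {name!r} is not enabled")
                consumed, produced = self.transitions[name]
                for p, n in consumed.items():
                    self.marking[p] -= n
                for p, n in produced.items():
                    self.marking[p] = self.marking.get(p, 0) + n

        # Hypothetical producer/consumer net.
        net = PetriNet(
            transitions={"produce": ({"idle": 1}, {"buffer": 1, "idle": 1}),
                         "consume": ({"buffer": 1}, {"done": 1})},
            marking={"idle": 1},
        )
        net.fire("produce")
        net.fire("consume")
        print(net.marking)   # {'idle': 1, 'buffer': 0, 'done': 1}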

  20. Development of data analysis tool for combat system integration

    Science.gov (United States)

    Shin, Seung-Chun; Shin, Jong-Gye; Oh, Dae-Kyun

    2013-03-01

    System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm that it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is required. This paper proposes the Data Extraction, Recording and Analysis Tool (DERAT) for analyzing the integrated performance of the combat system, covering the functional definition, architecture and effectiveness of the DERAT, demonstrated through test results.

  1. Application of Multivariate Analysis Tools to Industrial Scale Fermentation Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregård, Rasmus; Stocks, Stuart M.;

    The analysis of batch process data can provide insight into the process operation, and there is a vast amount of historical data available for data mining. Empirical modelling utilising this data is desirable where there is a lack of understanding regarding the underlying process (Formenti et al. ...) ... concentration (Nomikos and MacGregor 1995). Multivariate analysis is a powerful tool for investigating large data sets by identification of trends in the data. However, there are also challenges associated with the application of multivariate analysis tools to batch process data. This is due to issues related ... application of multivariate methods to industrial scale process data to cover these considerations.
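
    As an illustration of the kind of trend analysis the abstract describes, the sketch below applies principal component analysis to batch-wise unfolded data. The synthetic data set, its dimensions and the scaling choices are assumptions made for the example, not details taken from the paper.

        import numpy as np

        # Synthetic batch data: 20 batches x 50 time points x 4 measured variables.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 50, 4)) + np.linspace(0, 1, 50)[None, :, None]

        # Batch-wise unfolding: each batch becomes one row of time-by-variable features.
        Xu = X.reshape(20, -1)

        # Autoscale, then PCA via the singular value decomposition.
        Xc = (Xu - Xu.mean(axis=0)) / Xu.std(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = U * s                     # batch scores in the latent space
        explained = s**2 / np.sum(s**2)    # fraction of variance per component

        print("first two components explain %.1f%% of the variance"
              % (100 * explained[:2].sum()))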

  2. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirement. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis.

  3. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to mathematical modeling of physical engineering systems. In this paper connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  4. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
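
    To give a flavor of what such control-flow checks involve, the toy pass below flags straight-line code that follows an unconditional transfer of control in a 68000-style listing. The mnemonic set, the listing format and the single-pass logic are simplifying assumptions for illustration, not the actual implementation of the tool set.

        # Instructions after an unconditional transfer (BRA/JMP/RTS) are dead
        # until the next label, which may be a branch target (a new entry point).
        UNCONDITIONAL = {"BRA", "JMP", "RTS"}

        def unreachable_lines(listing):
            flagged, reachable = [], True
            for i, line in enumerate(listing):
                tokens = line.replace(":", ": ").split()
                has_label = bool(tokens) and tokens[0].endswith(":")
                if has_label:
                    reachable = True
                if not reachable:
                    flagged.append(i)
                ops = tokens[1:] if has_label else tokens
                if ops and ops[0] in UNCONDITIONAL:
                    reachable = False
            return flagged

        listing = [
            "start:  MOVE.W  D0,D1",
            "        BRA     done",
            "        ADD.W   D1,D2",   # dead: follows an unconditional branch
            "done:   RTS",
        ]
        print(unreachable_lines(listing))   # -> [2]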

  5. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high-speed milling of titanium alloys, the high rate of tool failure is the main reason for the high manufacturing cost. In this study, fractured tools used in a titanium alloy 5-axis milling process have been observed both at the macro scale, using a PG-1000 light microscope, and at the micro scale, using a Scanning Electron Microscope (SEM). These observations indicate that most of these tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed. Future analysis regarding determining the underlying causes of the fatigue phenomenon is then outlined.

  6. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  8. An Integrated Traverse Planner and Analysis Tool for Planetary Exploration

    OpenAIRE

    Johnson, Aaron William; Hoffman, Jeffrey A.; Newman, Dava; Mazarico, Erwan Matias; Zuber, Maria

    2010-01-01

    Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. The Massachusetts Institute of Technology is currently developing such a system, called the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT). The goal of this system is twofold: to allow for realistic simulations of traverses in order to assist with har...

  9. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  10. A multi-criteria decision analysis tool to support electricity

    OpenAIRE

    Ribeiro, Fernando; Ferreira, Paula Varandas; Araújo, Maria Madalena Teixeira de

    2012-01-01

    A Multi-Criteria Decision Analysis (MCDA) tool was designed to support the evaluation of different electricity production scenarios. The MCDA tool is implemented in an Excel worksheet and uses information obtained from a mixed integer optimization model. Given the input, the MCDA tool ranks the different scenarios according to their performance on 13 criteria covering economic, job market, quality of life of local populations, technical and environmental issues. The criteria were weighte...
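
    To make the ranking step concrete, the sketch below performs the weighted-sum scoring that such a tool typically implements. The scenario names, the criteria subset, the performance values and the weights are all illustrative assumptions, not figures from the paper.

        # Weighted-sum ranking of electricity scenarios over three of the criteria
        # (performance values are normalized to [0, 1]; higher is better).
        scenarios = {
            "renewables-heavy": {"cost": 0.6, "jobs": 0.9, "emissions": 0.9},
            "gas-heavy":        {"cost": 0.8, "jobs": 0.5, "emissions": 0.3},
            "mixed":            {"cost": 0.7, "jobs": 0.7, "emissions": 0.6},
        }
        weights = {"cost": 0.5, "jobs": 0.2, "emissions": 0.3}

        scores = {name: sum(weights[c] * perf[c] for c in weights)
                  for name, perf in scenarios.items()}
        for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{name}: {score:.2f}")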

  11. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  12. Non-extensive diffusion entropy analysis: non-stationarity in teen birth phenomena

    OpenAIRE

    Scafetta, N.; Grigolini, P.; Hamilton, P; West, B. J.

    2002-01-01

    A complex process is often a balance between non-stationary and stationary components. We show how the non-extensive Tsallis q-entropy indicator may be interpreted as a measure of non-stationarity in time series. This is done by applying the non-extensive entropy formalism to the Diffusion Entropy Analysis (DEA). We apply the analysis to the study of the teen birth phenomenon. We find that the unmarried teen births are strongly influenced by social processes with memory. This memory is relate...
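
    For reference, the Tsallis q-entropy underlying the indicator has the standard form below; this is the textbook definition, which reduces to the Shannon entropy as q approaches 1, and it may differ in detail from the paper's notation.

        S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
        \qquad
        \lim_{q \to 1} S_q = -\sum_i p_i \ln p_i .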

  13. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, the association of Monte Carlo truth information to simulated data during software validation, which is an important part of the development of the physics analysis tools.

  14. Tool Gear: Infrastructure for Building Parallel Programming Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J M; Gyllenhaal, J

    2002-12-09

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  15. Problems Impacting Extension Program Quality at the County Level: Results from an Analysis of County Program Reviews Conducted in Florida

    Science.gov (United States)

    Harder, Amy; Moore, Austen; Mazurkewicz, Melissa; Benge, Matt

    2013-01-01

    Needs assessments are an important tool for informing organizational development efforts in Extension. The purpose of the study reported here was to identify problems faced by county units within UF/IFAS Extension during county program reviews. The findings were drawn from the reports created after five county units experienced program reviews in…

  16. ITERA: IDL Tool for Emission-line Ratio Analysis

    CERN Document Server

    Groves, Brent

    2010-01-01

    We present a new software tool to enable astronomers to easily compare observations of emission line ratios with those determined by photoionization and shock models, ITERA, the IDL Tool for Emission-line Ratio Analysis. This tool can plot ratios of emission lines predicted by models and allows for comparison of observed line ratios against grids of these models selected from model libraries associated with the tool. We provide details of the libraries of standard photoionization and shock models available with ITERA, and, in addition, present three example emission line ratio diagrams covering a range of wavelengths to demonstrate the capabilities of ITERA. ITERA, and associated libraries, is available from http://www.brentgroves.net/itera.html
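
    The core operation ITERA automates, comparing observed line ratios against a grid of model predictions, can be sketched as follows. The one-parameter grid here is fabricated for the example; ITERA ships real photoionization and shock model libraries.

        import numpy as np

        # Illustrative model grid: two line ratios as functions of an
        # ionization-parameter axis (made-up scalings, not real models).
        logU = np.linspace(-4, -1, 7)
        oiii_hb = 10 ** (0.5 * logU + 1.5)    # [OIII]/Hbeta along the grid
        nii_ha = 10 ** (-0.3 * logU - 1.2)    # [NII]/Halpha along the grid

        # An observed pair of line ratios, located against the grid in log space.
        obs = (np.log10(2.0), np.log10(0.3))
        d = np.hypot(np.log10(oiii_hb) - obs[0], np.log10(nii_ha) - obs[1])
        print("closest grid point: logU =", logU[np.argmin(d)])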

  17. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  18. Analysis Level Of Utilization Information And Communication Technology With The Competency Level Of Extension Workers

    Directory of Open Access Journals (Sweden)

    Veronice Veronice

    2015-01-01

    Extension places people as the subject of development and as human capital to be developed into independent and empowered (dignified) individuals, able to adapt to their environment and thus to improve the quality of life for themselves, their families and their communities. Clear professional competency standards for extension workers, and effective controls on the practice of the profession, supported by Information and Communication Technology (ICT), are therefore necessary. This research aimed to analyze the relationship between the level of competency and the level of ICT use by extension workers. The study was designed as descriptive, correlational survey research, using a quantitative analysis approach supported by descriptive and inferential statistics. The study was conducted in Bogor Regency, West Java Province. Based on this research it can be concluded that the level of ICT utilization in resource-related aspects is significantly related to extension workers' competence in understanding the potential of the region, entrepreneurial ability and the ability to guide network systems, while ICT use for varied extension materials and related information is significantly related to all levels of extension worker competence.

  19. GEPAS, a web-based tool for microarray data analysis and interpretation

    Science.gov (United States)

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has continuously been updated to keep pace with the state-of-the-art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the experiment. New options for time-course (or dose-response) experiments, microarray-based class prediction, new clustering methods and new tests for differential expression have been included. The new pipeliner module allows automating the execution of sequential analysis steps by means of a simple but powerful graphic interface. An extensive re-engineering of GEPAS has been carried out, which includes the use of web services and Web 2.0 technology features, a new user interface with persistent sessions and a new extended database of gene identifiers. GEPAS is nowadays the most quoted web tool in its field and is extensively used by researchers in many countries; its records indicate an average usage rate of 500 experiments per day. GEPAS is available at http://www.gepas.org. PMID:18508806

  20. Extensive risk analysis of mechanical failure for an epiphyseal hip prosthesis: a combined numerical-experimental approach.

    Science.gov (United States)

    Martelli, S; Taddei, F; Cristofolini, L; Gill, H S; Viceconti, M

    2011-02-01

    There has been recent renewed interest in proximal femur epiphyseal replacement as an alternative to conventional total hip replacement. In many branches of engineering, risk analysis has proved to be an efficient tool for avoiding premature failures of innovative devices. An extensive risk analysis procedure has been developed for epiphyseal hip prostheses and the predictions of this method have been compared to the known clinical outcomes of a well-established contemporary design, namely hip resurfacing devices. Clinical scenarios leading to revision (i.e. loosening, neck fracture and failure of the prosthetic component) were associated with potential failure modes (i.e. overload, fatigue, wear, fibrotic tissue differentiation and bone remodelling). Driving parameters of the corresponding failure mode were identified together with their safe thresholds. For each failure mode, a failure criterion was identified and studied under the most relevant physiological loading conditions. All failure modes were investigated with the most suitable investigation tool, either numerical or experimental. Results showed a low risk for each failure scenario either in the immediate postoperative period or in the long term. These findings are in agreement with those reported by the majority of clinical studies for correctly implanted devices. Although further work is needed to confirm the predictions of this method, it was concluded that the proposed risk analysis procedure has the potential to increase the efficacy of preclinical validation protocols for new epiphyseal replacement devices. PMID:21428147

  1. Economics of Tobacco Toolkit, Tool 2. Data for Economic Analysis

    OpenAIRE

    Czart, Christina; Chaloupka, Frank

    2013-01-01

    This tool provides a general introduction to 'the art' of building databases. It addresses a number of issues pertaining to the search, identification and preparation of data for meaningful economic analysis. It can best be thought of as a reference mechanism that provides support for the occasionally frustrated but endlessly hungry researcher working through the adventures of tobacco cont...

  2. Shape Analysis for Complex Systems Using Information Geometry Tools.

    OpenAIRE

    Sanctis, Angela De

    2012-01-01

    In this paper we use Information Geometry tools to statistically model patterns arising in complex systems and to describe their evolution in time. In particular, we focus on the analysis of images with medical applications and propose an index that can estimate the level of self-organization and predict future problems that may occur in these systems.

  3. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries. It is a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  4. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  5. A Spreadsheet Teaching Tool For Analysis Of Pipe Networks

    OpenAIRE

    El Bahrawy, Aly N.

    1997-01-01

    Spreadsheets are widely used in engineering to perform a variety of analysis and design calculations. They are also very attractive as educational tools due to their flexibility and efficiency. This paper demonstrates the use of spreadsheets in teaching the analysis of water pipe networks, which involves the calculation of pipe flows or nodal heads given the network layout, pipe characteristics (diameter, length, and roughness), and external flows. The network performance is better und...
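
    The abstract does not name the solution method, but loop-based pipe network analysis of this kind is classically done with the Hardy Cross iteration. The single-loop sketch below, with made-up resistance coefficients and initial flow guesses, shows the correction step a spreadsheet would implement cell by cell.

        # Hardy Cross iteration for one closed loop: head loss per pipe is
        # h = k*Q*|Q|, and the loop flow correction is dQ = -sum(h) / sum(2*k*|Q|).
        k = [2.0, 5.0, 3.0]      # illustrative pipe resistance coefficients
        Q = [1.0, -0.5, -0.5]    # initial flow guesses, signed by loop direction

        for _ in range(50):
            h = [ki * qi * abs(qi) for ki, qi in zip(k, Q)]
            dQ = -sum(h) / sum(2 * ki * abs(qi) for ki, qi in zip(k, Q))
            Q = [qi + dQ for qi in Q]
            if abs(dQ) < 1e-9:   # loop head losses balanced
                break

        print(["%.4f" % qi for qi in Q])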

  6. DFTCalc: a tool for efficient fault tree analysis (extended version)

    OpenAIRE

    Arnold, Florian; Belinfante, Axel; Berg, de, MT Mark; Guck, Dennis; Stoelinga, Mariëlle

    2013-01-01

    Effective risk management is a key to ensure that our nuclear power plants, medical equipment, and power grids are dependable; and is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ana...

  7. DFTCalc: a tool for efficient fault tree analysis

    OpenAIRE

    Arnold F.; Belinfante A.; Van Der Berg F.; Guck D.; Stoelinga M.

    2013-01-01

    Effective risk management is a key to ensure that our nuclear power plants, medical equipment, and power grids are dependable; and it is often required by law. Fault Tree Analysis (FTA) is a widely used methodology here, computing important dependability measures like system reliability. This paper presents DFTCalc, a powerful tool for FTA, providing (1) efficient fault tree modelling via compact representations; (2) effective analysis, allowing a wide range of dependability properties to be ...
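
    For orientation, static fault tree analysis over independent basic events reduces to the bottom-up probability computation sketched below. This is only the textbook baseline; DFTCalc's contribution lies in efficient handling of dynamic gates and compact state-space representations, which this sketch does not attempt.

        # Bottom-up evaluation of a static fault tree with independent basic events.
        def prob(node, basic):
            if isinstance(node, str):          # a basic event name
                return basic[node]
            gate, children = node[0], node[1:]
            ps = [prob(c, basic) for c in children]
            if gate == "AND":                  # fails only if all children fail
                out = 1.0
                for p in ps:
                    out *= p
                return out
            if gate == "OR":                   # fails if at least one child fails
                out = 1.0
                for p in ps:
                    out *= 1.0 - p
                return 1.0 - out
            raise ValueError(gate)

        basic = {"pump": 0.01, "valve": 0.02, "sensor": 0.05}
        tree = ("OR", ("AND", "pump", "valve"), "sensor")   # top event
        print(round(prob(tree, basic), 6))                  # -> 0.05019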

  8. json2run: a tool for experiment design & analysis

    OpenAIRE

    Urli, Tommaso

    2013-01-01

    json2run is a tool to automate the running, storage and analysis of experiments. The main advantage of json2run is that it allows a set of experiments to be described concisely as a JSON-formatted parameter tree. It also supports parallel execution of experiments, automatic parameter tuning through the F-Race framework and storage and analysis of experiments with MongoDB and R.
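
    The central idea, expanding a parameter tree into concrete experiment configurations, can be sketched as follows. The "and"/"or" node names and the toy schema are assumptions for illustration and differ from json2run's actual input format.

        import itertools
        import json

        # "and" nodes combine parameters (cartesian product); "or" nodes
        # offer alternatives. Real json2run trees are richer than this.
        tree = {"and": [
            {"name": "solver", "values": ["greedy", "tabu"]},
            {"or": [
                {"name": "seed", "values": [1, 2]},
                {"name": "restarts", "values": [10]},
            ]},
        ]}

        def expand(node):
            if "name" in node:
                return [{node["name"]: v} for v in node["values"]]
            if "or" in node:
                return [cfg for child in node["or"] for cfg in expand(child)]
            combos = itertools.product(*(expand(c) for c in node["and"]))
            return [{k: v for d in combo for k, v in d.items()} for combo in combos]

        for cfg in expand(tree):   # six experiment configurations
            print(json.dumps(cfg))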

  9. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increases the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.

  10. Discovery and New Frontiers Project Budget Analysis Tool

    Science.gov (United States)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses and compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.
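
    The roll-up arithmetic the abstract describes, spreading fixed-year mission costs over phases and inflating them to real-year dollars, might look like the sketch below. The inflation rate, phase profiles and mission figures are illustrative assumptions only.

        # Inflate fixed-year mission costs, spread over phases, into real-year
        # dollars, then roll up across missions by fiscal year.
        BASE_YEAR, RATE = 2011, 0.027   # assumed flat annual inflation rate
        missions = [
            # (name, base-year cost $M, start year, [(phase share, years), ...])
            ("Mission A", 450.0, 2012, [(0.7, 4), (0.3, 3)]),   # dev, then ops
            ("Mission B", 300.0, 2014, [(0.8, 3), (0.2, 2)]),
        ]

        totals = {}
        for name, cost, start, phases in missions:
            year = start
            for share, years in phases:
                annual = cost * share / years   # flat profile within each phase
                for _ in range(years):
                    real = annual * (1 + RATE) ** (year - BASE_YEAR)
                    totals[year] = totals.get(year, 0.0) + real
                    year += 1

        for y in sorted(totals):
            print(y, "%.1f $M (real year)" % totals[y])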

  11. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

    SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high performance computing. A performance tool, which usually collects and analyzes performance data, is an effective way of improving performance. This paper explores both the collection and analysis of performance data, and two innovative ideas are proposed: both types of runtime performance data, concerning both system load and application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; and the performance analysis should be Web-oriented, which can exploit the excellent portability and usability brought by the Internet. This paper presents a Web-oriented HPC (high performance computing) performance tool, which can collect information about both resource utilization, including the utilization ratio of CPU and memory, and program behavior during runtime, including statuses such as sending and computing, and visualize the information in the user's browser window with Java applets in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependency between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program progress and system load, find the performance bottleneck, and ultimately improve the performance of the application.

  12. Factors that Influence the Perceived Advantages and Relevance of Facebook as a Learning Tool: An Extension of the UTAUT

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Carvajal-Trujillo, Elena; Monge-Lozano, Pedro

    2014-01-01

    Social media technologies are becoming a fundamental component of education. This study extends the Unified Theory of Acceptance and Use of Technology (UTAUT) to identify factors that influence the perceived advantages and relevance of Facebook as a learning tool. The proposed model is based on previous models of UTAUT. Constructs from previous…

  13. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology. It is concerned with students' acquisition of culture encoded in symbols and the way students' sign consciousness, formed in the context of learning, affects their world cognition and interpersonal communication. The article introduces a set of tools that would enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy and with the formation of the younger generation's language picture of the world.

  14. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  15. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  16. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
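
    A small example of the kind of bound such graph analysis yields is sketched below: the critical path through a dataflow graph limits the schedule length no matter how many processors are available. The task set, costs and two-branch topology are invented for the example.

        import functools

        # Node costs and precedence edges of a small dataflow graph with two
        # parallel branches (illustrative numbers only).
        tasks = {"read": 2, "fftL": 5, "fftR": 5, "mix": 3, "out": 1}
        deps = {"read": [], "fftL": ["read"], "fftR": ["read"],
                "mix": ["fftL", "fftR"], "out": ["mix"]}

        @functools.lru_cache(maxsize=None)
        def finish(t):
            # earliest finish = own cost + latest predecessor finish time
            return tasks[t] + max((finish(d) for d in deps[t]), default=0)

        critical_path = max(finish(t) for t in tasks)   # 11 time units
        total_work = sum(tasks.values())                # 16 time units
        for p in (1, 2, 4):
            # classic bound: makespan >= max(critical path, total work / p)
            print(p, "processors: makespan >=",
                  max(critical_path, -(-total_work // p)))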

  17. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

  18. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  19. SMART: Statistical Metabolomics Analysis-An R Tool.

    Science.gov (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^-4 in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^-4 in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm . PMID:27248514

  20. RASTA: A generalized tool for radiation source term analysis

    International Nuclear Information System (INIS)

    A FORTRAN computer code has been written for generalized radiation source term preparation. The RASTA (Radiation Source Term Analysis) code calculates the neutron and photon sources for any input isotopic combination and collapses to a user-selected multigroup format. The code is very easy to use, requiring minimal input. It provides extensive output edits suitable for data analysis or direct input into radiation transport codes. RASTA runs on the SRS RS6000 workstation cluster, but it should be easily portable to other computers

  1. Virtual tool mark generation for efficient striation analysis.

    Science.gov (United States)

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-07-01

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners. PMID:24502818

  2. Virtual Tool Mark Generation for Efficient Striation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
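
    The two core steps the study describes, forming a virtual mark from the projected tip geometry and scoring its similarity to a measured cross-section, are sketched below on synthetic data. The lower-envelope projection and the normalized cross-correlation score are simplified stand-ins for the authors' statistical likelihood algorithm.

        import numpy as np

        # Step 1: a "virtual mark" as the lower envelope of the tip's point
        # cloud projected along the direction of tool travel (synthetic tip).
        rng = np.random.default_rng(1)
        x = rng.uniform(0, 1, 5000)    # position across the tip edge
        z = 0.1 * np.sin(12 * x) + rng.normal(0, 0.005, x.size)
        idx = np.digitize(x, np.linspace(0, 1, 201)) - 1
        virtual = np.full(200, np.inf)
        np.minimum.at(virtual, idx, z)   # per-bin minimum = mark profile

        # Step 2: compare with a measured plate profile using the best
        # normalized cross-correlation over lateral shifts.
        plate = np.roll(virtual, 7) + rng.normal(0, 0.002, virtual.size)

        def ncc(a, b):
            a, b = a - a.mean(), b - b.mean()
            return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

        best = max(ncc(np.roll(virtual, s), plate) for s in range(-20, 21))
        print("best correlation over shifts: %.3f" % best)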

  3. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy

  4. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study the plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened up for implementation of new features in the existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component life time, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems have been briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author

  5. On-line Tools for Solar Data Compiled at the Debrecen Observatory and Their Extensions with the Greenwich Sunspot Data

    Science.gov (United States)

    Baranyi, T.; Győri, L.; Ludmány, A.

    2016-08-01

    The primary task of the Debrecen Heliophysical Observatory (DHO) has been the most detailed, reliable, and precise documentation of the solar photospheric activity since 1958. This long-term effort resulted in various solar catalogs based on ground-based and space-borne observations. A series of sunspot databases and on-line tools were compiled at DHO: the Debrecen Photoheliographic Data (DPD, 1974 -), the dataset based on the Michelson Doppler Imager (MDI) of the Solar and Heliospheric Observatory (SOHO) called SOHO/MDI-Debrecen Data (SDD, 1996 - 2010), and the dataset based on the Helioseismic and Magnetic Imager (HMI) of the Solar Dynamics Observatory (SDO) called SDO/HMI-Debrecen Data (HMIDD, 2010 - ). User-friendly web-presentations and on-line tools were developed to visualize and search data. As a last step of the compilation, the revised version of Greenwich Photoheliographic Results (GPR, 1874 - 1976) catalog was converted to DPD format, and a homogeneous sunspot database covering more than 140 years was created. The database of images for the GPR era was completed with the full-disc drawings of the Hungarian historical observatories Ógyalla and Kalocsa (1872 - 1919) and with the polarity drawings of Mount Wilson Observatory. We describe the main characteristics of the available data and on-line tools.

  6. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  7. Multi-Spacecraft Analysis with Generic Visualization Tools

    Science.gov (United States)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  8. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
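
    A toy version of the relief-breakpoint idea, flagging abrupt slope changes along a longitudinal stream profile extracted from a DEM, might look like the following. The synthetic profile, the second-derivative test and the threshold are illustrative assumptions, not the published method.

        import numpy as np

        # Toy longitudinal profile: a smooth concave reach with an abrupt
        # step (the knickpoint) inserted halfway downstream.
        dist = np.linspace(0, 10000, 400)    # downstream distance (m)
        elev = 500 * np.exp(-dist / 6000)    # elevation (m)
        elev[200:] -= 40                     # sudden base-level drop

        slope = np.gradient(elev, dist)
        dslope = np.gradient(slope, dist)    # rate of change of slope

        # Flag points where slope changes far faster than is typical.
        knicks = np.flatnonzero(np.abs(dslope) > 5 * np.std(dslope))
        print("knickpoint near %.0f m downstream" % dist[knicks].mean())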

  9. Extensible Markup Language (XML) based analysis and comparison of heterogeneous databases

    OpenAIRE

    Halle, Robert F.

    2001-01-01

    This thesis describes an Extensible Markup Language (XML) based analysis and comparison method that could be used to identify equivalent components of heterogeneous databases. In the Department of Defense there currently exist multiple databases required to support command and control of some portion of the battlefield force. Interoperability between forces will become crucial as the force structure continues to be reduced. This interoperability will be facilitated through the integration of ...

  10. Mixed-signal power system emulator extension to solve unbalanced fault transient stability analysis

    OpenAIRE

    Lanz, Guillaume; Kyriakidis, Theodoras; Cherkaoui, Rachid; Kayal, Maher

    2014-01-01

    This paper presents the extension of a platform originally devoted to symmetrical transient stability analysis, into the domain of unbalanced faults. The aim of this solver is to increase the speed of dynamic stability assessment for power systems. It is based on an analog representation of the grid alongside dedicated digital resources for the simulation of the models of power network components. Using the symmetrical components theory, this platform can be adapted to handle unsymmetrical di...

  11. Systematic analysis of transverse momentum distribution and non-extensive thermodynamics theory

    CERN Document Server

    Sena, I

    2012-01-01

    A systematic analysis of transverse momentum distribution of hadrons produced in ultra-relativistic $p+p$ and $A+A$ collisions is presented. We investigate the effective temperature and the entropic parameter from the non-extensive thermodynamic theory of strong interaction. We conclude that the existence of a limiting effective temperature and of a limiting entropic parameter is in accordance with experimental data.

  12. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Science.gov (United States)

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were: (1) a checklist without risk estimation (Tool A); (2) a checklist with a risk scale (Tool B); (3) a risk calculation without a formal hazard identification stage (Tool C); and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350
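
    As a concrete example of the questionnaire-plus-matrix structure attributed to Tool D, the sketch below scores hazards on a likelihood-by-severity matrix. The category cut-offs and the hazard ratings are invented for illustration, not taken from the paper.

        # A 5x5 likelihood x severity matrix with three illustrative bands.
        def risk_level(likelihood, severity):
            score = likelihood * severity
            if score <= 4:
                return "low"
            return "medium" if score <= 9 else "high"

        # Hazards as rated by a (hypothetical) confined-space questionnaire.
        hazards = [("oxygen deficiency", 4, 5),
                   ("engulfment", 2, 5),
                   ("noise", 3, 2)]
        for name, likelihood, severity in hazards:
            print(f"{name}: {risk_level(likelihood, severity)} "
                  f"(score {likelihood * severity})")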

  13. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines

  14. Validating and Verifying a New Thermal-Hydraulic Analysis Tool

    International Nuclear Information System (INIS)

    The Idaho National Engineering and Environmental Laboratory (INEEL) has developed a new analysis tool by coupling the Fluent computational fluid dynamics (CFD) code to the RELAP5-3DC/ATHENA advanced thermal-hydraulic analysis code. This tool enables researchers to perform detailed, three-dimensional analyses using Fluent's CFD capability while the boundary conditions required by the Fluent calculation are provided by the balance-of-system model created using RELAP5-3DC/ATHENA. Both steady-state and transient calculations can be performed, using many working fluids and neutronics options ranging from point kinetics to three-dimensional models. A general description of the techniques used to couple the codes is given. The validation and verification (V and V) matrix is outlined. V and V is presently ongoing. (authors)
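
    The record gives only a general description of the coupling, but a domain-decomposed coupling of this kind typically alternates the two solvers within each time step, exchanging boundary conditions in one direction and integral feedback in the other. The sketch below is a schematic of that loop in Python; every method name (advance, set_boundary_conditions, apply_feedback) is a hypothetical stand-in, not the INEEL implementation.

        # Schematic explicit coupling of a CFD code and a system-level
        # thermal-hydraulics code; all solver methods are hypothetical.
        def run_coupled(cfd, system, t_end, dt):
            t = 0.0
            while t < t_end:
                # The system code advances the balance-of-plant model and
                # supplies boundary conditions for the CFD subdomain.
                bc = system.advance(dt)          # flows, temperatures, pressures
                cfd.set_boundary_conditions(bc)
                # The CFD code resolves the 3D region in detail and returns
                # integral quantities (e.g. pressure drop, heat transfer) that
                # the system model needs to close its own equations.
                feedback = cfd.advance(dt)
                system.apply_feedback(feedback)
                t += dt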

  15. Space mission scenario development and performance analysis tool

    Science.gov (United States)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach to rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions, and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle to meet the mission requirements, by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  16. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    Directory of Open Access Journals (Sweden)

    Chun-Hsiao Wu

    2016-05-01

    Full Text Available In this research, we propose a new concept for social media analysis called Social Sensor, an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media, with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and the appropriate social sensors for analysis. By adopting parameter templates, one can quickly apply the experience of other experts at the beginning of a new case or even create one's own templates. We have also modularized the analysis tools into two social sensors: a Language Sensor and a Text Sensor. A user evaluation was conducted, and the results showed that the usefulness, modularity, reusability, and manageability of the system were all rated very positively. The results also show that this tool can greatly reduce the time needed to perform data analysis, solve the problems encountered in the traditional analysis process, and obtain useful results. The experimental results reveal that the concept of the social sensor and the proposed system design are useful for big data analysis of social media.

  17. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    International Nuclear Information System (INIS)

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  18. COMPARISON OF MALAYSIA MANUFACTURING COMPANIES BY FINANCIAL STATEMENT ANALYSIS TOOLS

    OpenAIRE

    MALEK, Afagh; Mohammadi, Maryam; NASSIRI, Fardokht

    2012-01-01

    One of the best ways to get the expected results from trading in the stock market is to acquire a good evaluation of companies' performance. Accordingly, this study aims at comparing the financial performance of the Lb Aluminium Berhad and Seal Incorporated Berhad manufacturing companies, which are listed in the main market of the Malaysian stock exchange. The data were gathered from the annual reports of the companies during the last three years and analysed by financial statement analysis tools, which are ...

  19. Keel A Data Mining Tool: Analysis With Genetic

    Directory of Open Access Journals (Sweden)

    Ms. Pooja Mittal

    2012-06-01

    Full Text Available This work is related to KEEL (Knowledge Extraction based on Evolutionary Learning), an open source software tool that supports data management and provides a platform for the analysis of evolutionary learning for data mining problems of different kinds, including regression, classification, and unsupervised learning. It includes a large collection of evolutionary learning algorithms based on different approaches: Pittsburgh, Michigan. It empowers the user to perform a complete analysis of any genetic fuzzy system in comparison to existing ones, with a statistical test module for comparison.

  20. Analysis of assessment tools used in engineering degree programs

    OpenAIRE

    Martínez Martínez, María del Rosario; Olmedo Torre, Noelia; Amante García, Beatriz; Farrerons Vidal, Óscar; Cadenato Matia, Ana María

    2014-01-01

    This work presents an analysis of the assessment tools used by professors at the Universitat Politècnica de Catalunya to assess the generic competencies introduced in the Bachelor's Degrees in Engineering. In order to conduct this study, a survey was designed and administered anonymously to a sample of the professors most receptive to educational innovation at their own university. In total, 80 professors responded to this survey, of whom 26% turned out to be members of the un...

  1. Ethics Auditing and Conflict Analysis as Management Tools

    OpenAIRE

    Anu Virovere; Merle Rihma

    2008-01-01

    This paper deals with management tools like conflict analysis and ethics auditing. Ethics auditing is understood as an opportunity and agreement to devise a system to inform on ethical corporate behaviour. This system essentially aims to increase the transparency and credibility of a company's commitment to ethics. At the same time, the process of elaborating this system allows us to introduce the moral dimension into the company's actions and decisions, thereby completing a key dimension of ...

  2. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  3. Validation of retrofit analysis simulation tool: Lessons learned

    OpenAIRE

    Trcka, Marija; Pasini, Jose Miguel; Oggianu, Stella Maris

    2014-01-01

    It is well known that residential and commercial buildings account for about 40% of the overall energy consumed in the United States, and about the same percentage of CO2 emissions. Retrofitting existing old buildings, which account for 99% of the building stock, represents the best opportunity for achieving challenging energy and emission targets. United Technologies Research Center (UTC) has developed a methodology and tool that provides computational support for analysis and decision-making...

  4. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, an intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources and some applications is given.

  5. Medical image analysis of 3D CT images based on extensions of Haralick texture features

    Czech Academy of Sciences Publication Activity Database

    Tesař, Ludvík; Shimizu, A.; Smutek, D.; Kobatake, H.; Nawano, S.

    2008-01-01

    Roč. 32, č. 6 (2008), s. 513-520. ISSN 0895-6111 R&D Projects: GA AV ČR 1ET101050403; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : image segmentation * Gaussian mixture model * 3D image analysis Subject RIV: IN - Informatics, Computer Science Impact factor: 1.192, year: 2008 http://library.utia.cas.cz/separaty/2008/AS/tesar-medical image analysis of 3d ct image s based on extensions of haralick texture features.pdf
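
    As background on what these features compute: Haralick features are statistics of the gray-level co-occurrence matrix (GLCM) of an image. The minimal 2-D NumPy sketch below builds a GLCM for one pixel offset and evaluates two classic Haralick statistics; the paper's contribution is extending such features to 3-D CT volumes, which this illustrative fragment does not attempt.

        import numpy as np

        def glcm_horizontal(image, levels=8):
            """Gray-level co-occurrence matrix for the (0, 1) pixel offset."""
            # Quantize intensities to a small number of gray levels.
            q = np.floor(image / image.max() * (levels - 1)).astype(int)
            left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
            m = np.zeros((levels, levels))
            np.add.at(m, (left, right), 1.0)
            return m / m.sum()                     # joint probability p(i, j)

        def haralick_contrast_energy(p):
            i, j = np.indices(p.shape)
            contrast = np.sum((i - j) ** 2 * p)    # local intensity variation
            energy = np.sum(p ** 2)                # angular second moment
            return contrast, energy

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))                 # illustrative image
        print(haralick_contrast_energy(glcm_horizontal(img)))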

  6. Extension of the four-parameter logistic model for ELISA to multianalyte analysis.

    Science.gov (United States)

    Jones, G; Wortberg, M; Kreissig, S B; Bunch, D S; Gee, S J; Hammock, B D; Rocke, D M

    1994-12-28

    The standard implementation of enzyme-linked immunosorbent assay (ELISA) for single analytes can lead to false conclusions if cross-reacting compounds are present in the sample. This paper discusses the extension of the usual four-parameter logistic model for ELISA to the case of multiple cross-reacting analytes. The use of the extended model in multianalyte analysis (MELISA) is illustrated and compared with a more simplistic approach. Data on the analysis of a binary mixture of s-triazines suggest the superiority of the proposed model. This model is also suitable for other forms of immunoassay that use the four-parameter logistic curve. PMID:7822815
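
    For reference, the four-parameter logistic response referred to here is the standard sigmoid y = d + (a - d) / (1 + (x/c)^b). The SciPy sketch below fits it to single-analyte calibration data; the synthetic data and starting values are illustrative, and the paper's multianalyte extension generalizes this response function to mixtures of cross-reacting analytes.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, a, d, c, b):
            # a: response at zero dose, d: response at infinite dose,
            # c: inflection point (EC50), b: slope factor.
            return d + (a - d) / (1.0 + (x / c) ** b)

        # Illustrative calibration data for a single analyte.
        conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
        resp = four_pl(conc, 1.8, 0.1, 0.5, 1.2) \
               + 0.01 * np.random.default_rng(1).normal(size=conc.size)

        params, cov = curve_fit(four_pl, conc, resp, p0=[2.0, 0.0, 1.0, 1.0])
        print(params)   # estimated (a, d, c, b)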

  7. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Science.gov (United States)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  8. On-line Tools for Solar Data Compiled in the Debrecen Observatory and their Extensions with the Greenwich Sunspot Data

    CERN Document Server

    Baranyi, T; Ludmán, A

    2016-01-01

    The primary task of the Debrecen Heliophysical Observatory (DHO) has been the most detailed, reliable, and precise documentation of the solar photospheric activity since 1958. This long-term effort resulted in various solar catalogs based on ground-based and space-borne observations. A series of sunspot databases and on-line tools were compiled at DHO: the Debrecen Photoheliographic Data (DPD, 1974--), the dataset based on the Michelson Doppler Imager (MDI) of the Solar and Heliospheric Observatory (SOHO) called SOHO/MDI--Debrecen Data (SDD, 1996--2010), and the dataset based on the Helioseismic and Magnetic Imager (HMI) of the Solar Dynamics Observatory (SDO) called SDO/HMI--Debrecen Data (HMIDD, 2010--). User-friendly web-presentations and on-line tools were developed to visualize and search data. As a last step of compilation, the revised version of Greenwich Photoheliographic Results (GPR, 1874--1976) catalog was converted to DPD format, and a homogeneous sunspot database covering ...

  9. TTCScope - A scope-based TTC analysis tool

    CERN Document Server

    Moosavi, P

    2013-01-01

    This document describes a scope-based analysis tool for the TTC system. The software, ttcscope, was designed to sample the encoded TTC signal from a TTCex module with the aid of a LeCroy WaveRunner oscilloscope. From the sampled signal, the bunch crossing clock is recovered, along with the signal contents: level-1 accepts, TTC commands, and trigger types. Two use-cases are addressed: analysis of TTC signals and calibration of TTC crates. The latter includes calibration schemes for two signal phase shifts, one related to level-1 accepts, and the other to TTC commands.

  10. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no "one-size-fits-all" solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  11. Judo match analysis, a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

    In this second paper on match analysis, we analyze the competition steps in depth, showing the evolution of this tool at the national federation level on the basis of our first classification. Furthermore, it is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, helping in a useful way with the difficult task of the coach, especially for national or Olympic coaching teams. This paper presents a deeper study of judo competitions at a high level, from both the male and female points of view, explaining in the light of biomechanics not only the evolution of throws over time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  12. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  13. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Full Text Available Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.
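
    The STATA routines themselves are not reproduced in the record, but the basic quantity a network meta-analysis combines is easy to illustrate: under the consistency assumption, an indirect estimate of B versus C is obtained from two direct comparisons against a common comparator A (Bucher's adjusted indirect comparison). The Python sketch below uses illustrative numbers; sign conventions for the effect estimates vary across packages.

        import math

        # Direct treatment effects (e.g. log odds ratios) and standard errors,
        # both measured against the common comparator A; values are illustrative.
        d_ab, se_ab = 0.50, 0.15    # A vs B
        d_ac, se_ac = 0.20, 0.12    # A vs C

        # Indirect B vs C estimate under the consistency assumption.
        d_bc = d_ac - d_ab
        se_bc = math.sqrt(se_ab ** 2 + se_ac ** 2)
        ci = (d_bc - 1.96 * se_bc, d_bc + 1.96 * se_bc)
        print(d_bc, se_bc, ci)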

  14. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  15. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software that offers the best compromise between "scalability" and "ease-of-use." The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT's inclusive framework supports Message Passing Interface (MPI) parallelism as well as task farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others, in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
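
    As a flavor of the MPI-parallel scripting described here, the mpi4py sketch below task-farms a set of time steps across ranks and reduces the partial results; the per-step analyze function is a hypothetical stand-in, not UV-CDAT code.

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        time_steps = list(range(120))        # e.g. monthly fields, 10 years
        my_steps = time_steps[rank::size]    # round-robin task farming

        def analyze(step):
            # Hypothetical stand-in for a per-time-step climate diagnostic.
            return float(step) ** 0.5

        partial = sum(analyze(s) for s in my_steps)
        total = comm.reduce(partial, op=MPI.SUM, root=0)
        if rank == 0:
            print("aggregate diagnostic:", total)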

  16. CSTACK: A Web-Based Stacking Analysis Tool for Deep/Wide Chandra Surveys

    Science.gov (United States)

    Miyaji, Takamitsu; Griffiths, R. E.; C-COSMOS Team

    2008-03-01

    Stacking analysis is a strong tool for probing the average X-ray properties of X-ray faint objects as a class, each of which is fainter than the detection limit as an individual source. This is especially the case for deep/wide surveys with Chandra, with its superb spatial resolution and the existence of survey data on fields with extensive multiwavelength coverage. We present an easy-to-use web-based tool (http://saturn.phys.cmu.edu/cstack), which enables users to perform a stacking analysis on a number of Chandra survey fields. Currently supported are C-COSMOS, the Extended Chandra Deep Field South (proprietary access, password protected), and the Chandra Deep Fields South and North (guest access: user=password=guest). For an input list of positions (e.g., galaxies selected from an optical catalog), the WWW tool returns stacked Chandra images in soft and hard bands and statistical analysis results including bootstrap histograms. We present running examples on the C-COSMOS data. The next version will also include the use of an off-axis-dependent aperture size, automatic exclusion of resolved sources, and histograms of stacks at random positions.
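
    The essence of such a stacking analysis fits in a few lines: cut postage stamps out of an X-ray counts image at the input positions and average them, so that a mean signal emerges even when no individual source is detected. The array sizes, aperture, and injected flux below are illustrative.

        import numpy as np

        def stack(counts_image, positions, half=8):
            """Average (2*half+1)-pixel postage stamps centred on each position."""
            stamps = []
            for y, x in positions:
                cut = counts_image[y - half:y + half + 1, x - half:x + half + 1]
                if cut.shape == (2 * half + 1, 2 * half + 1):   # skip edge sources
                    stamps.append(cut)
            return np.mean(stamps, axis=0)

        # Illustrative data: Poisson background plus faint flux at known positions.
        rng = np.random.default_rng(2)
        img = rng.poisson(0.2, size=(512, 512)).astype(float)
        pos = [(100, 100), (250, 300), (400, 150)]
        for y, x in pos:
            img[y, x] += 0.5                  # individually undetectable signal
        print(stack(img, pos)[8, 8])          # central pixel of the stack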

  17. Extensions to the Joshua GDMS to support environmental science and analysis data handling requirements

    International Nuclear Information System (INIS)

    For the past ten years, a generalized data management system (GDMS) called JOSHUA has been in use at the Savannah River Laboratory. Originally designed and implemented to support nuclear reactor physics and safety computational applications, the system is now also supporting environmental science modeling and impact assessment. Extensions to the original system are being developed to meet new data handling requirements, which include more general owner-member record relationships occurring in geographically encoded data sets, unstructured (relational) inquiry capability, cartographic analysis and display, and offsite data exchange. This paper discusses the need for these capabilities, places them in perspective as generic scientific data management activities, and presents the planned context-free extensions to the basic JOSHUA GDMS.

  18. VO-Dance: an IVOA tool to easily publish data into the VO, and its extension for planetology requests

    Science.gov (United States)

    Smareglia, R.; Capria, M. T.; Molinaro, M.

    2012-09-01

    Data publishing through self-standing portals can be joined to VO resource publishing, i.e., astronomical resources deployed through VO-compliant services. Since the IVOA (International Virtual Observatory Alliance) provides many protocols and standards for the various data flavors (images, spectra, catalogues …), and since the data center has the goal of growing the number of hosted archives and provided services, the idea arose to find a way to easily deploy and maintain VO resources. VO-Dance is a Java web application developed at IA2 that addresses this idea by creating, in a dynamical way, VO resources out of database tables or views. It is structured to be potentially DBMS- and platform-independent and consists of three main components, including an internal DB to store resource descriptions and model metadata information and a RESTful web application to deploy the resources to the VO community. Its extension to planetology requests is under study to make the best use of INAF software development effort and archive efficiency.

  19. Gemma: a Generic, Extensible and Modular Multi-Sensor Navigation Analysis System

    Science.gov (United States)

    Navarro, J. A.; Parés, M. E.; Colomina, I.

    2016-06-01

    This paper presents the concept of an architecture for a system that helps researchers in the field of geomatics to speed up their daily research on kinematic geodesy, navigation, and positioning. The presented ideas correspond to an extensible and modular software system aimed at the development of new navigation and positioning algorithms as well as at the evaluation of the performance of sensors. The concept, already implemented in CTTC's system GEMMA, is generic and extensible. This means that it is possible to incorporate new navigation algorithms or sensors at no maintenance cost; only the effort related to the development tasks required to either create such algorithms or model the sensors needs to be taken into account. As a consequence, change poses a much smaller problem for CTTC's research activities in this specific area. This system includes several standalone tools that may be combined in different ways to accomplish various goals; that is, it may be used to perform a variety of tasks, for instance: (1) define positioning and navigation scenarios, (2) simulate different kinds of sensors, (3) validate new navigation algorithms, or (4) evaluate the quality of an estimated navigation solution.

  20. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  1. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architectures and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems and with the analysis of results is assumed on the part of the intended user group. The current EVAS_SAT operates within Microsoft Excel 2003 using a Visual Basic interface system.

  2. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages code samples shared by the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  3. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients, which applies sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation for the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
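
    The direct method described here integrates the sensitivity equations ds/dt = J s + ∂f/∂p alongside the model equation. The toy one-reaction mechanism below shows the idea, with SciPy's general-purpose integrator standing in for the paper's FORTRAN/Gear machinery; it checks the computed sensitivity against the analytic value.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy mechanism: dc/dt = -k * c, with sensitivity s = dc/dk.
        # Direct method: ds/dt = J*s + df/dk, where J = df/dc = -k and df/dk = -c.
        def rhs(t, y, k):
            c, s = y
            return [-k * c, -k * s - c]

        k = 0.3
        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], args=(k,), rtol=1e-8)
        c_end, s_end = sol.y[:, -1]
        # Analytic solution: c = exp(-k t), so dc/dk = -t * exp(-k t).
        print(s_end, -10.0 * np.exp(-k * 10.0))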

  4. Evolution of the design methodologies for the next generation of RPV: Extensive role of the thermal-hydraulics numerical tools

    International Nuclear Information System (INIS)

    The thermal-hydraulic design of the first PWRs was mainly based on an experimental approach, with a large series of tests on the main equipment (control rod guide tubes, RPV plenums, etc.), to check its performance. Development of CFD codes and computers now allows for complex simulations of hydraulic phenomena. Provided adequate qualification, these numerical tools are efficient means to determine the hydraulics of a given design, and to perform sensitivity studies for the optimization of new designs. Experiments always play their role, first for qualification, and for validation at the last stage of the design. The design of the European Pressurized water Reactor (EPR) is based on both hydraulic calculations and experiments, handled in a complementary approach. This paper describes the effort launched by Framatome-ANP on hydraulic calculations for the Reactor Pressure Vessel (RPV) of the EPR reactor. It concerns 3D calculations of the RPV inlet including cold legs, the RPV downcomer and lower plenum, and the RPV upper plenum up to and including the hot legs. It covers normal operating conditions, but also accidental conditions such as pressurized thermal shock (PTS) in a small-break loss-of-coolant accident (SB-LOCA). Those hydraulic studies have provided numerous useful pieces of information for the mechanical design of the RPV internals. (authors)

  5. Evolution of the design methodologies for the next generation of RPV : the extensive role of the thermal hydraulics numerical tools

    International Nuclear Information System (INIS)

    The thermal-hydraulic design of the first PWRs was mainly based on an experimental approach, with a large series of tests on the main equipment (control rod guide tubes, RPV plenums, etc.), to check its performance. Development of CFD codes and computers now allows for complex simulations of hydraulic phenomena. Provided adequate qualification, these numerical tools are efficient means to determine the hydraulics of a given design, and to perform sensitivity studies for the optimization of new designs. Experiments always play their role, first for qualification and finally for validation of the resulting design. The design of the European Pressurized water Reactor (EPR), jointly developed by FRAMATOME-ANP and EDF/GU (German Utilities), is based on both hydraulic calculations and experiments, handled in a complementary approach. This paper describes the collective effort launched by FRAMATOME-ANP and EDF on hydraulic calculations for the Reactor Pressure Vessel (RPV) of the EPR reactor. It concerns 3D calculations of the RPV inlet including cold legs, the RPV downcomer and lower plenum, and the RPV upper plenum up to and including the hot legs. It covers normal operating conditions, but also accidental conditions such as pressurized thermal shock (PTS) in a small-break loss-of-coolant accident (SB-LOCA). Those hydraulic studies have provided numerous useful pieces of information for the mechanical design of the RPV internals.

  6. Using the Virginia Cooperative Extension Climate Analysis Web Tool to Monitor, Predict, and Manage Corn Development

    OpenAIRE

    Thomason, Wade Everett; Alley, Marcus M., 1947-; Phillips, Steven B.; Parrish, David J. (David Joseph), 1943-; Raymond, F. D.

    2009-01-01

    Discusses the effects of weather conditions, in particular temperature, on the growth and development of corn plants. Also discusses using this internet site to help make decisions about mid-season management of corn fields.
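
    Tools of this kind typically track corn development through accumulated growing degree days (GDD). A common formulation for corn, the modified 86/50 method with temperatures in degrees Fahrenheit, is sketched below; it is stated as general agronomic background, not as the exact calculation used by the Virginia Cooperative Extension tool.

        def corn_gdd(t_max_f, t_min_f, base=50.0, ceiling=86.0):
            """Daily growing degree days for corn (modified 86/50 method)."""
            hi = min(max(t_max_f, base), ceiling)   # cap the high temp at 86 F
            lo = min(max(t_min_f, base), ceiling)   # raise the low temp to 50 F
            return (hi + lo) / 2.0 - base

        # Accumulate over a run of daily highs/lows (illustrative values).
        days = [(88.0, 64.0), (79.0, 58.0), (92.0, 70.0)]
        print(sum(corn_gdd(hi, lo) for hi, lo in days))   # -> 71.5 GDD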

  7. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have in recent years seen an increase in complexity due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and data mining a multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions for data mining, but all boil down to the same idea: the process takes place to extract new information from large data collections, information that without the aid of data mining tools would be very difficult to obtain. Information obtained by the data mining process has the advantage that it not only responds to the question of what happens but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for analysis and exploitation of data stored in a multidimensional database.

  8. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
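
    For context, the enrichment score underlying tools like this is usually a hypergeometric (one-sided Fisher) tail probability: how surprising is the overlap between a study gene set and the genes annotated to a GO term? The sketch below uses illustrative counts and shows only the per-term test, not GOMA's module-finding optimization model.

        from scipy.stats import hypergeom

        # Illustrative counts: N genes in the background, K annotated to the GO
        # term, n genes in the study set, k of which carry the annotation.
        N, K, n, k = 20000, 150, 300, 12

        # P(overlap >= k) under random sampling without replacement.
        p_value = hypergeom.sf(k - 1, N, K, n)
        print(p_value)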

  9. Analysis and processing tools for nuclear trade related data

    International Nuclear Information System (INIS)

    This paper describes the development of a system used by the Nuclear Trade Analysis Unit of the Department of Safeguards for handling, processing, analyzing, reporting and storing nuclear trade related data. The data handling and analysis part of the system is already functional, but several additional features are being added to optimize its use. The aim is to develop the system in a manner that actively contributes to the management of the Department's overall knowledge and supports the departmental State evaluation process. Much of the data originates from primary sources and comes in many different formats and languages. It also comes with diverse security needs. The design of the system has to meet the special challenges set by the large volume and different types of data that need to be handled in a secure and reliable environment. Data is stored in a form appropriate for access and analysis in both structured and unstructured formats. The structured data is entered into a database (knowledge base) called the Procurement Tracking System (PTS). PTS allows effective linking, visualization and analysis of new data with that already included in the system. The unstructured data is stored in text-searchable folders (information base) equipped with indexing and search capabilities. Several other tools are linked to the system, including a visual analysis tool for structured information and a system for visualizing unstructured data, all of which are designed to help the analyst locate the specific information required amongst a myriad of unrelated information. This paper describes the system's concept, design and evolution, highlighting its special features and capabilities, which include the need to standardize the data collection, entry and analysis processes. All this enables the analyst to approach tasks consistently and in a manner that both enhances teamwork and leads to the development of an institutional memory related to covert trade activities that can be

  10. Reticulate evolutionary history and extensive introgression in mosquito species revealed by phylogenetic network analysis.

    Science.gov (United States)

    Wen, Dingqiao; Yu, Yun; Hahn, Matthew W; Nakhleh, Luay

    2016-06-01

    The role of hybridization and subsequent introgression has been demonstrated in an increasing number of species. Recently, Fontaine et al. (Science, 347, 2015, 1258524) conducted a phylogenomic analysis of six members of the Anopheles gambiae species complex. Their analysis revealed a reticulate evolutionary history and pointed to extensive introgression on all four autosomal arms. The study further highlighted the complex evolutionary signals that the co-occurrence of incomplete lineage sorting (ILS) and introgression can give rise to in phylogenomic analyses. While tree-based methodologies were used in the study, phylogenetic networks provide a more natural model to capture reticulate evolutionary histories. In this work, we reanalyse the Anopheles data using a recently devised framework that combines the multispecies coalescent with phylogenetic networks. This framework allows us to capture ILS and introgression simultaneously, and forms the basis for statistical methods for inferring reticulate evolutionary histories. The new analysis reveals a phylogenetic network with multiple hybridization events, some of which differ from those reported in the original study. To elucidate the extent and patterns of introgression across the genome, we devise a new method that quantifies the use of reticulation branches in the phylogenetic network by each genomic region. Applying the method to the mosquito data set reveals the evolutionary history of all the chromosomes. This study highlights the utility of 'network thinking' and the new insights it can uncover, in particular in phylogenomic analyses of large data sets with extensive gene tree incongruence. PMID:26808290

  11. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis. PMID:20075480

  12. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  13. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
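
    The figure of merit for the simple Linde-Hampson liquefier analyzed here is the liquid yield from a steady-state energy balance over the heat exchanger, throttle valve, and separator: y = (h1 - h2) / (h1 - hf), where h1 is the enthalpy of the low-pressure gas at ambient temperature, h2 that of the high-pressure gas at ambient temperature, and hf that of the saturated liquid. The enthalpy values below are illustrative stand-ins for numbers a property package such as the one in Aspen HYSYS would supply.

        def linde_yield(h1, h2, hf):
            """Liquid fraction from the energy balance h2 = y*hf + (1 - y)*h1."""
            return (h1 - h2) / (h1 - hf)

        # Illustrative enthalpies for air in kJ/kg (not from the record):
        h1 = 290.0    # low-pressure gas at ambient temperature
        h2 = 260.0    # high-pressure gas at ambient temperature
        hf = -130.0   # saturated liquid at separator pressure
        print(linde_yield(h1, h2, hf))   # approx. 0.07 liquid fraction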

  15. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  16. Anaphe—OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O. Couet; B. Ferrero-Merlino; et al.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages, thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. This paper will give an overview of the architecture and design choices and will present the current status and future developments of the project.

  17. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and

  18. Model extension and improvement for simulator-based software safety analysis

    International Nuclear Information System (INIS)

    One of the major concerns when employing digital I and C systems in nuclear power plants is that a digital system may introduce new failure modes, which differ from those of the previous analog I and C systems. Various techniques are under development to analyze the hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these techniques are static analysis methods that cannot capture dynamic behavior or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in (IEEE Std 7-4.3.2-2003, 2003. IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I and C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. The benchmark against the ABWR SAR shows this modified model is capable of accomplishing dynamic system-level software safety analysis and performs better than the static methods. This improved plant simulation can then be further applied to hazard analysis for operator/digital I and C interface interaction failure studies, and to hardware-in-the-loop fault injection studies.

  19. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U;

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quantitative phylogenetic and functional compositions of metagenomes, to compare compositions of multiple metagenomes and to produce intuitive visual representations of such analyses. AVAILABILITY: SmashCommunity is freely available at http://www.bork.embl.de/software/smash CONTACT: bork@embl.de

  20. A new bioinformatics analysis tools framework at EMBL–EBI

    OpenAIRE

    Goujon, Mickael; McWilliam, Hamish; Li, Weizhong; Valentin, Franck; Squizzato, Silvano; Paern, Juri; Lopez, Rodrigo

    2010-01-01

    The EMBL-EBI provides access to various mainstream sequence analysis applications. These include sequence similarity search services such as BLAST, FASTA, InterProScan and multiple sequence alignment tools such as ClustalW, T-Coffee and MUSCLE. Through the sequence similarity search services, the users can search mainstream sequence databases such as EMBL-Bank and UniProt, and more than 2000 completed genomes and proteomes. We present here a new framework aimed at both novice as well as exper...

  1. Modeling energy technology choices. Which investment analysis tools are appropriate?

    International Nuclear Information System (INIS)

    A variety of tools from modern investment theory appear to hold promise for unraveling observed energy technology investment behavior that often appears anomalous when analyzed using traditional investment analysis methods. This paper reviews the assumptions and important insights of the investment theories most commonly suggested as candidates for explaining the apparent "energy technology investment paradox". The applicability of each theory is considered in the light of important aspects of energy technology investment problems, such as sunk costs, uncertainty and imperfect information. The theories addressed include the capital asset pricing model, the arbitrage pricing theory, and the theory of irreversible investment. Enhanced net present value methods are also considered. (author)
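
    A toy numerical contrast between a naive expected-NPV rule and the option value of waiting that the theory of irreversible investment emphasizes may help here; all cash flows, rates and probabilities below are invented for illustration and are not drawn from the paper.

    ```python
    # Toy comparison of a naive expected-NPV rule with a simple "option to
    # wait" adjustment in the spirit of irreversible-investment theory.
    # All numbers are illustrative assumptions.

    def npv(cash_flows, rate):
        """Discounted sum of cash flows; cash_flows[0] occurs at t = 0."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    cost = 100.0      # sunk, irreversible investment cost
    rate = 0.08       # discount rate
    # Uncertain annual revenue: resolves next year to 18 or 6 with equal odds.
    flows_high = [-cost] + [18.0] * 15
    flows_low = [-cost] + [6.0] * 15

    npv_now = 0.5 * npv(flows_high, rate) + 0.5 * npv(flows_low, rate)

    # Waiting one year: invest only if the favorable state is revealed.
    npv_wait = 0.5 * npv(flows_high, rate) / (1 + rate)

    print(f"invest now:  {npv_now:7.2f}")
    print(f"wait a year: {npv_wait:7.2f}  (option value of waiting)")
    ```

    With these numbers the naive rule says "invest now" (NPV slightly positive), while waiting is worth roughly ten times more, which is the kind of behavior that looks paradoxical under traditional analysis.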

  2. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
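
    A generic sketch of the kind of coupled battery/application calculation described above: capacity fade driven by temperature, time and cycling, compared across climates. The fade model and every coefficient are illustrative assumptions, not NREL's actual BLAST degradation model.

    ```python
    import numpy as np

    # Illustrative capacity-fade model: Arrhenius-like calendar fade plus
    # cycling fade. All coefficients are invented for demonstration.

    def capacity_fade(days, temp_c, cycles_per_day, dod):
        """Relative capacity after `days`, combining calendar and cycle fade."""
        t_kelvin = temp_c + 273.15
        # Calendar fade: temperature-sensitive, sqrt-of-time kinetics.
        k_cal = 2.5e-2 * np.exp(-4000.0 * (1.0 / t_kelvin - 1.0 / 298.15))
        calendar_loss = k_cal * np.sqrt(days / 365.0)
        # Cycle fade: grows with cycle count and depth of discharge (DOD).
        cycle_loss = 1.5e-5 * cycles_per_day * days * dod**1.5
        return 1.0 - calendar_loss - cycle_loss

    # Compare a hot and a mild climate over 10 years of daily cycling.
    for city, temp in [("Phoenix", 30.0), ("Seattle", 13.0)]:
        cap = capacity_fade(days=3650, temp_c=temp, cycles_per_day=1.0, dod=0.8)
        print(f"{city:8s}: {100 * cap:.1f}% capacity after 10 years")
    ```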

  3. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force manufacturers to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet these specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has shown that more efficient and less costly PV designs should be possible by using SAVANT to predict on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  4. Introducing GIS to TransNav and its Extensive Maritime Application: An Innovative Tool for Intelligent Decision Making?

    Directory of Open Access Journals (Sweden)

    Angelica M. Baylon

    2013-12-01

    Full Text Available This paper aims to introduce GIS, its definition, principles, applications in any discipline, particularly maritime, its processes, data sets and features, and its benefits to maritime industries and universities. Specifically, the paper intends to provide an overview of its wide applications in maritime fields including but not limited to marine transportation, the marine environment, port management and operation, maritime education and training (MET) and maritime research. GIS's simplest task is mapping and visualization, but its most important function is spatial analysis. Spatial analysis takes into account the location, geometry, topology, and relationships of geographic data, which lend themselves to intelligent decision making. GIS is not just for researchers and students. GIS is especially useful for decision makers such as managers, administrators, and directors of large and small projects. Scenarios are "seen" and analyzed even before events happen. To planners and decision makers this is very important, because they can assess the impact of an event or scenario and may save a lot of time, effort, and money before implementing the actual project. An additional skill in GIS, when learned or taught, would certainly result in a technically competent global maritime workforce. The paper also provides ideas on possible areas for collaboration among TransNav member institutions for data sharing, with the shared data processed and analyzed by a GIS specialist.

  5. Analysis of the extension and reduction method of qualified life used in equipment with environmental qualification

    International Nuclear Information System (INIS)

    With the purpose of reducing costs of acquisition, maintenance, design, man-hours, dose, production, etc., or because of changes in service temperature, various nuclear plants (NPs) around the world have carried out extensions and reductions of the qualified life of equipment and/or components related to safety. The methods used are mainly type tests on artificially aged equipment or components, type tests on naturally aged equipment or components, and use of the Arrhenius model with in-place temperature monitoring. The present article analyzes the Arrhenius model with in-place temperature monitoring, which is presented in two variants, equations (1) and (2), since it is the method most used by NPs for the extension and reduction of qualified life. As background, a search was made of investigations, applied to diverse fields of science, that deal with the reliability of the results provided by the Arrhenius model for the aging case. The results of the analysis indicate: 1) that this method introduces uncertainties in some temperature intervals, and 2) which of the two variants provides more reliable data. (Author)
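
    A minimal sketch of the Arrhenius calculation the article analyzes: the qualified life established at the qualification temperature is rescaled to the monitored in-service temperature. The activation energy and all temperatures below are illustrative assumptions, not values from the article.

    ```python
    import math

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def arrhenius_life(l_qual_years, ea_ev, t_qual_c, t_service_c):
        """Qualified life at the service temperature, scaled from the life
        established at the qualification temperature (Arrhenius model)."""
        t_qual = t_qual_c + 273.15
        t_service = t_service_c + 273.15
        return l_qual_years * math.exp(
            (ea_ev / K_B) * (1.0 / t_service - 1.0 / t_qual))

    # Equipment qualified for 40 years at 50 C, activation energy 0.94 eV
    # (illustrative): cooler service extends life, hotter service reduces it.
    for t_service in (40.0, 45.0, 55.0):
        life = arrhenius_life(40.0, 0.94, 50.0, t_service)
        print(f"service at {t_service:.0f} C -> qualified life {life:6.1f} years")
    ```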

  6. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the

  7. The Lagrangian analysis tool LAGRANTO - version 2.0

    Science.gov (United States)

    Sprenger, M.; Wernli, H.

    2015-02-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions; e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.

  8. The LAGRANTO Lagrangian analysis tool - version 2.0

    Science.gov (United States)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10^-6 K m^2 kg^-1 s^-1). Full versions of LAGRANTO are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.
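
    To make the selection-criterion idea from both records concrete, here is a hedged sketch of how a criterion string such as "GT:PV:2" can be parsed and applied to arrays of trajectory data. The parsing, data layout and "holds at any time along the trajectory" convention are assumptions for illustration, not LAGRANTO's actual implementation.

    ```python
    import numpy as np

    # Map criterion operators to NumPy comparisons (assumed convention).
    OPS = {"GT": np.greater, "LT": np.less, "EQ": np.equal}

    def select(trajectories, criterion):
        """trajectories: dict of field name -> (n_traj, n_times) array.
        Returns a boolean mask of trajectories satisfying the criterion."""
        op_name, field, threshold = criterion.split(":")
        values = trajectories[field]
        # Keep a trajectory if the criterion holds at any time along it.
        return OPS[op_name](values, float(threshold)).any(axis=1)

    # Synthetic PV values for 500 trajectories with 20 time steps each.
    rng = np.random.default_rng(0)
    trajs = {"PV": rng.gamma(shape=1.5, scale=1.0, size=(500, 20))}
    mask = select(trajs, "GT:PV:2")
    print(f"{mask.sum()} of {len(mask)} trajectories satisfy GT:PV:2")
    ```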

  9. Topological Tools For The Analysis Of Active Region Filament Stability

    Science.gov (United States)

    DeLuca, Edward E.; Savcheva, A.; van Ballegooijen, A.; Pariat, E.; Aulanier, G.; Su, Y.

    2012-05-01

    The combination of accurate NLFFF models and high-resolution MHD simulations allows us to study the changes in stability of an active region filament before a CME. Our analysis strongly supports the following sequence of events leading up to the CME: first there is a build-up of magnetic flux in the filament through flux cancellation beneath a developing flux rope; as the flux rope develops, a hyperbolic flux tube (HFT) forms beneath it; reconnection across the HFT raises the flux rope while adding additional flux to it; the eruption is triggered when the flux rope becomes torus-unstable. The work applies topological analysis tools that have been developed over the past decade and points the way for future work on the critical problem of CME initiation in solar active regions. We will present the uses of this approach, its current limitations and future prospects.

  10. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) system of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  11. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
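
    As an illustration of the rule-based correlation idea (not the actual Message Analyzer DSL, whose syntax the records do not reproduce), the sketch below matches log messages against regex patterns and fires a rule once a count threshold is reached. Rule names and message formats are invented.

    ```python
    import re
    from collections import defaultdict

    RULES = [
        # (rule name, message pattern, count threshold within a batch)
        ("buffer_overflow_storm", re.compile(r"buffer overflow on node (\w+)"), 3),
        ("dcm_timeout", re.compile(r"DCM (\w+) timed out"), 1),
    ]

    def analyze(messages):
        """Return the rules that fire, with the components that triggered them."""
        hits = defaultdict(list)
        for msg in messages:
            for name, pattern, _threshold in RULES:
                m = pattern.search(msg)
                if m:
                    hits[name].append(m.group(1))
        return {name: comps for name, _pattern, threshold in RULES
                if len(comps := hits[name]) >= threshold}

    log = ["buffer overflow on node dcm-12", "run transition to Running",
           "buffer overflow on node dcm-12", "buffer overflow on node dcm-07"]
    print(analyze(log))  # -> {'buffer_overflow_storm': [...three nodes...]}
    ```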

  12. An interactive tool for the analysis of nuclear emulsion data

    International Nuclear Information System (INIS)

    In this paper, we present an interactive tool developed to analyze and display events recorded in nuclear emulsion experiments. Besides providing an excellent way to complement automatic emulsion scanning techniques, this program implements a new approach for the complete three-dimensional (3D) analysis of the events. The program uses information recorded during the automatic emulsion scanning procedure, commonly called 'video images'. The main features implemented are: automatic full track reconstruction up to wide angles (∼0.8 rad) with efficiency greater than 95%, vertex finding, 'kink' finding, two-dimensional 'microscope' visualization, and 3D full-view display and analysis. Performance was evaluated using a large set of video images containing manually checked neutrino interaction events and preliminary test beam exposure data

  13. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  14. PyRAT (python radiography analysis tool): overview

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library used is NOMAD, an implementation of nonsmooth optimization by the mesh adaptive direct search (MADS) algorithm. Some of PyRAT's features are: (1) a hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) use of Python libraries for radiographic image processing and analysis; (4) use of the Tikhonov regularization method of the linear inverse problem to recover partial information about object configurations; (5) use of a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem - initial data analysis + material library -> a priori knowledge; and (6) use of the NOMAD (C++ version) software in the object.
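
    Feature (4) above is a standard linear-algebra step. The sketch below applies Tikhonov regularization to a toy 1-D blurred projection; the forward operator, noise level and regularization weight are invented and stand in for PyRAT's calculated x-ray spectrum and detector response.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100
    x_true = np.zeros(n)
    x_true[35:65] = 1.0                      # simple slab "object"

    # Toy forward operator: row-normalized Gaussian blur matrix.
    idx = np.arange(n)
    A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 3.0) ** 2)
    A /= A.sum(axis=1, keepdims=True)
    b = A @ x_true + 0.01 * rng.standard_normal(n)   # noisy "radiograph"

    # Tikhonov: minimize ||Ax - b||^2 + lam^2 ||x||^2, solved in closed form.
    lam = 0.05
    x_hat = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)
    print(f"reconstruction error: {np.linalg.norm(x_hat - x_true):.3f}")
    ```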

  15. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design phase of the product development process.

  16. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for energy planners considering their electric power options in the future.

  17. A complete analysis of FCNC and CP constraints in general SUSY extensions of the standard model

    CERN Document Server

    Gabbiani, F; Masiero, A; Silvestrini, L

    1996-01-01

    We analyze the full set of constraints on gluino- and photino-mediated SUSY contributions to FCNC and CP-violating phenomena. We use the mass insertion method, hence providing a model-independent parameterization which can be readily applied in testing extensions of the MSSM. In addition to clarifying controversial points in the literature, we provide a more exhaustive analysis of the CP constraints, in particular concerning ε'/ε. As physically meaningful applications of our analysis, we study the implications in SUSY GUTs and effective supergravities with flavour non-universality. This allows us to detail the domain of applicability and the correct procedure of implementation of the FC mass insertion approach.

  18. A tool for finite element deflection analysis of wings

    Energy Technology Data Exchange (ETDEWEB)

    Carlen, Ingemar

    2005-03-01

    A first version (ver 0.1) of a new tool for finite element deflection analysis of wind turbine blades is presented. The software is called SOLDE (SOLid blaDE), and was developed as a Matlab shell around the free finite element codes CGX (GraphiX - pre-processor) and CCX (CrunchiX - solver). In the present report a brief description of SOLDE is given, followed by a basic user's guide. The main features of SOLDE are: - Deflection analysis of wind turbine blades, including 3D effects and warping. - Accurate prediction of eigenmodes and eigenfrequencies. - Derivation of 2-node slender elements for use in various aeroelastic analyses. The main differences between SOLDE and other similar tools can be summarised as: - SOLDE was developed without a graphical user interface or a traditional text file input deck. Instead, the input is organised as Matlab data structures that have to be formed by a user-provided pre-processor. - SOLDE uses a solid representation of the geometry instead of a thin shell approximation. The benefit is that the bending-torsion couplings are automatically captured correctly. However, a drawback with the current version is that the equivalent orthotropic shell idealisation violates the local bending characteristics, which makes the model useless for buckling analyses. - SOLDE includes the free finite element solver CCX, and thus no expensive commercial software (e.g., Ansys or Nastran) is required to produce results.

  19. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC Applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  20. An analysis of the Farm Services Centre (FSC) approach launched for agricultural extension in NWFP, Pakistan

    International Nuclear Information System (INIS)

    Agricultural extension services have a pivotal role in agricultural and rural development. They are the major source of technology dissemination and help farmers to rationalize the use of natural resources for sustainable agricultural development. Globally, the public-private partnership approach in agricultural extension is considered more effective, efficient, and responsive to different categories of farmers. In Pakistan, the government of North West Frontier Province (NWFP) has initiated a public-private partnership extension programme in the province, locally called the Farm Services Centre (FSC). This approach has an inbuilt mechanism of input delivery, market facilitation, exchange of experiences and diffusion of knowledge and technology. However, the extent to which this public-private partnership is instrumental in achieving the aforementioned objectives is yet to be established. The present study was an attempt to analyze this public-private partnership approach by measuring its strengths and weaknesses. For this purpose, out of the 24 districts of NWFP, two districts, namely Swabi and Lakimarwat, were selected randomly. From these two districts, 491 FSC member farmers were randomly selected as respondents for interviews. The analysis showed that the most prominent strength of the FSC was farmer empowerment, with mean 4.05 and SD 1.29, while that of the Agriculture Extension Department (AED) was effective message delivery. According to the respondents, the major weakness of both systems (FSC and AED) was the lack of a marketing facility, with means 4.12 and 4.13 and SDs 1.22 and 1.01, respectively. It is essential that the government ensure the mandated activities at the FSC forum, particularly facilitation by line agencies and NWFP Agricultural University, Peshawar. It should be a forum for technology dissemination, marketing of surplus agricultural produce and cooperative farming. The Agricultural Extension Department should provide more facilities to the staff involved in FSC

  1. Dynamic event tree analysis as a risk management tool

    International Nuclear Information System (INIS)

    This work demonstrates the use of dynamic event tree (DET) methods in the development of optimal severe accident management strategies. The ADAPT (Analysis of Dynamic Accident Progression Trees) software is a tool which can generate dynamic event trees (DETs) using a system model and a user-specified set of branching rules. DETs explicitly account for time in the system analysis and can more accurately model stochastic processes which can occur in a system compared to traditional event tree analysis. The kinds of information which can be extracted from a DET analysis (specifically, an ADAPT analysis) towards risk management and the advantages of such an approach are illustrated using a sodium fast reactor (SFR). The scenario studied is that of an aircraft crash which eliminates most reactor vessel auxiliary cooling system (RVACS) heat removal capability (three of four RVACS towers). Probability distributions on worker response time were developed and used as branching rules. In order to demonstrate the abilities of dynamic methodologies in the context of severe accident mitigation strategies, several recovery strategies are considered. Probability distributions of fuel temperature and tower recovery times are generated to determine which strategy results in the lowest overall risk. (authors)
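
    A toy sketch of the branching idea: sample a worker response time (the branching rule) many times and tally how often an assumed heat-up model exceeds a temperature limit. The thermal model, the lognormal distribution and the limit are invented for illustration and are not the ADAPT/SFR models of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    T_LIMIT = 900.0          # C, illustrative fuel temperature limit
    BRANCHES = 10_000        # sampled branches of the tree

    def peak_temperature(recovery_hours):
        """Toy heat-up while only 1 of 4 RVACS towers works (assumed linear)."""
        return 550.0 + 45.0 * recovery_hours

    failures = 0
    for _ in range(BRANCHES):
        # Branching rule: worker response time, lognormally distributed (assumed).
        recovery = rng.lognormal(mean=1.2, sigma=0.6)   # hours
        if peak_temperature(recovery) > T_LIMIT:
            failures += 1

    print(f"estimated P(fuel temp > {T_LIMIT:.0f} C) = {failures / BRANCHES:.4f}")
    ```

    Comparing this failure probability across candidate recovery strategies (each with its own response-time distribution) is the kind of quantitative ranking the DET approach enables.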

  2. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    Science.gov (United States)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. Field offices need a standardized, scientifically sound methodology for local climate analysis (such as trend, composite, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), the NOAA National Climatic Data Center, and the NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice at the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. The LCAT focus is on the local scale (as opposed to the national and global scales of CPC products). The LCAT will: improve the professional competency of local office staff and their expertise in providing local information to users, improving the quality of local climate services; ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, allowing improvement of CPC climate products; and allow testing of local climate variables beyond temperature averages and precipitation totals, such as climatology of

  3. Treatment Cost Analysis Tool (TCAT) for estimating costs of outpatient treatment services.

    Science.gov (United States)

    Flynn, Patrick M; Broome, Kirk M; Beaston-Blaakman, Aaron; Knight, Danica K; Horgan, Constance M; Shepard, Donald S

    2009-02-01

    A Microsoft Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1310, and $1381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 intensive, and $86 mixed. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment. PMID:19004576

  4. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytic...

  5. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; we then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
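
    The per-cell scoring pattern described above can be illustrated in a few lines. The metric names, weights and min-max scaling below are assumptions for demonstration, not Urban-CAT's actual metric list.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_cells = 40_000                      # "tens of thousands of cells"
    metrics = {
        "impervious_fraction": rng.uniform(0.0, 1.0, n_cells),
        "elevation_m": rng.uniform(150.0, 400.0, n_cells),
        "population_density": rng.gamma(2.0, 500.0, n_cells),
    }
    # Illustrative weights; negative weight: higher ground is less exposed.
    weights = {"impervious_fraction": 0.5, "elevation_m": -0.2,
               "population_density": 0.3}

    def minmax(x):
        """Rescale a metric to [0, 1] so weights are comparable."""
        return (x - x.min()) / (x.max() - x.min())

    score = sum(w * minmax(metrics[k]) for k, w in weights.items())
    worst = np.argsort(score)[-5:]
    print("most vulnerable cells:", worst, "scores:", np.round(score[worst], 3))
    ```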

  6. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build/test process. In addition, we created approximately 3,000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  7. PSIM: A TOOL FOR ANALYSIS OF DEVICE PAIRING METHODS

    Directory of Open Access Journals (Sweden)

    Yasir Arfat Malkani

    2009-10-01

    Full Text Available Wireless networks are commonplace nowadays and almost all modern devices support wireless communication in some form. These networks differ from more traditional computing systems due to the ad-hoc and spontaneous nature of interactions among devices. These systems are prone to security risks, such as eavesdropping, and require different techniques compared to traditional security mechanisms. Recently, secure device pairing in wireless environments has received substantial attention from many researchers. As a result, a significant set of techniques and protocols have been proposed to deal with this issue. Some of these techniques consider devices equipped with infrared, laser, ultrasound transceivers or 802.11 network interface cards; while others require embedded accelerometers, cameras and/or LEDs, displays, microphones and/or speakers. However, many of the proposed techniques or protocols have not been implemented at all, while others have been implemented and evaluated in a stand-alone manner without being compared with other related work [1]. We believe that this is because of the lack of specialized tools that provide a common platform to test the pairing methods. As a consequence, we designed such a tool. In this paper, we present the design and development of the Pairing Simulator (PSim) that can be used to perform the analysis of device pairing methods.

  8. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    Science.gov (United States)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non-Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction, with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power laws consistent with NESP and has attributes of universality, as it holds for a broad
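
    The NESP power laws invoked above are typically built from the Tsallis q-exponential. A minimal sketch follows, with illustrative q and scale values rather than fitted results from the Greek/Turkish catalogue.

    ```python
    import numpy as np

    def q_exp(x, q):
        """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
        base = 1.0 + (1.0 - q) * x
        return np.where(base > 0.0, base ** (1.0 / (1.0 - q)), 0.0)

    def interevent_survival(dt, q, tau):
        """Survival function P(>dt) modeled as a q-exponential decay;
        for q > 1 this produces the heavy power-law tail NESP predicts."""
        return q_exp(-dt / tau, q)

    dt = np.logspace(-1, 3, 5)            # interevent times (arbitrary units)
    for q in (1.1, 1.3, 1.5):
        tail = interevent_survival(dt, q, tau=10.0)
        print(f"q={q}: P(>dt) = {np.round(tail, 4)}")
    ```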

  9. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improvements in cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones arising during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  10. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    International Nuclear Information System (INIS)

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The speed of the calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered

  11. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
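
    The robust-distance idea behind the outlier-detection group can be sketched outside R as well, here with scikit-learn's MCD estimator (an analogous implementation, not the R package itself), using planted outliers:

    ```python
    import numpy as np
    from scipy.stats import chi2
    from sklearn.covariance import MinCovDet

    rng = np.random.default_rng(3)
    # Correlated 2-D data with 10 planted outliers.
    X = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=200)
    X[:10] += np.array([6.0, -6.0])

    # Robust location/scatter via Minimum Covariance Determinant (MCD);
    # robust Mahalanobis distances flag points a classical fit would mask.
    mcd = MinCovDet(random_state=0).fit(X)
    d2 = mcd.mahalanobis(X)               # squared robust distances
    cutoff = chi2.ppf(0.975, df=2)        # usual chi-square cutoff
    print(f"flagged {np.sum(d2 > cutoff)} outliers (10 planted)")
    ```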

  12. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the Graphical User Interface and the script! The examples are illustrated from a scripting perspective, because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.
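
    A minimal sketch of the kind of boundary testing the plan prescribes: for each field, attempt valid and invalid values and check that each is accepted or rejected. The field names, ranges and validators below are invented for illustration, not GMAT's actual field list.

    ```python
    FIELDS = {
        # field: (validator, valid examples, invalid examples) - hypothetical
        "Spacecraft.SMA": (lambda v: isinstance(v, (int, float)) and v > 0,
                           [7000.0, 1e7], [-1.0, 0, "abc"]),
        "Spacecraft.ECC": (lambda v: isinstance(v, (int, float)) and 0 <= v < 1,
                           [0.0, 0.99], [-0.1, 1.0, None]),
    }

    def run_field_tests():
        """Check that valid inputs pass and invalid inputs are rejected."""
        failures = []
        for field, (validate, valid, invalid) in FIELDS.items():
            for v in valid:
                if not validate(v):
                    failures.append((field, v, "valid input rejected"))
            for v in invalid:
                if validate(v):
                    failures.append((field, v, "invalid input accepted"))
        return failures

    print(run_field_tests() or "all field boundary tests passed")
    ```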

  13. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data.

  14. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The speed of the calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.

  15. Results of fractal analysis of the Kiel extensive air shower data

    International Nuclear Information System (INIS)

    For years there has been a problem in cosmic ray studies of how to distinguish individual extensive air showers (EAS) originating from primary protons, heavy nuclei or primary photons. In this paper, results are presented of experimental data obtained from the fractal analysis of particle density distributions in individual EAS detected in the range of shower sizes Ne between 1.4x10^5 and 5x10^6 by the old Kiel experiment. The Lipschitz-Hoelder exponent distributions of EAS detected by the Kiel experiment are discussed. Examples of EAS most probably originating from primary protons, heavy nuclei and high-energy gamma-rays are presented. The lateral distributions of charged particle densities at small distances, the angular and size spectra, and the mass composition of primary cosmic ray particles around the 'knee' of the energy spectrum are discussed. Monte Carlo simulation data illustrating the problem of interest are also shown. (author)

  16. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, Jeremy [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Pesaran, Ahmad [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim, and DRIVE. Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  17. Tools for the analysis and characterization of therapeutic protein species

    Directory of Open Access Journals (Sweden)

    Fuh MM

    2016-05-01

    Full Text Available Marceline Manka Fuh, Pascal Steffen, Hartmut Schlüter Mass Spectrometric Proteomics, Institute for Clinical Chemistry and Laboratory Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany Abstract: A continuously increasing number of therapeutic proteins are being released into the market, including biosimilars. In contrast to small organic drugs, therapeutic proteins require an extensive analysis of their exact chemical composition because of their complexity, as well as proof of the absence of contaminants such as host cell proteins and nucleic acids. Especially challenging is the detection of low-abundance species of therapeutic proteins, because these species are usually very similar to the target therapeutic protein. However, the detection of these species is very important for patient safety, because a very small change in the exact chemical composition may cause serious side effects. In this review, we give a brief overview of the most important analytical approaches for characterizing therapeutic protein species and their contaminants, and focus on the progress in this field during the past 3 years. Top-down mass spectrometry of intact therapeutic proteins may in the future solve many of the current problems in their analysis. Keywords: therapeutic protein species, biosimilars, liquid chromatography, mass spectrometry, capillary electrophoresis

  18. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  19. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market have to offer management, based on the growing need to understand and predict global market trends.

  20. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K

  1. Extension of explicit formulas in Poissonian white noise analysis using harmonic analysis on configuration spaces

    Directory of Open Access Journals (Sweden)

    Yu.G.Kondratiev

    2008-02-01

    Full Text Available Harmonic analysis on configuration spaces is used in order to extend explicit expressions for the images of creation, annihilation, and second quantization operators in L2-spaces with respect to Poisson point processes to a set of functions larger than the space obtained by directly using chaos expansion. This permits, in particular, the derivation of an explicit expression for the generator of the second quantization of a sub-Markovian contraction semigroup on a set of functions which forms a core of the generator.

  2. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge of the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors emphasize that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
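
    To make the integration idea concrete, the following sketch illustrates the basic plane-motion step on synthetic data: given particle-velocity histories u(h, t) at several Lagrangian gauge positions h, the one-dimensional momentum balance rho0*(du/dt) = -(dsigma/dh) (compression taken positive) is integrated across the gauge array to recover the stress increment between gauges. The gauge layout, material constant and sign convention are assumptions for illustration, not values taken from the book:

      # Sketch of the basic plane-motion Lagrangian-analysis step (after
      # Fowles & Williams); all values here are synthetic and illustrative.
      import numpy as np

      rho0 = 2700.0                       # initial density, kg/m^3 (assumed)
      h = np.array([0.0, 5e-3, 10e-3])    # Lagrangian gauge positions, m
      t = np.linspace(0.0, 4e-6, 400)     # time, s

      # Synthetic velocity ramp arriving later at deeper gauges (5 km/s wave)
      u = np.array([np.clip((t - hi / 5000.0) / 1e-6, 0.0, 1.0) * 300.0
                    for hi in h])

      dudt = np.gradient(u, t, axis=1)    # du/dt history at each gauge
      # Momentum balance integrated over h (trapezoidal rule) gives the
      # stress difference between the outermost gauges as a function of time:
      # sigma(h_last) - sigma(h_first) = -rho0 * integral of (du/dt) dh
      dsigma = -rho0 * np.trapz(dudt, x=h, axis=0)   # Pa, vs time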

  3. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 90's to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1] but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows theoretical formulas to be derived for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. The weights given to the spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. Estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
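
    For the classical four-spacecraft case, the barycentric technique reduces to a few lines of linear algebra. The sketch below implements the standard reciprocal-vector gradient estimate on synthetic positions and a linear test field; it is an illustration of the published four-point formulas, not code from any mission toolset:

      # Four-spacecraft barycentric gradient estimate via reciprocal vectors:
      # k_a = (r_c - r_b) x (r_d - r_b) / [(r_a - r_b) . (r_c - r_b) x (r_d - r_b)]
      # grad(f) ~ sum_a k_a * f_a  (exact for a linear field)
      import numpy as np

      def reciprocal_vectors(r):
          """r: (4, 3) array of spacecraft positions; returns (4, 3) array."""
          k = np.empty_like(r, dtype=float)
          for a in range(4):
              b, c, d = [(a + i) % 4 for i in (1, 2, 3)]
              cross = np.cross(r[c] - r[b], r[d] - r[b])
              k[a] = cross / np.dot(r[a] - r[b], cross)
          return k

      def gradient_estimate(r, f):
          """Linear gradient estimate from scalar samples f at 4 positions."""
          return (reciprocal_vectors(r) * f[:, None]).sum(axis=0)

      # Check against a known linear field f(x) = g . x
      rng = np.random.default_rng(0)
      positions = rng.normal(size=(4, 3))      # a non-degenerate tetrahedron
      g = np.array([1.0, -2.0, 0.5])
      print(gradient_estimate(positions, positions @ g))  # ~ [1.0, -2.0, 0.5]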

  4. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    OpenAIRE

    Drechsel Marion; Wilkening Stefan; Chen Bowang; Hemminki Kari

    2009-01-01

    Abstract Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools p...

  5. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA

  6. Abstract Interfaces for Data Analysis —Component Architecture for Data Analysis Tools

    Institute of Scientific and Technical Information of China (English)

    G.Barrand; P.Binko; 等

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  7. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
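
    The toolkit's internal algorithms are not detailed above, but one standard ingredient of nonlinear distortion analysis is easy to illustrate independently: drive the device with a sine and read the harmonic levels off the output spectrum. A sketch in Python (the toolkit itself is written in Matlab), with a tanh stand-in for a tube-style clipper:

      # Measure total harmonic distortion (THD) of a nonlinearity by sine
      # excitation and FFT; this illustrates a standard technique, not the
      # toolkit's actual algorithm.
      import numpy as np

      def thd(signal, sr, f0, n_harmonics=10):
          spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
          freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
          def level(f):                     # peak magnitude near frequency f
              return spectrum[np.argmin(np.abs(freqs - f))]
          fundamental = level(f0)
          harmonics = [level(k * f0) for k in range(2, n_harmonics + 1)]
          return np.sqrt(np.sum(np.square(harmonics))) / fundamental

      sr, f0 = 48000, 1000.0
      t = np.arange(sr) / sr                # one second of signal
      clean = np.sin(2 * np.pi * f0 * t)
      distorted = np.tanh(3.0 * clean)      # stand-in for a tube-style clipper
      print(f"THD = {100 * thd(distorted, sr, f0):.1f} %")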

  8. Thermal buckling comparative analysis using Different FE (Finite Element) tools

    Energy Technology Data Exchange (ETDEWEB)

    Banasiak, Waldemar; Labouriau, Pedro [INTECSEA do Brasil, Rio de Janeiro, RJ (Brazil); Burnett, Christopher [INTECSEA UK, Surrey (United Kingdom); Falepin, Hendrik [Fugro Engineers SA/NV, Brussels (Belgium)

    2009-12-19

    High operational temperature and pressure in offshore pipelines may lead to unexpected lateral movements, sometimes called lateral buckling, which can have serious consequences for the integrity of the pipeline. The phenomenon of lateral buckling in offshore pipelines needs to be analysed in the design phase using FEM. The analysis should take into account many parameters, including operational temperature and pressure, fluid characteristics, seabed profile, soil parameters, coatings of the pipe, free spans, etc. The buckling initiation force is sensitive to small changes in any initial geometric out-of-straightness, so the modelling of the as-laid state of the pipeline is an important part of the design process. Recently, some dedicated finite element programs have been created, making modelling of the offshore environment more convenient than has been the case with general-purpose finite element software. The present paper aims to compare thermal buckling analyses of a subsea pipeline performed using different finite element tools, i.e. general-purpose programs (ANSYS, ABAQUS) and dedicated software (SAGE Profile 3D), for a single pipeline resting on the seabed. The analyses considered the pipeline resting on a flat seabed with small levels of out-of-straightness initiating the lateral buckling. The results show quite good agreement for buckling in the elastic range; in the conclusions, further comparative analyses with sensitivity cases are recommended. (author)

  9. Spectral Analysis Tool 6.2 for Windows

    Science.gov (United States)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.

  10. Tool for Sizing Analysis of the Advanced Life Support System

    Science.gov (United States)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems whether they are dynamic or steady-state in nature. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.

  11. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  12. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.;

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models, the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material is...

  13. A non extensive statistical physics analysis of the Hellenic subduction zone seismicity

    Science.gov (United States)

    Vallianatos, F.; Papadakis, G.; Michas, G.; Sammonds, P.

    2012-04-01

    The Hellenic subduction zone is the most seismically active region in Europe [Becker & Meier, 2010]. The spatial and temporal distribution of seismicity, as well as the magnitude distribution of earthquakes in the Hellenic subduction zone, has been studied using the concept of Non-Extensive Statistical Physics (NESP) [Tsallis, 1988; Tsallis, 2009]. Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems a suitable framework for studying complex systems (Vallianatos, 2011). Using this concept, Abe & Suzuki (2003; 2005) investigated the spatial and temporal properties of the seismicity in California and Japan, and recently Darooneh & Dadashinia (2008) did so in Iran. Furthermore, Telesca (2011) calculated the thermodynamic parameter q of the magnitude distribution of earthquakes in the southern California earthquake catalogue. Using the external seismic zones of 36 seismic sources of shallow earthquakes in the Aegean and the surrounding area [Papazachos, 1990], we formed a dataset concerning the seismicity of shallow earthquakes (focal depth ≤ 60 km) of the subduction zone, based on the instrumental data of the Geodynamic Institute of the National Observatory of Athens (http://www.gein.noa.gr/, period 1990-2011). The catalogue consists of 12800 seismic events which correspond to 15 polygons of the aforementioned external seismic zones. These polygons define the subduction zone, as they are associated with the compressional stress field which characterizes a subducting regime. For each event, the moment magnitude was calculated from ML according to the suggestions of Papazachos et al. (1997). The cumulative distribution functions of the inter-event times and the inter-event distances, as well as the magnitude distribution for each seismic zone, have been estimated, presenting a variation of the q-triplet along the Hellenic subduction zone. The models used fit rather well to the observed
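
    The central fitting step of such NESP analyses can be sketched as follows: the survival function of inter-event times is modelled by a Tsallis q-exponential, P(>t) = [1 - (1-q)t/t0]^(1/(1-q)), and q and t0 are estimated by least squares. The data below are a synthetic stand-in for a real catalogue:

      # Fit a Tsallis q-exponential survival function to inter-event times;
      # synthetic data only, illustrating the form of the NESP analysis.
      import numpy as np
      from scipy.optimize import curve_fit

      def q_exp_survival(t, q, t0):
          base = 1.0 - (1.0 - q) * t / t0
          return np.clip(base, 0.0, None) ** (1.0 / (1.0 - q))

      rng = np.random.default_rng(1)
      times = rng.pareto(2.5, size=5000) * 10.0   # stand-in inter-event times

      t_sorted = np.sort(times)
      survival = 1.0 - np.arange(1, len(t_sorted) + 1) / len(t_sorted)

      (q_fit, t0_fit), _ = curve_fit(q_exp_survival, t_sorted, survival,
                                     p0=(1.3, 10.0), maxfev=10000)
      print(f"q = {q_fit:.3f}, t0 = {t0_fit:.2f}")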

  14. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    Science.gov (United States)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great-circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, or picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
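
    SeismicCanvas itself is a compiled C++/Qt application; as a point of comparison, the basic workflow it describes (FDSN web-service download, removal of means and trends, filtering) can also be scripted, for example with ObsPy, assuming network access to the IRIS datacenter:

      # Scriptable analogue of the described workflow using ObsPy; this is
      # not SeismicCanvas code, only the same processing steps in Python.
      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      client = Client("IRIS")
      t0 = UTCDateTime("2015-01-01T00:00:00")
      st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)

      st.detrend("demean")            # remove means
      st.detrend("linear")            # remove trends
      st.filter("bandpass", freqmin=0.01, freqmax=1.0)
      st.plot()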

  15. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students were those who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  16. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring about many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  17. Analysis tool for predicting the transient hydrodynamics resulting from the rapid filling of voided piping systems

    International Nuclear Information System (INIS)

    An analysis tool is constructed for the purpose of predicting the transient hydrodynamics resulting from the rapid filling of voided piping systems. The basic requirements of such an analysis tool are established, and documentation is presented for several fluid transient computer codes which were considered for the tool. The code modifications necessary to meet the analysis tool requirements are described. To test the computational capability of the analysis tool a verification problem is considered and the results reported. These results serve only to demonstrate the applicability of the analysis tool to this type of problem; the code has not been validated by comparison with experiment. Documentation is provided for a brief sensitivity study involving the sample problem. Additional analysis tool information, as well as detailed sample problem results are included in the form of appendices

  18. Easily extensible unix software for spectral analysis, display, modification, and synthesis of musical sounds

    Science.gov (United States)

    Beauchamp, James W.

    2002-11-01

    Software has been developed which enables users to perform time-varying spectral analysis of individual musical tones, or successions of them, and to perform further processing of the data. The package, called sndan, is freely available in source code, uses EPS graphics for display, and is written in ANSI C for ease of code modification and extension. Two analyzers, a fixed-filter-bank phase vocoder (''pvan'') and a frequency-tracking analyzer (''mqan''), constitute the analysis front end of the package. While pvan's output consists of continuous amplitudes and frequencies of harmonics, mqan produces disjoint ''tracks.'' However, another program extracts a fundamental frequency and separates harmonics from the tracks, resulting in a continuous harmonic output. ''monan'' is a program used to display harmonic data in a variety of formats, perform various spectral modifications, and perform additive resynthesis of the harmonic partials, including possible pitch-shifting and time-scaling. Sounds can also be synthesized according to a musical score using a companion synthesis language, Music 4C. Several other programs in the sndan suite can be used for specialized tasks, such as signal display and editing. Applications of the software include producing specialized sounds for music compositions or psychoacoustic experiments, or serving as a basis for developing new synthesis algorithms.
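
    The resynthesis side of such a package can be sketched compactly: given time-varying harmonic amplitude and frequency envelopes of the kind a phase vocoder produces, each partial's phase is the running integral of its instantaneous frequency. The array layout below is an assumption for illustration, not sndan's actual file format:

      # Minimal additive-resynthesis sketch from harmonic amplitude and
      # frequency envelopes; shapes and values are illustrative only.
      import numpy as np

      def additive_resynth(amps, freqs, sr):
          """amps, freqs: (n_harmonics, n_samples) envelopes at rate sr."""
          # Phase is the running integral of instantaneous frequency.
          phases = 2.0 * np.pi * np.cumsum(freqs, axis=1) / sr
          return (amps * np.sin(phases)).sum(axis=0)

      sr, dur, f0 = 44100, 1.0, 220.0
      n = int(sr * dur)
      t = np.arange(n) / sr
      harmonics = np.arange(1, 9)[:, None]            # 8 harmonic numbers
      freqs = f0 * harmonics * np.ones(n)             # static pitch
      amps = (1.0 / harmonics) * np.exp(-3.0 * t)     # 1/k rolloff, decaying
      signal = additive_resynth(amps, freqs, sr)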

  19. Extending the XNAT archive tool for image and analysis management in ophthalmology research

    Science.gov (United States)

    Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.

    2013-03-01

    In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM is not yet widely used in this field, though, and images are frequently encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types not considered thus far and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With the appropriate project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
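
    As an illustration of the client side, a derived result can be pushed to an XNAT resource over REST roughly as follows; the server URL, credentials and identifiers are placeholders, and the exact path layout should be verified against the deployed XNAT instance rather than taken from this sketch:

      # Schematic upload of a derived analysis file to an XNAT resource,
      # assuming the standard XNAT REST layout; every identifier below is a
      # placeholder, not a real project or subject.
      import requests

      base = "https://xnat.example.org"
      session = requests.Session()
      session.auth = ("username", "password")   # placeholder credentials

      path = (f"{base}/data/projects/OCT_STUDY/subjects/SUBJ01"
              f"/experiments/SCAN01/resources/SEGMENTATIONS/files/layers.csv")

      with open("layers.csv", "rb") as fh:
          resp = session.put(path, params={"inbody": "true"}, data=fh)
      resp.raise_for_status()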

  20. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, geochemistry and so on. However, considering the advantages of NAA, samples available only in small amounts, or precious samples, are the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to obtain a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, while carbonaceous materials and heavy metals are concentrated in PM in urban areas, because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air change with season. For example, crustal elements increase in the dry season, and sea-salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM, and increase in concentration during the winter season, when emission from heating systems is high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of some environmental samples, and source apportionment techniques are useful. (author)

  1. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool,...

  2. Whole-Genome Sequencing Analysis of Serially Isolated Multi-Drug and Extensively Drug Resistant Mycobacterium tuberculosis from Thai Patients.

    Science.gov (United States)

    Faksri, Kiatichai; Tan, Jun Hao; Disratthakit, Areeya; Xia, Eryu; Prammananan, Therdsak; Suriyaphol, Prapat; Khor, Chiea Chuen; Teo, Yik-Ying; Ong, Rick Twee-Hee; Chaiprasert, Angkana

    2016-01-01

    Multi-drug and extensively drug-resistant tuberculosis (MDR and XDR-TB) are problems that threaten public health worldwide. Only some genetic markers associated with drug-resistant TB are known. Whole-genome sequencing (WGS) is a promising tool for distinguishing between re-infection and persistent infection in isolates taken at different times from a single patient, but has not yet been applied in MDR and XDR-TB. We aim to detect genetic markers associated with drug resistance and distinguish between reinfection and persistent infection from MDR and XDR-TB patients based on WGS analysis. Samples of Mycobacterium tuberculosis (n = 7), serially isolated from 2 MDR cases and 1 XDR-TB case, were retrieved from Siriraj Hospital, Bangkok. The WGS analysis used an Illumina MiSeq sequencer. In cases of persistent infection, MDR-TB isolates differed at an average of 2 SNPs across the span of 2-9 months, whereas in the case of reinfection, isolates differed at 61 SNPs across 2 years. Known genetic markers associated with resistance were detected from strains susceptible to streptomycin (2/7 isolates), p-aminosalicylic acid (3/7 isolates) and fluoroquinolone drugs. Among fluoroquinolone drugs, ofloxacin had the highest phenotype-genotype concordance (6/7 isolates), whereas gatifloxacin had the lowest (3/7 isolates). A putative candidate SNP in Rv2477c associated with kanamycin and amikacin resistance was suggested for further validation. WGS provided comprehensive results regarding molecular epidemiology, distinguishing between persistent infection and reinfection in M/XDR-TB, and can potentially be used for detection of novel mutations associated with drug resistance. PMID:27518818
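
    The persistence-versus-reinfection comparison rests on a simple pairwise SNP distance between serial isolates. A sketch of that computation, with an invented variant representation (genome position mapped to called allele against a shared reference) and an illustrative cutoff, not the study's pipeline:

      # Pairwise SNP distance between two isolates' variant calls; the data
      # and the 12-SNP cutoff are illustrative, not from the study.
      def snp_distance(variants_a, variants_b):
          """Each argument: dict mapping genome position to called allele."""
          positions = set(variants_a) | set(variants_b)
          return sum(variants_a.get(p) != variants_b.get(p) for p in positions)

      isolate_t0 = {1977: "G", 4247431: "T", 761155: "C"}
      isolate_t9 = {1977: "G", 4247431: "T", 761155: "C",
                    2155168: "G", 7582: "A"}
      d = snp_distance(isolate_t0, isolate_t9)
      print(d, "-> persistent infection" if d < 12 else "-> likely reinfection")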

  3. TopCAT-Topographical Compartment Analysis Tool to analyze seacliff and beach change in GIS

    Science.gov (United States)

    Olsen, Michael J.; Young, Adam P.; Ashford, Scott A.

    2012-08-01

    This paper discusses the development of a new GIS extension named the Topographic Compartment Analysis Tool (TopCAT), which compares sequential digital elevation models (DEMs) and provides a quantitative and statistical analysis of alongshore topographical change. TopCAT was specifically designed for the morphological analysis of seacliffs and beaches but may be applied to other elongated features which experience topographical change, such as stream beds, river banks, coastal dunes, etc. To demonstrate the capabilities of TopCAT, two case studies are presented herein. The first case examines coastal cliff retreat for a 500 m section in Del Mar, California and shows that large failures comprised a large portion of the total eroded volume and that the average retreat rate does not provide a good estimate of local maximum cliff retreat. The second case investigates the alongshore volumetric beach sand change caused by Hurricane Bonnie (1998) for an 85 km section in the Cape Hatteras National Seashore, North Carolina. The results compare well (generally within 6%) with previous investigations. These case studies highlight the additional information gained through performing a detailed, discretized analysis using TopCAT.
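
    The core computation behind this kind of tool can be sketched with NumPy: difference two gridded DEMs and sum the volumetric change per alongshore compartment. TopCAT itself is a GIS extension; the following is only an illustration with synthetic grids:

      # Difference two DEMs and report net volume change per alongshore
      # compartment; synthetic data, illustrating the idea rather than TopCAT.
      import numpy as np

      def compartment_volumes(dem_t1, dem_t2, cell_size, n_compartments):
          """DEMs: 2-D arrays (rows run alongshore). Returns the net volume
          change (elevation units * cell_size**2) per compartment."""
          dz = dem_t2 - dem_t1
          blocks = np.array_split(dz, n_compartments, axis=0)
          return np.array([b.sum() * cell_size**2 for b in blocks])

      rng = np.random.default_rng(2)
      dem_a = rng.normal(10.0, 0.5, size=(100, 40))
      dem_b = dem_a + rng.normal(-0.05, 0.1, size=(100, 40))  # mild erosion
      print(compartment_volumes(dem_a, dem_b, cell_size=1.0, n_compartments=10))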

  4. Geomorphometric analysis of fine-scale morphology for extensive areas: a new surface-texture operator

    Science.gov (United States)

    Trevisani, Sebastiano; Rocca, Michele

    2014-05-01

    The application of geomorphometric analysis to high resolution digital terrain models (HRDTM) amplifies our capability to characterize and interpret fine-scale solid earth surface morphology. In this context it is possible to analyze fine-scale morphology in terms of surface texture (e.g. Trevisani, 2012; Lucieer, 2005) and retrieve information linked to different geomorphic processes and factors; this kind of analysis has an interesting potential to be exploited in the context of quantitative geomorphologic/geologic interpretation and geo-engineering. We developed a multiscale texture operator capable of synthesizing the main characteristics of local surface texture in an efficient way. The proposed operator can be viewed as a hybrid between classical geostatistical spatial continuity indexes (e.g. the variogram; Atkinson, 2000) and the well-known operator based on (rotation invariant) local binary patterns (Ojala, 2002). An important characteristic of the operator is that it derives information on surface texture in an easily interpretable form, so as to facilitate its use by experts in the interpretation of geomorphic processes and factors. Moreover, this surface texture operator could be used for the derivation of more complex and ad-hoc surface texture indexes. We present the application of the operator in the analysis of different HRDTMs, mainly in the context of an alpine environment. A particularly interesting example is the application of the surface texture analysis in an extensive area (hundreds of km2), including urbanized zones, and the evaluation of potential links between surface texture and lithological and geo-structural factors. References Atkinson, P.M. & Lewis, P. 2000, "Geostatistical classification for remote sensing: An introduction", Computers and Geosciences, vol. 26, no. 4, pp. 361-371. Lucieer, A., Stein, A., 2005. Texture-based landform segmentation of LiDAR imagery. International Journal of Applied Earth Observation and Geoinformation 6, 261
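
    A minimal ingredient of such texture operators, the experimental variogram of a DEM patch at integer lags along one axis, can be sketched as follows; this is only the classical geostatistical index the proposed operator builds on, not the operator itself:

      # Experimental variogram of a DEM patch at row lags 1..max_lag; a
      # sketch of the classical spatial-continuity index, on synthetic data.
      import numpy as np

      def variogram_1d(dem, max_lag):
          """Mean semivariance of elevation differences at each row lag."""
          return np.array([0.5 * np.mean((dem[h:, :] - dem[:-h, :]) ** 2)
                           for h in range(1, max_lag + 1)])

      rng = np.random.default_rng(3)
      patch = np.cumsum(rng.normal(size=(64, 64)), axis=0)  # correlated surface
      print(variogram_1d(patch, max_lag=5))                 # rises with lag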

  5. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  6. A measuring tool for tree-rings analysis

    Science.gov (United States)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gain, the possibility of archiving the results of the measurements at any stage of the processing, and operator comfort. New software has been developed, allowing processing of samples of different types (cores, saw cuts), including those that are difficult to process because of complex wood structure (inhomogeneous growth in different directions; missing, light and false rings, etc.). This software can analyze pictures made with optical scanners, analog or digital cameras. The complex software program was created in the C++ programming language and is compatible with modern operating systems like Windows X. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring widths as a function of year is displayed on screen during the analysis, and it can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The complex is universal in application, which will allow its use for solving various problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed on the basis of samples collected on the Kola Peninsula (northwestern Russia).
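
    The core measurement reduces to a unit conversion: ring-boundary positions picked along a path, in pixels, become ring widths in millimetres via the scanner resolution. A sketch with assumed values, not the authors' code:

      # Convert picked ring-boundary positions (pixels along a measurement
      # path) into ring widths in mm; boundary positions and DPI are assumed.
      import numpy as np

      def ring_widths_mm(boundary_px, dpi):
          """boundary_px: 1-D array of boundary positions along the path."""
          return np.diff(np.sort(boundary_px)) * 25.4 / dpi

      boundaries = np.array([102, 167, 225, 298, 344, 421])  # picked on screen
      print(ring_widths_mm(boundaries, dpi=1200))            # widths in mm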

  7. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, studies that portray this feature spatially are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct the actions in case of an emergency. In the framework of a Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina di Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is built around a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  8. A Current Review of the Meniscus Imaging: Proposition of a Useful Tool for Its Radiologic Analysis

    International Nuclear Information System (INIS)

    The main objective of this review was to present a synthesis of the current literature in order to provide a useful tool to clinicians for the radiologic analysis of the meniscus. All anatomical descriptions were clearly illustrated by MRI, arthroscopy, and/or drawings. The value of standard radiography is extremely limited for the assessment of meniscal injuries but may be indicated to obtain a differential diagnosis such as osteoarthritis. Ultrasound is rarely used as a diagnostic tool for meniscal pathologies and its accuracy is operator-dependent. CT arthrography with multiplanar reconstructions can detect meniscus tears that are not visible on MRI. This technique is also useful in case of MRI contraindications, in postoperative assessment of meniscal sutures, and for assessing the condition of the cartilage covering the articular surfaces. MRI is the most accurate and least invasive method for diagnosing meniscal lesions. MRI allows the meniscal lesion to be confirmed and characterized (type, extension, association with a cyst, meniscal extrusion) and the cartilage and subchondral bone to be assessed. New 3D MRI with isotropic resolution allows the creation of multiplanar reformatted images, so that an acquisition in one sectional plane yields reconstructions in other spatial planes. 3D MRI should further improve the diagnosis of meniscal tears

  9. A Current Review of the Meniscus Imaging: Proposition of a Useful Tool for Its Radiologic Analysis

    Directory of Open Access Journals (Sweden)

    Nicolas Lefevre

    2016-01-01

    Full Text Available The main objective of this review was to present a synthesis of the current literature in order to provide a useful tool to clinicians for the radiologic analysis of the meniscus. All anatomical descriptions were clearly illustrated by MRI, arthroscopy, and/or drawings. The value of standard radiography is extremely limited for the assessment of meniscal injuries but may be indicated to obtain a differential diagnosis such as osteoarthritis. Ultrasound is rarely used as a diagnostic tool for meniscal pathologies and its accuracy is operator-dependent. CT arthrography with multiplanar reconstructions can detect meniscus tears that are not visible on MRI. This technique is also useful in case of MRI contraindications, in postoperative assessment of meniscal sutures, and for assessing the condition of the cartilage covering the articular surfaces. MRI is the most accurate and least invasive method for diagnosing meniscal lesions. MRI allows the meniscal lesion to be confirmed and characterized (type, extension, association with a cyst, meniscal extrusion) and the cartilage and subchondral bone to be assessed. New 3D MRI with isotropic resolution allows the creation of multiplanar reformatted images, so that an acquisition in one sectional plane yields reconstructions in other spatial planes. 3D MRI should further improve the diagnosis of meniscal tears.

  10. A Current Review of the Meniscus Imaging: Proposition of a Useful Tool for Its Radiologic Analysis.

    Science.gov (United States)

    Lefevre, Nicolas; Naouri, Jean Francois; Herman, Serge; Gerometta, Antoine; Klouche, Shahnaz; Bohu, Yoann

    2016-01-01

    The main objective of this review was to present a synthesis of the current literature in order to provide a useful tool to clinicians for the radiologic analysis of the meniscus. All anatomical descriptions were clearly illustrated by MRI, arthroscopy, and/or drawings. The value of standard radiography is extremely limited for the assessment of meniscal injuries but may be indicated to obtain a differential diagnosis such as osteoarthritis. Ultrasound is rarely used as a diagnostic tool for meniscal pathologies and its accuracy is operator-dependent. CT arthrography with multiplanar reconstructions can detect meniscus tears that are not visible on MRI. This technique is also useful in case of MRI contraindications, in postoperative assessment of meniscal sutures, and for assessing the condition of the cartilage covering the articular surfaces. MRI is the most accurate and least invasive method for diagnosing meniscal lesions. MRI allows the meniscal lesion to be confirmed and characterized (type, extension, association with a cyst, meniscal extrusion) and the cartilage and subchondral bone to be assessed. New 3D MRI with isotropic resolution allows the creation of multiplanar reformatted images, so that an acquisition in one sectional plane yields reconstructions in other spatial planes. 3D MRI should further improve the diagnosis of meniscal tears. PMID:27057352

  11. Extensive differences in antifungal immune response in two Drosophila species revealed by comparative transcriptome analysis.

    Science.gov (United States)

    Seto, Yosuke; Tamura, Koichiro

    2013-01-01

    The innate immune system of Drosophila is activated by the ingestion of microorganisms. D. melanogaster breeds on fruits fermented by Saccharomyces cerevisiae, whereas D. virilis breeds on slime flux and the decaying bark of trees housing a variety of bacteria, yeasts, and molds. In this study, it is shown that D. virilis has a higher resistance than D. melanogaster to oral infection by a species of filamentous fungus belonging to the genus Penicillium. In response to the fungal infection, the transcriptome profile of immune-related genes was considerably different between D. melanogaster and D. virilis: the genes encoding the antifungal peptides Drosomycin and Metchnikowin were highly expressed in D. melanogaster, whereas the genes encoding Diptericin and Defensin were highly expressed in D. virilis. On the other hand, the immune-induced molecule (IM) genes showed contrary expression patterns between the two species: they were induced by the fungal infection in D. melanogaster but tended to be suppressed in D. virilis. Our transcriptome analysis also revealed newly predicted immune-related genes in D. virilis. These results suggest that the innate immune system has been extensively differentiated during the evolution of these Drosophila species. PMID:24151578

  12. Extensive Differences in Antifungal Immune Response in Two Drosophila Species Revealed by Comparative Transcriptome Analysis

    Directory of Open Access Journals (Sweden)

    Yosuke Seto

    2013-01-01

    Full Text Available The innate immune system of Drosophila is activated by the ingestion of microorganisms. D. melanogaster breeds on fruits fermented by Saccharomyces cerevisiae, whereas D. virilis breeds on slime flux and the decaying bark of trees housing a variety of bacteria, yeasts, and molds. In this study, it is shown that D. virilis has a higher resistance than D. melanogaster to oral infection by a species of filamentous fungus belonging to the genus Penicillium. In response to the fungal infection, the transcriptome profile of immune-related genes was considerably different between D. melanogaster and D. virilis: the genes encoding the antifungal peptides Drosomycin and Metchnikowin were highly expressed in D. melanogaster, whereas the genes encoding Diptericin and Defensin were highly expressed in D. virilis. On the other hand, the immune-induced molecule (IM) genes showed contrary expression patterns between the two species: they were induced by the fungal infection in D. melanogaster but tended to be suppressed in D. virilis. Our transcriptome analysis also revealed newly predicted immune-related genes in D. virilis. These results suggest that the innate immune system has been extensively differentiated during the evolution of these Drosophila species.

  13. Project Final Report: Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James

    2014-01-06

    In this project we created a community tool infrastructure for program development tools targeting Petascale-class machines and beyond. This includes tools for performance analysis, debugging, and correctness, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool building components. We started with the basic elements necessary across all tools in such an infrastructure, followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of the community tool infrastructure provides an enhanced version of Open|SpeedShop, which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhanced the capabilities and reusability of project partner components as specified in the original project proposal. The overall project team’s work over the project funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contributed to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  14. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  15. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  16. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    Science.gov (United States)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  17. High Frequency Acoustic Response Characterization and Analysis of the Deep Throttling Common Extensible Cryogenic Engine

    Science.gov (United States)

    Casiano, M. J.

    2011-01-01

    The Common Extensible Cryogenic Engine (CECE) program demonstrated the operation of a deep throttling engine design. The program, spanning five years from August 2005 to July 2010, funded testing through four separate engine demonstration test series. Along with the successful completion of multiple objectives, a discrete response at approximately 4,000 Hz was discovered and explored throughout the program. The typically low-amplitude acoustic response was evident in the chamber measurement through almost every operating condition; however, at certain off-nominal operating conditions, the response became discrete with higher amplitude. This paper summarizes the data reduction, characterization, and analysis of the 4,000 Hz response for the entire program duration, using the large amount of data collected. Upon first encountering the response, new objectives and instrumentation were incorporated in subsequent test series to specifically collect 4,000 Hz data. The 4,000 Hz response was identified as being related to the first tangential acoustic mode by means of frequency estimation and spatial decomposition. The latter approach showed that the effective node line of the mode was aligned with the manifold propellant inlets, with standing waves and quasi-standing waves present at various times. Contour maps that contain instantaneous frequency and amplitude trackings of the response were generated as a significant improvement over historical manual approaches to data reduction presentation. Signal analysis and dynamic data reduction also uncovered several other features of the response, including a stable limit cycle, the progressive engagement of subsequent harmonics, the U-shaped time history, an intermittent response near the test-based neutral stability region, other acoustic modes, and indications of modulation with a separate subsynchronous response. Although no engine damage related to the acoustic mode was noted, the peak-to-peak fluctuating pressure amplitude achieved 12.1% of the
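
    The instantaneous frequency and amplitude trackings mentioned above are commonly obtained from the analytic signal of a band-passed measurement. The sketch below illustrates that general technique on a synthetic chamber-pressure trace; it is an assumption-laden stand-in, not the program's actual data-reduction chain, and the sample rate and band edges are invented.

        # Sketch of instantaneous amplitude/frequency tracking of a narrowband
        # response via the analytic signal (Hilbert transform).
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        fs = 25600.0                        # assumed sample rate, Hz
        t = np.arange(0, 2.0, 1.0 / fs)
        # Synthetic chamber-pressure signal: a drifting ~4 kHz tone plus noise.
        p = np.sin(2 * np.pi * (4000 + 30 * t) * t) + 0.3 * np.random.randn(t.size)

        # Band-pass around the mode of interest (3.8-4.2 kHz).
        b, a = butter(4, [3800 / (fs / 2), 4200 / (fs / 2)], btype="band")
        p_band = filtfilt(b, a, p)

        # Analytic signal -> envelope (amplitude) and instantaneous frequency.
        analytic = hilbert(p_band)
        amplitude = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.gradient(phase, 1.0 / fs) / (2 * np.pi)

        print(f"mean tracked frequency: {inst_freq[1000:-1000].mean():.0f} Hz")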

  18. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning; Mullins, Michael

    2006-01-01

    possible approaches for working with digital tectonics by means of acoustics: The architects, the architect-engineer or hybrid practitioner and finally a prototype for a possible digital tectonic tool. For the third approach in the case study a prototype digital tectonic tool is tested on the design...

  19. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this wor...

  20. SawjaCard: A Static Analysis Tool for Certifying Java Card Applications

    OpenAIRE

    Besson, Frédéric; Jensen, Thomas; Vittet, Pierre

    2014-01-01

    This paper describes the design and implementation of a static analysis tool for certifying Java Card applications, according to security rules defined by the smart card industry. Java Card is a dialect of Java designed for programming multi-application smart cards and the tool, called SawjaCard, has been specialised for the particular Java Card programming patterns. The tool is built around a static analysis engine which uses a combination of numeric and heap analysis. It includes a model of...

  1. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology, but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
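
    The PEST calibration step can be illustrated with a toy least-squares analogue: estimate layer hydraulic conductivities by minimizing squared differences between simulated and measured flows, with a regularization term limiting conductivity variance. The forward model below is a deliberate simplification (flow apportioned by transmissivity fraction); AnalyzeHOLE itself drives a full axisymmetric MODFLOW model.

        # Toy analogue of the PEST calibration step described above.
        import numpy as np
        from scipy.optimize import least_squares

        thickness = np.array([5.0, 10.0, 3.0, 7.0])      # layer thicknesses, m
        k_true = np.array([1.0, 0.05, 20.0, 0.5])        # "true" K, m/d

        def simulated_flows(log_k, total_pumpage=100.0):
            k = 10.0 ** log_k
            t = k * thickness                            # layer transmissivities
            return total_pumpage * t / t.sum()           # flow per layer

        measured = simulated_flows(np.log10(k_true))
        measured += np.random.normal(0.0, 0.5, measured.size)  # noisy data

        def residuals(log_k, reg_weight=0.1):
            misfit = simulated_flows(log_k) - measured
            # Tikhonov-style smoothing limits K variance within the profile,
            # loosely mimicking PEST's regularization.
            smooth = reg_weight * np.diff(log_k)
            return np.concatenate([misfit, smooth])

        fit = least_squares(residuals, x0=np.zeros(4))
        print("estimated K (m/d):", np.round(10.0 ** fit.x, 3))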

  2. Temporal Correlations In Natural Time Analysis and Tsallis Non Extensive Statistical Mechanics

    Science.gov (United States)

    Sarlis, N. V.; Varotsos, P.; Skordas, E. S.

    2015-12-01

    Upon analyzing the seismic catalog in a new time domain termed natural time [1-3] and employing a sliding natural time window comprising a number of events that would occur in a few months, we find that the fluctuations β of the order parameter of seismicity [4] show a minimum βmin a few months before major earthquakes (EQs) [5,6]. Such a minimum appears simultaneously [7] with the initiation of Seismic Electric Signals activity [8], this being the first time that two geophysical observables of different nature exhibit simultaneous anomalous behavior before major EQs. In addition, we show [9] that each precursory βmin is both preceded and followed by characteristic changes of the temporal correlations between EQ magnitudes, identified by the celebrated Detrended Fluctuation Analysis (DFA) of the magnitude time series. We indicate that Tsallis non-extensive statistical mechanics [10], in the frame of which kappa distributions arise [11], can capture temporal correlations between EQ magnitudes if complemented with natural time analysis [12]. References: P.A. Varotsos, N.V. Sarlis, and E.S. Skordas, Phys Rev E 66 (2002) 011902. P.A. Varotsos et al., Phys Rev E 72 (2005) 041103. P.A. Varotsos, N.V. Sarlis and E.S. Skordas, Natural Time Analysis: The New View of Time (Springer-Verlag, Berlin Heidelberg) 2011. N.V. Sarlis, E.S. Skordas and P.A. Varotsos, EPL 91 (2010) 59001. N.V. Sarlis et al., Proc Natl Acad Sci USA 110 (2013) 13734. N.V. Sarlis et al., Proc Natl Acad Sci USA 112 (2015) 986. P.A. Varotsos et al., Tectonophysics 589 (2013) 116. P. Varotsos and M. Lazaridou, Tectonophysics 188 (1991) 321. P.A. Varotsos, N.V. Sarlis, and E.S. Skordas, J Geophys Res Space Physics 119 (2014) 9192, doi:10.1002/2014JA020580. C. Tsallis, J Stat Phys 52 (1988) 479. G. Livadiotis and D.J. McComas, J Geophys Res 114 (2009) A11105, doi:10.1029/2009JA014352. N.V. Sarlis, E.S. Skordas and P.A. Varotsos, Phys Rev E 82 (2010) 021110.
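
    The Detrended Fluctuation Analysis cited above admits a compact implementation. The sketch below is a minimal first-order DFA on a surrogate magnitude series, assuming simple non-overlapping windows and a single fitting range; published analyses are more careful on both counts.

        # Minimal first-order detrended fluctuation analysis (DFA).
        import numpy as np

        def dfa(x, scales):
            """Return the fluctuation function F(s) for each window scale s."""
            y = np.cumsum(x - np.mean(x))          # integrated profile
            f = []
            for s in scales:
                n_win = len(y) // s
                rms = []
                for i in range(n_win):
                    seg = y[i * s:(i + 1) * s]
                    t = np.arange(s)
                    coef = np.polyfit(t, seg, 1)   # local linear trend
                    rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
                f.append(np.mean(rms))
            return np.asarray(f)

        rng = np.random.default_rng(0)
        mags = 4.0 + rng.standard_normal(4096) * 0.5   # surrogate magnitudes
        scales = np.array([8, 16, 32, 64, 128, 256])
        F = dfa(mags, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
        print(f"DFA exponent alpha = {alpha:.2f}  (~0.5 for uncorrelated data)")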

  3. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  4. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  5. Hybrid modular tooling design methodology, based on manufacturability analysis

    OpenAIRE

    Kerbrat, Olivier

    2009-01-01

    In order to stay competitive, manufacturers have to develop new products in a very short time and at reduced cost, whereas customers require more and more quality and flexibility. The field of tooling (dies and molds) is no exception to these constraints, and one possibility for improving competitiveness is to design and manufacture tools from modular and hybrid points of view. Instead of a one-piece tool, the tool is seen as a 3-D puzzle with modules realized separately and then assembled. With the...

  6. Online Analysis of Wind and Solar Part II: Transmission Tool

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  7. Online Analysis of Wind and Solar Part I: Ramping Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
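
    The core idea, deriving capacity and ramping requirements from forecast uncertainty, can be sketched with simple percentile statistics over Monte Carlo net-load realizations. This is a hedged illustration with invented numbers, not the statistical machinery of the operational CAISO tool.

        # Percentile-based capacity and ramping requirements from forecast errors.
        import numpy as np

        rng = np.random.default_rng(1)
        horizon = 288                                  # 5-min intervals in a day
        forecast = 25000 + 5000 * np.sin(np.linspace(0, 2 * np.pi, horizon))
        # Monte Carlo realizations of net load around the forecast (MW).
        errors = rng.normal(0, 400, size=(2000, horizon))
        net_load = forecast + errors

        # Capacity requirement: spread of plausible net load per interval.
        cap_lo, cap_hi = np.percentile(net_load, [2.5, 97.5], axis=0)

        # Ramping requirement: spread of interval-to-interval changes.
        ramps = np.diff(net_load, axis=1)
        ramp_lo, ramp_hi = np.percentile(ramps, [2.5, 97.5], axis=0)

        print(f"max upward capacity margin: {np.max(cap_hi - forecast):.0f} MW")
        print(f"max upward ramp requirement: {ramp_hi.max():.0f} MW / 5 min")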

  8. Extending a teleradiology system by tools for 3D-visualization and volumetric analysis through a plug-in mechanism.

    Science.gov (United States)

    Evers, H; Mayer, A; Engelmann, U; Schröter, A; Baur, U; Wolsiffer, K; Meinzer, H P

    1998-01-01

    This paper describes ongoing research concerning interactive volume visualization coupled with tools for volumetric analysis. To establish an easy-to-use application, the 3D visualization has been embedded in a state-of-the-art teleradiology system, where additional functionality beyond basic image transfer and management is often desired. Major clinical requirements for deriving spatial measures are covered by the tools, in order to realize extended diagnosis support and therapy planning. Introducing a general plug-in mechanism, this work exemplarily describes the useful extension of an approved application. Interactive visualization was achieved by a hybrid approach taking advantage of both the precise volume visualization based on the Heidelberg Raytracing Model and the graphics acceleration of modern workstations. Several tools for volumetric analysis extend the 3D viewing. They offer 3D pointing devices to select locations in the data volume, measure anatomical structures, or control segmentation processes. A haptic interface provides realistic perception while navigating within the 3D reconstruction. The work is closely related to research in the field of heart, liver, and head surgery. In cooperation with our medical partners, the development of tools such as those presented advances the integration of image analysis into clinical routine. PMID:10384617
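
    The plug-in mechanism is described here only at a conceptual level. A generic registry-style sketch of the idea, in Python rather than the authors' implementation language, might look as follows; all names are hypothetical.

        # Generic registry-style plug-in sketch: a viewer extended at run time
        # with volumetric-analysis tools. Not the authors' actual API.
        from typing import Callable, Dict

        class ViewerPluginRegistry:
            def __init__(self) -> None:
                self._plugins: Dict[str, Callable] = {}

            def register(self, name: str) -> Callable:
                def decorator(func: Callable) -> Callable:
                    self._plugins[name] = func
                    return func
                return decorator

            def run(self, name: str, *args, **kwargs):
                return self._plugins[name](*args, **kwargs)

        registry = ViewerPluginRegistry()

        @registry.register("volume")
        def measure_volume(mask, voxel_volume_mm3=1.0):
            """Volumetric-analysis plug-in: volume of a binary segmentation."""
            return sum(1 for v in mask if v) * voxel_volume_mm3

        print(registry.run("volume", [1, 1, 0, 1], voxel_volume_mm3=0.5), "mm^3")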

  9. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, in order to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises the data obtained from an analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modeling (estimates of the area affected by a given event), together with event data such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area to get an overview of previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a detailed analysis tool at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a

  10. Applied climate-change analysis: the climate wizard tool.

    Directory of Open Access Journals (Sweden)

    Evan H Girvetz

    Full Text Available BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0 and 20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally
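
    A quantile ensemble analysis of the kind described reduces, at its core, to computing quantiles of projected change across the model ensemble. The sketch below illustrates this for one region, with synthetic placeholder values standing in for the 16 GCMs.

        # Quantile ensemble analysis across GCM projections (synthetic data).
        import numpy as np

        rng = np.random.default_rng(42)
        n_models, n_years = 16, 30
        # Projected 2070-2099 temperature change per model (deg C), averaged
        # over the projection window.
        delta_t = rng.normal(loc=2.8, scale=0.7,
                             size=(n_models, n_years)).mean(axis=1)

        for q in (10, 50, 90):
            print(f"{q:2d}th percentile projected change: "
                  f"{np.percentile(delta_t, q):.2f} deg C")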

  11. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    International Nuclear Information System (INIS)

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  12. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    Energy Technology Data Exchange (ETDEWEB)

    Jodoin, Vincent J [ORNL; Lee, Ronald W [ORNL; Peplow, Douglas E. [ORNL; Lefebvre, Jordan P [ORNL

    2011-01-01

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  13. Analytical (mathematical) predictive modeling in fiber optics structural analysis (FOSA): review and extension

    Science.gov (United States)

    Suhir, Ephraim

    2015-03-01

    An updated version of the paper with revised references has been published. The review part of the paper addresses analytical (mathematical) modeling in structural analysis in fiber optics engineering, mostly fiber optics interconnects, and deals with optical fibers subjected to thermal and/or mechanical loading (stresses) in bending, tension, compression, or combinations of such loadings. Attributes and significance of predictive modeling are indicated and discussed. The review is based mostly on the author's research conducted at Bell Laboratories, Physical Sciences and Engineering Research Division, Murray Hill, NJ, USA, during his tenure with this company, and, to a lesser extent, on his recent work in the field. The addressed structures include, but are not limited to, optical fibers of finite length: bare fibers; jacketed and dual-coated fibers; fibers experiencing thermal loading; fibers soldered into ferrules or adhesively bonded into capillaries; as well as the roles of geometric and material non-linearity; dynamic response to shocks and vibrations; and possible applications of nano-materials in new generations of coating and cladding systems. The extension part is concerned with a novel, fruitful and challenging direction: probabilistic design for reliability (PDfR) of opto-electronic and photonic products, including optical fibers and interconnects. The rationale behind the PDfR concept is that there is no such thing as zero probability of failure, that the difference between a highly reliable product and an insufficiently reliable product is "merely" in the level of this never-zero probability of failure, and that when the operational performance of the product is imperative, the ability to predict, quantify, assure and, if possible and appropriate, even specify its reliability is highly desirable. Accordingly, the objective of the PDfR effort is to quantify the likelihood of an operational failure of a material, device or a system, including the
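
    One standard calculation consistent with the PDfR idea, though not necessarily Suhir's specific formulation, is stress-strength interference: with normally distributed load and capacity, the probability of failure is finite but never zero, as the sketch below shows with invented numbers.

        # Stress-strength interference: P_f = P(load > capacity) for normal
        # load L and capacity C. Numbers are invented for illustration.
        from math import sqrt
        from statistics import NormalDist

        mu_load, sd_load = 80.0, 10.0        # e.g. fiber stress, MPa
        mu_cap, sd_cap = 120.0, 12.0         # e.g. fiber strength, MPa

        # L - C is normal with mean (mu_load - mu_cap) and combined variance.
        margin = NormalDist(mu_load - mu_cap, sqrt(sd_load**2 + sd_cap**2))
        p_fail = 1.0 - margin.cdf(0.0)       # P(L - C > 0), never exactly zero
        print(f"probability of failure: {p_fail:.2e}")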

  14. From extensive micro-stability analysis of experiments to integrated modelling

    International Nuclear Information System (INIS)

    So far, interpretative or predictive simulations of fusion plasmas are performed using either empirical transport models or gyro-fluid, or mixed gyro-fluid/gyro-kinetic, models. The most complete turbulence simulations require full non-linear gyro-kinetic codes. However, such codes cannot be coupled to an integrated simulation since they are still too demanding in terms of computing time. Nevertheless, fast linear gyro-kinetic codes can provide quasi-linear heat, particle and momentum fluxes that can then be coupled to transport codes. This approach significantly improves the interpretation and prediction of turbulent transport. Indeed, linear gyro-kinetic codes predict precisely the turbulence threshold and its parametric behaviour. They also allow clarifying basic transport scaling. For this purpose, we used the code Kinezero [C. Bourdelle et al., Nucl. Fusion 42, 892 (2002)] coupled to the integrated transport code CRONOS [V. Basiuk et al., Nucl. Fusion 43, 822-830 (2003)]. In the first part of the paper, an extensive analysis of experiments is reported, mainly for advanced scenarios performed in existing tokamak devices. The impacts of the key parameters on the plasma performance are analysed, namely the magnetic shear s and the MHD α parameter [C. Bourdelle et al., Nucl. Fusion 45, 110-130 (2005)]. The α stabilization, reinforced by a high pressure gradient |∇P|, is particularly discussed. In the second part of the paper, first results of CRONOS interpretative/predictive simulations are presented. In these simulations, we use a transport model derived from quasi-linear heat, particle (including impurity) and momentum fluxes provided by Kinezero. The possibility of driving the plasma into a positive feedback loop thanks to the α stabilization is also discussed. Finally, we report the consequences of such an approach on transport prediction for ITER. (author)

  15. Analysis of Marketing Mix Tools in a Chosen Company

    OpenAIRE

    Havlíková, Žaneta

    2008-01-01

    The complex structure of the marketing mix of the company Velteko s.r.o. is analysed in my work. The main objectives are characterising the company, analysing the individual tools of the marketing mix, and evaluating a client survey. The conclusion focuses on rating the level and efficiency of utilization of the tools used. Relevant recommendations are added.

  16. Efficiency of village extension agents in Nigeria: Evidence from a data envelopment analysis

    OpenAIRE

    Ibrahim Hassan I.; Salau Emmanuel S.

    2016-01-01

    Determining the technical efficiency of extension personnel, especially at the village level, is paramount if farm productivity is to be increased. The present study determined the technical efficiency of Village Extension Agents (VEAs) in North Central Nigeria. Data for the study were collected using a structured questionnaire administered to 81 VEAs. The findings of the study indicated that 32.1% of the VEAs were aged between 38 and 45 years with a m

  17. ANALYSIS OF THE ASSOCIATION BETWEEN COMPETENCE AND PERFORMANCE-FOCUSING ON FARMERS AND EXTENSION WORKERS

    OpenAIRE

    Sue-Ho Chae; Yoon-Doo Kim; Hae-Jin Lim

    2014-01-01

    The purpose of this study is to investigate the association between the competence of farmers and extension workers and the performance of the agricultural organizations in which they are involved. To this end, 20 competences of farmers and agricultural extension workers (10 of each), based on preceding studies, were selected as the independent variables. The dependent variable was defined as the process performance of the agricultural organization. Control variables were also selected for ea...

  18. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    Science.gov (United States)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and, more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for specific locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently, providers lack ready access to, or knowledge of, reliable, trusted data sets and sound, scientifically accepted analysis techniques that would let them respond rapidly to the queries they receive. In order to support National Weather Service (NWS) local office forecasters with the information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular through machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues, and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center
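
    One of the analysis techniques listed, trend analysis, is easily illustrated. The sketch below fits a linear trend to a synthetic local annual-temperature series; the data are invented, not NWS COOP observations.

        # Linear trend analysis of an annual-temperature series.
        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(7)
        years = np.arange(1950, 2012)
        temps = 12.0 + 0.02 * (years - 1950) + rng.normal(0, 0.5, years.size)

        fit = linregress(years, temps)
        print(f"trend: {10 * fit.slope:.2f} deg C per decade "
              f"(p = {fit.pvalue:.3f})")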

  19. Jitterbug and TrueTime: Analysis Tools for Real-Time Control Systems

    OpenAIRE

    Cervin, Anton; Henriksson, Dan; Lincoln, Bo; Årzén, Karl-Erik

    2002-01-01

    The paper presents two recently developed, Matlab-based analysis tools for realtime control systems. The first tool, called Jitterbug, is used to compute a performance criterion for a control loop under various timing conditions. The tool makes it easy to quickly judge how sensitive a controller is to implementation effects such as slow sampling, delays, jitter, etc. The second tool, called TrueTime, allows detailed co-simulation of process dynamics, control task execution, and network commun...

  20. Virtual experiments as a data analysis tool for neutron scattering measurements

    OpenAIRE

    Farhi, Emmanuel

    2014-01-01

    This work demonstrates the use, and the limitations, of virtual experiments with applications to a liquid He4 ultra-cold neutron moderator study, a liquid indium neutron inelastic scattering experiment, and a neutron Rietveld refinement methodology. Extensions to other topics are discussed. The main tools used for this study are McStas and iFit.

  1. Capacity Building and Community Resilience: A Pilot Analysis of Education and Employment Indicators before and after an Extension Intervention

    Science.gov (United States)

    Weaver, Russell

    2016-01-01

    This article reports on an analysis of the effects of a quasinatural experiment in which 16 rural communities participated in public discussion, leadership training, and community visioning as part of an Extension program at Montana State University. Difference-in-differences methods reveal that key U.S. Census socioeconomic indicators either…

  2. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint.

    Science.gov (United States)

    Du, Hong-Gen; Liao, Sheng-Hui; Jiang, Zhong; Huang, Huan-Ming; Ning, Xi-Tao; Jiang, Neng-Yi; Pei, Jian-Wei; Huang, Qin; Wei, Hui

    2016-05-01

    This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D finite element (FE) model, which integrated the anatomical structure based on spine CT and MRI data. Graphic techniques were then utilized to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. According to the actual press-extension experiments, mechanical parameters were collected to set the boundary conditions for the FE analysis. The results demonstrate that the press-extension technique induces an obvious effect on the annulus fibrosus, displacing the central nucleus pulposus forward and increasing the pressure in the anterior part. The study concludes that finite element modelling of the lumbar spine is suitable for analysing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the mechanism by which the press-extension technique treats intervertebral disc herniation. PMID:27275119

  3. Toward a coherent set of radiative transfer tools for the analysis of planetary atmospheres .

    Science.gov (United States)

    Grassi, D.; Ignatiev, N. I.; Zasova, L. V.; Piccioni, G.; Adriani, A.; Moriconi, M. L.; Sindoni, G.; D'Aversa, E.; Snels, M.; Altieri, F.; Migliorini, A.; Stefani, S.; Politi, R.; Dinelli, B. M.; Geminale, A.; Rinaldi, G.

    The IAPS experience in the field of analysis of planetary atmospheres from visual and infrared measurements dates back to the early 1990s, in the frame of the IFSI participation in the Mars96 program. Since then, the forward models as well as the retrieval schemes have been constantly updated and have seen extensive usage in the analysis of data from the Mars Express, Venus Express and Cassini missions. On the eve of a new series of missions (Juno, ExoMars, JUICE), we review the tools currently available to the Italian community, the latest developments and future perspectives. Notably, recent reanalysis of PFS-MEX and VIRTIS-VEX data (Grassi et al., 2014) led to a full convergence of complete Bayesian retrieval schemes and approximate forward models, achieving a degree of maturity and flexibility quite close to the state-of-the-art NEMESIS package (Irwin et al., 2007). As a test case, the retrieval code for the JIRAM observations of hot spots will be discussed, with extensive validation against simulated observations.

  4. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020

  5. Expressed sequence tags as a tool for phylogenetic analysis of placental mammal evolution.

    Directory of Open Access Journals (Sweden)

    Morgan Kullberg

    Full Text Available BACKGROUND: We investigate the usefulness of expressed sequence tags (ESTs) for establishing divergences within the tree of placental mammals. This is done using the example of the established relationships among primates (human), lagomorphs (rabbit), rodents (rat and mouse), artiodactyls (cow), carnivorans (dog) and proboscideans (elephant). METHODOLOGY/PRINCIPAL FINDINGS: We have produced 2000 ESTs (1.2 megabases) from a marsupial mouse and characterized the data for their use in phylogenetic analysis. The sequences were used to identify putative orthologous sequences from whole genome projects. Although most ESTs stem from single sequence reads, the frequency of potential sequencing errors was found to be lower than allelic variation. Most of the sequences represented slowly evolving housekeeping-type genes, with an average amino acid distance of 6.6% between human and mouse. Positive Darwinian selection was identified at only a few single sites. Phylogenetic analyses of the EST data yielded trees that were consistent with those established from whole genome projects. CONCLUSIONS: The general quality of EST sequences and the general absence of positive selection in these sequences make ESTs an attractive tool for phylogenetic analysis. The EST approach allows, at reasonable cost, a fast extension of data sampling to species outside the genome projects.

  6. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  7. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
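
    The standard (uncorrected) multispecimen ratio and the bootstrap step can be sketched as follows, assuming the ratio Q_DB = (m1 - m0)/m0 of Dekkers and Böhnel (2006), with the paleointensity taken at the zero crossing of the linear fit of Q_DB against the applied laboratory field. The data are synthetic and the reliability criteria are omitted; MSP-Tool itself is written in VBA, not Python.

        # Standard MSP ratio with a bootstrap paleointensity estimate.
        import numpy as np

        rng = np.random.default_rng(3)
        h_lab = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])  # lab fields, uT
        h_anc = 35.0                                             # "true" field
        m0 = np.ones_like(h_lab)                                 # normalized NRM
        m1 = m0 * (1 + 0.4 * (h_lab - h_anc) / h_anc) + rng.normal(0, 0.02, 6)
        q_db = (m1 - m0) / m0                                    # Q_DB per specimen

        def zero_crossing(h, q):
            slope, intercept = np.polyfit(h, q, 1)
            return -intercept / slope

        estimates = []
        for _ in range(1000):                                    # bootstrap
            idx = rng.integers(0, len(h_lab), len(h_lab))
            if len(np.unique(h_lab[idx])) < 2:
                continue                                         # degenerate fit
            estimates.append(zero_crossing(h_lab[idx], q_db[idx]))

        lo, hi = np.percentile(estimates, [2.5, 97.5])
        print(f"paleointensity: {zero_crossing(h_lab, q_db):.1f} uT "
              f"(95% bootstrap CI {lo:.1f}-{hi:.1f})")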

  8. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Directory of Open Access Journals (Sweden)

    Marilyn Wilhelmina Leonora Monster

    2015-12-01

    Full Text Available The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  9. Fatigue analysis-based numerical design of stamping tools made of cast iron

    OpenAIRE

    Ben Slima, Khalil; Penazzi, Luc; Mabru, Catherine; Ronde-Oustau, François

    2013-01-01

    This work concerns stress and fatigue analysis of stamping tools made of cast iron with an essentially pearlitic matrix and containing foundry defects. Our approach consists, at first, in coupling stamping process simulations and structural analysis in order to improve the tool geometry and stiffness, minimizing the stress state and optimizing the tool's fatigue lifetime. The method consists in simulating the stamping process by considering the tool as a per...

  10. WOPANets: a tool for WOrst case performance analysis of embedded networks

    OpenAIRE

    Mifdaoui, Ahlem; Ayed, Hamdi

    2010-01-01

    WOPANets (WOrst case Performance Analysis of embedded Networks) is a design decision-aid tool for embedded networks. This tool offers an ergonomic interface that lets the designer describe the network and the circulating traffic, and it embodies a static performance evaluation technique based on Network Calculus theory, combined with optimization analysis, to support early system-level design exploration for embedded networks. In this paper, we describe the feature set of the WOPANets tool and we p

  11. Designing an Exploratory Text Analysis Tool for Humanities and Social Sciences Research

    OpenAIRE

    Shrikumar, Aditi

    2013-01-01

    This dissertation presents a new tool for exploratory text analysis that attempts to improve the experience of navigating and exploring text and its metadata. The design of the tool was motivated by the unmet need for text analysis tools in the humanities and social sciences. In these fields, it is common for scholars to have hundreds or thousands of text-based source documents of interest from which they extract evidence for complex arguments about society and culture. These collections are...

  12. A Change Oriented Extension of EOF Analysis Applied to the 1996-1997 AVHRR Sea Surface Temperature Data

    OpenAIRE

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of orthogonal transformations to detect multivariate change in the monthly mean sea surface temperature (SST) as given by the NOAA/NASA Oceans Pathfinder data. The transforms applied include multivariate alteration detection (MAD) variates based on canonical correlation analysis, and maximum autocorrelation factors (MAFs). The method described can be considered as an extension to EOF analysis that is specially tailored for change detection in spatial data ...
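
    A hedged sketch of the MAD construction named above: canonical correlation analysis of the two acquisition dates, with the MAD variates taken as differences of paired canonical variates and a chi-square-like change score formed from them. The data below are synthetic, not the Pathfinder SST fields.

        # MAD-style change detection via canonical correlation analysis.
        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        n_pixels, n_bands = 5000, 4
        X = rng.normal(size=(n_pixels, n_bands))           # time 1 observations
        Y = X + rng.normal(scale=0.1, size=X.shape)        # time 2, mostly same
        Y[:200] += 2.0                                     # a patch of change

        cca = CCA(n_components=n_bands, scale=True)
        U, V = cca.fit_transform(X, Y)                     # canonical variates
        mad = U - V                                        # MAD variates
        # Chi-square-like change score: sum of squared standardized MADs.
        z = (mad / mad.std(axis=0)) ** 2
        score = z.sum(axis=1)
        print("mean change score, changed vs unchanged pixels:",
              round(score[:200].mean(), 1), "vs", round(score[200:].mean(), 1))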

  13. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The
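
    For the plane-motion case, the integration the authors describe can be sketched directly: the Lagrangian momentum equation ρ0 ∂u/∂t = ∂σ/∂h is integrated across particle-velocity records u(t) at successive gauge positions to recover the stress history. The records below are synthetic, and real analyses of attenuating unsteady waves require more care.

        # Integrating the Lagrangian momentum equation across gauge records.
        import numpy as np

        rho0 = 2700.0                                  # initial density, kg/m^3
        dt = 1e-8                                      # s
        t = np.arange(0, 2e-6, dt)
        h = np.array([0.0, 1e-3, 2e-3, 3e-3])          # gauge positions, m

        def u_record(pos):
            """Synthetic attenuating velocity ramp, arriving later at depth."""
            arrival = pos / 5000.0                     # assumed 5 km/s wave speed
            return 100.0 * np.exp(-pos / 5e-3) * np.clip((t - arrival) / 2e-7, 0, 1)

        u = np.array([u_record(p) for p in h])         # shape (gauges, time)
        dudt = np.gradient(u, dt, axis=1)

        # sigma at gauge j: boundary stress plus trapezoidal integral in h
        # of rho0 * du/dt between adjacent gauges.
        sigma = [np.zeros_like(t)]                     # assume known at gauge 0
        for j in range(1, len(h)):
            dh = h[j] - h[j - 1]
            sigma.append(sigma[-1] + rho0 * 0.5 * (dudt[j] + dudt[j - 1]) * dh)
        print(f"peak stress at last gauge: {sigma[-1].max() / 1e6:.2f} MPa")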

  14. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  15. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning

    2014-01-01

    This paper introduces an interactive tool developed to integrate structural analysis in the architectural design environment from the early conceptual design stage. The tool improves exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot Struct...

  16. “DRYPACK” - a calculation and analysis tool

    DEFF Research Database (Denmark)

    Andreasen, M.B.; Toftegaard, R.; Schneider, P.;

    2013-01-01

    “DryPack” is a calculation tool that visualises the energy consumption of air-based and superheated steam drying processes. With “DryPack”, it is possible to add different components to a simple drying process and thereby increase flexibility, which makes it possible to analyse the most common… energy consumption reductions obtained by using “DryPack” are calculated. With the “DryPack” calculation tool, it is possible to calculate four different unit operations with moist air (dehumidification of air, humidification of air, mixing of two air streams, and heating of air). In addition, a Mollier diagram… with temperatures above 100°C may be generated…
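
    One of the four moist-air unit operations listed, mixing of two air streams, reduces to mass, moisture and energy balances. The sketch below uses the common enthalpy approximation h = 1.006*T + w*(2501 + 1.86*T) kJ/kg dry air; it is a generic psychrometric calculation, not DryPack's own implementation.

        # Adiabatic mixing of two moist-air streams (generic psychrometrics).
        def enthalpy(T, w):
            """Moist-air enthalpy, kJ/kg dry air (T in deg C, w in kg/kg)."""
            return 1.006 * T + w * (2501.0 + 1.86 * T)

        def mix(m1, T1, w1, m2, T2, w2):
            """Mass/energy/moisture balance for adiabatic mixing."""
            m = m1 + m2                                  # kg dry air / s
            w = (m1 * w1 + m2 * w2) / m                  # humidity-ratio balance
            h = (m1 * enthalpy(T1, w1) + m2 * enthalpy(T2, w2)) / m
            # Invert h = 1.006*T + w*(2501 + 1.86*T) for the mixed temperature.
            T = (h - 2501.0 * w) / (1.006 + 1.86 * w)
            return T, w

        T_mix, w_mix = mix(m1=2.0, T1=80.0, w1=0.050, m2=1.0, T2=20.0, w2=0.007)
        print(f"mixed stream: {T_mix:.1f} deg C, w = {w_mix:.4f} kg/kg")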

  17. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    Science.gov (United States)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
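
    The isogeometric building block named above, a NURBS curve evaluated from B-spline basis functions, can be shown compactly. The sketch below evaluates a quadratic NURBS quarter circle (exact with the chosen weights) via the Cox-de Boor recursion; isogeometric analysis then uses the same basis for the displacement field.

        # NURBS curve point from Cox-de Boor B-spline basis functions.
        import numpy as np

        def bspline_basis(i, p, u, U):
            """Cox-de Boor recursion for basis function N_{i,p}(u) on knots U."""
            if p == 0:
                return 1.0 if U[i] <= u < U[i + 1] else 0.0
            left = right = 0.0
            if U[i + p] > U[i]:
                left = (u - U[i]) / (U[i + p] - U[i]) * bspline_basis(i, p - 1, u, U)
            if U[i + p + 1] > U[i + 1]:
                right = ((U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1])
                         * bspline_basis(i + 1, p - 1, u, U))
            return left + right

        def nurbs_point(u, ctrl, w, p, U):
            N = np.array([bspline_basis(i, p, u, U) for i in range(len(ctrl))])
            return (N * w) @ ctrl / (N * w).sum()

        # Quadratic NURBS quarter circle of radius 1 (exact with these weights).
        U = [0, 0, 0, 1, 1, 1]
        ctrl = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
        w = np.array([1.0, np.sqrt(2) / 2, 1.0])
        pt = nurbs_point(0.5, ctrl, w, 2, U)
        print("point on arc:", pt, "radius:", np.linalg.norm(pt))  # radius = 1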

  18. Pathway-PDT: a flexible pathway analysis tool for nuclear families

    OpenAIRE

    Park, Yo Son; Schmidt, Michael; Martin, Eden R.; Pericak-Vance, Margaret A.; Chung, Ren-Hua

    2013-01-01

    Background Pathway analysis based on Genome-Wide Association Studies (GWAS) data has become popular as a secondary analysis strategy. Although many pathway analysis tools have been developed for case–control studies, there is no tool that can use all information from raw genotypes in general nuclear families. We developed Pathway-PDT, which uses the framework of Pedigree Disequilibrium Test (PDT) for general family data, to perform pathway analysis based on raw genotypes in family-based GWAS....

  19. Deformation analysis of Hoffa's fat pad from CT images of knee flexion and extension

    Science.gov (United States)

    Hamarneh, Ghassan; Chu, Vincent; Bordalo-Rodrigues, Marcelo; Schweitzer, Mark

    2005-04-01

    Recent advances in medicine suggest that certain body fat may have a mechanical function in addition to its classical role of energy storage. In particular, we aim to analyze whether the intra-articular fat pad of Hoffa is merely a space filler or whether it changes shape to provide cushioning for the knee bones. Towards this goal, 3D CT images of real knees, as well as of a skeletal knee model with fat simulating Hoffa's pad, were acquired in both extension and flexion. Image segmentation was performed to automatically extract the real and simulated fat regions from the extension and flexion images. Utilizing the segmentation results as binary masks, we performed automatic multi-resolution image registration of the fat pad between flexed and extended knee positions. The resulting displacement fields from flexion-extension registration are examined and used to calculate local fat volume changes, thus providing insight into shape changes that may have a mechanical component.
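
    Local volume change can be quantified from a displacement field through the determinant of the deformation gradient (det F > 1 meaning local expansion). The sketch below applies this to a synthetic field; the study's actual registration pipeline is a separate step preceding this kind of post-processing.

        # Local volume change from a displacement field via det(F), F = I + grad(u).
        import numpy as np

        nz, ny, nx = 32, 32, 32
        z, y, x = np.meshgrid(np.arange(nz), np.arange(ny), np.arange(nx),
                              indexing="ij")
        # Synthetic displacement field (voxels): mild anisotropic deformation.
        u = np.stack([-0.02 * z, 0.01 * y, 0.0 * x])     # shape (3, nz, ny, nx)

        grads = [np.gradient(u[c], axis=(0, 1, 2)) for c in range(3)]
        F = np.zeros((3, 3, nz, ny, nx))
        for i in range(3):
            for j in range(3):
                F[i, j] = grads[i][j]                    # du_i / dx_j
                if i == j:
                    F[i, j] += 1.0                       # add identity
        detF = np.linalg.det(F.transpose(2, 3, 4, 0, 1))
        print(f"mean local volume ratio: {detF.mean():.3f}")  # < 1: compression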

  20. Extensions of indication throughout the drug product lifecycle: a quantitative analysis.

    Science.gov (United States)

    Langedijk, Joris; Whitehead, Christopher J; Slijkerman, Diederick S; Leufkens, Hubert G M; Schutjens, Marie-Hélène D B; Mantel-Teeuwisse, Aukje K

    2016-02-01

    The marketing authorisation of the first generic product version is an important moment in a drug product lifecycle. The subsequently changed intellectual property protection prospects could affect the incentives for further drug development. We assessed the quantity and nature of extensions of indication of small molecule medicinal products authorised through the European Medicines Agency throughout the drug product lifecycle, with special attention to the impact of the introduction of a first generic competitor. The majority (92.5%) of the extensions of indication was approved during the exclusivity period of the innovator product. Regulatory rethinking might be needed for a sustainable stimulation of extensions of indication in the post-generic period of a drug product lifecycle. PMID:26657087

  1. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. PMID:26679001

  2. INTEGRATED ANALYSIS AND TOOLS FOR LAND SUBSIDENCE SURVEYING AND MONITORING: A SEMI-QUANTITATIVE APPROACH

    Directory of Open Access Journals (Sweden)

    A. Mosconi

    2015-10-01

    Full Text Available This paper presents an integrated approach for land subsidence monitoring using measurements coming from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, and a method able to integrate the results is an important and topical issue. Nowadays the world is a multi-sensor platform, and integration of measurements is strictly necessary. Combining the different data sources should be done in a clever way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and tries to separate the contributions to subsidence. With this purpose Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. Some significant examples are presented to show the potential of this tool in oil and gas activity: a hydrocarbon storage field, where the comparison between SAR and production volumes emphasises a correlation between the two measurements in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV a mathematically rigorous study is not always possible, and a semi-quantitative approach is the only method for interpreting the results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offers several advantages in monitoring land subsidence: it permits a first qualitative "differentiation" of the natural and anthropic components of subsidence

  3. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    Science.gov (United States)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach for land subsidence monitoring using measurements coming from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, and a method able to integrate the results is an important and topical issue. Nowadays the world is a multi-sensor platform, and integration of measurements is strictly necessary. Combining the different data sources should be done in a clever way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and tries to separate the contributions to subsidence. With this purpose Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. Some significant examples are presented to show the potential of this tool in oil and gas activity: a hydrocarbon storage field, where the comparison between SAR and production volumes emphasises a correlation between the two measurements in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV a mathematically rigorous study is not always possible, and a semi-quantitative approach is the only method for interpreting the results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offers several advantages in monitoring land subsidence: it permits a first qualitative "differentiation" of the natural and anthropic components of subsidence, and also gives more

  4. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result, there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools, how to use them, when they should be used (and not used), and the intended purposes for their use. In addition, the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate their application in health care settings
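
    FMEA, mentioned above, ranks failure modes by a risk priority number (RPN), the product of severity, occurrence, and detection ratings, each conventionally scored from 1 (best) to 10 (worst). A minimal Python sketch with made-up radiotherapy failure modes (the names and ratings are illustrative, not taken from the article):

        # Hypothetical failure modes: (description, severity, occurrence, detection),
        # each rating on a 1 (best) to 10 (worst) scale; RPN = S * O * D.
        failure_modes = [
            ("wrong patient selected",      9, 2, 3),
            ("treatment plan not approved", 7, 3, 2),
            ("couch position mis-set",      6, 4, 5),
        ]

        ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3],
                        reverse=True)
        for name, s, o, d in ranked:
            print(f"RPN {s * o * d:4d}  {name}")  # highest-risk mode first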

  5. An Analysis of "In-Depth" Schools Conducted by Area Extension Agents in the Agricultural Industry.

    Science.gov (United States)

    Cunningham, Clarence J.

    The Ohio Extension Service conducted "in-depth" schools on Dairy Genetics and Reproduction, Beef Cattle, Capital Management, and Fertilizer and Lime at area centers in Wooster, Defiance and Fremont, Washington Court House, and McConnellsville. Two thirds of the instructional staff were area agents; others were specialists, resident staff, research…

  6. Using R-Project for Free Statistical Analysis in Extension Research

    Science.gov (United States)

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  7. Phosphoproteome analysis of streptomyces development reveals extensive protein phosphorylation accompanying bacterial differentiation

    DEFF Research Database (Denmark)

    Manteca, Angel; Ye, Juanying; Sánchez, Jesús;

    2011-01-01

    bacteria encoding the largest number of eukaryotic type kinases, the biological role of protein phosphorylation in this bacterium has not been extensively studied before. In this issue, the variations of the phosphoproteome of S. coelicolor were characterized. Most distinct Ser/Thr/Tyr phosphorylation...

  8. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical... changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets... to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods...

  9. MACD - Analysis of weaknesses of the most powerful technical analysis tool

    Directory of Open Access Journals (Sweden)

    Sanel Halilbegovic

    2016-05-01

    Full Text Available Due to the growing popularity of stock trading among young people, in recent years more and more trading and brokerage houses have been trying to find a single, easy-to-understand tool for novice traders. Moving average convergence divergence (MACD) seems to be the main pick, and unfortunately inexperienced traders rely on this one tool for the analysis and trading of various securities. In this paper, I investigate the validity of MACD as a 'magic wand' when used as the sole basis for investment trading decisions. The main limitation of this study is that it could be extended more widely across industries and various sizes of companies, funds, and other trading instruments.
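
    The indicator under study is fully specified by its conventional parameters: the MACD line is the difference between 12- and 26-period exponential moving averages, and the signal line is a 9-period EMA of the MACD line. A minimal Python sketch of that computation (using pandas; not the author's code):

        import pandas as pd

        def macd(close: pd.Series,
                 fast: int = 12, slow: int = 26, signal: int = 9) -> pd.DataFrame:
            """Standard MACD: fast EMA minus slow EMA, plus a signal line."""
            ema_fast = close.ewm(span=fast, adjust=False).mean()
            ema_slow = close.ewm(span=slow, adjust=False).mean()
            macd_line = ema_fast - ema_slow
            signal_line = macd_line.ewm(span=signal, adjust=False).mean()
            return pd.DataFrame({
                "macd": macd_line,
                "signal": signal_line,
                "histogram": macd_line - signal_line,  # sign flips at crossovers
            })

    Traders conventionally read a buy signal where the MACD line crosses above the signal line and a sell signal where it crosses below; the paper's point is that such crossovers are unreliable when used in isolation.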

  10. Analysis of online quizzes as a teaching and assessment tool

    Directory of Open Access Journals (Sweden)

    Lorenzo Salas-Morera

    2012-03-01

    Full Text Available This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the effectiveness of quizzes on student performance when used not only as an isolated assessment tool, but also when integrated into a combined strategy supporting the overall programming of the subject. The results obtained during the five years of experimentation with online quizzes show that such quizzes have a proven positive influence on students' academic performance. Furthermore, surveys conducted at the end of each course revealed the high value students accord to the use of online quizzes in course instruction.

  11. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  12. Storybuilder-A tool for the analysis of accident reports

    International Nuclear Information System (INIS)

    As part of an ongoing effort by the Ministry of Social Affairs and Employment of The Netherlands, a research project is being undertaken to construct a causal model for the most commonly occurring scenarios related to occupational risk. This model should provide quantitative insight into the causes and consequences of occupational accidents. The results should help in selecting optimal strategies to reduce these risks, taking the costs of accidents and of measures into account. The research is undertaken by an international consortium under the name of Workgroup Occupational Risk Model. One of the components of the model is a tool to systematically classify and analyse past accidents. This tool, 'Storybuilder', and its place in the Occupational Risk Model (ORM) are described in the paper. The paper gives some illustrations of the application of the Storybuilder, drawn from the study of ladder accidents, which form one of the biggest single accident categories in the Dutch data

  13. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  14. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    Efficient process monitoring and analysis tools provide the means for automated supervision and control of manufacturing plants and therefore play an important role in plant safety, process control and assurance of end product quality. The availability of a large number of different process monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools... On the one hand, it facilitates the selection of proper monitoring and analysis tools for a given application or process. On the other hand, it permits the identification of potential applications for a given monitoring technique or tool. An efficient inference system based on forward as well as reverse search...

  15. ProteinHistorian: tools for the comparative analysis of eukaryote protein origin.

    Directory of Open Access Journals (Sweden)

    John A Capra

    Full Text Available The evolutionary history of a protein reflects the functional history of its ancestors. Recent phylogenetic studies identified distinct evolutionary signatures that characterize proteins involved in cancer, Mendelian disease, and different ontogenic stages. Despite the potential to yield insight into the cellular functions and interactions of proteins, such comparative phylogenetic analyses are rarely performed, because they require custom algorithms. We developed ProteinHistorian to make tools for performing analyses of protein origins widely available. Given a list of proteins of interest, ProteinHistorian estimates the phylogenetic age of each protein, quantifies enrichment for proteins of specific ages, and compares variation in protein age with other protein attributes. ProteinHistorian allows flexibility in the definition of protein age by including several algorithms for estimating ages from different databases of evolutionary relationships. We illustrate the use of ProteinHistorian with three example analyses. First, we demonstrate that proteins with high expression in human, compared to chimpanzee and rhesus macaque, are significantly younger than those with human-specific low expression. Next, we show that human proteins with annotated regulatory functions are significantly younger than proteins with catalytic functions. Finally, we compare protein length and age in many eukaryotic species and, as expected from previous studies, find a positive, though often weak, correlation between protein age and length. ProteinHistorian is available through a web server with an intuitive interface and as a set of command line tools; this allows biologists and bioinformaticians alike to integrate these approaches into their analysis pipelines. ProteinHistorian's modular, extensible design facilitates the integration of new datasets and algorithms. The ProteinHistorian web server, source code, and pre-computed ages for 32 eukaryotic genomes are

  16. ProteinHistorian: tools for the comparative analysis of eukaryote protein origin.

    Science.gov (United States)

    Capra, John A; Williams, Alexander G; Pollard, Katherine S

    2012-01-01

    The evolutionary history of a protein reflects the functional history of its ancestors. Recent phylogenetic studies identified distinct evolutionary signatures that characterize proteins involved in cancer, Mendelian disease, and different ontogenic stages. Despite the potential to yield insight into the cellular functions and interactions of proteins, such comparative phylogenetic analyses are rarely performed, because they require custom algorithms. We developed ProteinHistorian to make tools for performing analyses of protein origins widely available. Given a list of proteins of interest, ProteinHistorian estimates the phylogenetic age of each protein, quantifies enrichment for proteins of specific ages, and compares variation in protein age with other protein attributes. ProteinHistorian allows flexibility in the definition of protein age by including several algorithms for estimating ages from different databases of evolutionary relationships. We illustrate the use of ProteinHistorian with three example analyses. First, we demonstrate that proteins with high expression in human, compared to chimpanzee and rhesus macaque, are significantly younger than those with human-specific low expression. Next, we show that human proteins with annotated regulatory functions are significantly younger than proteins with catalytic functions. Finally, we compare protein length and age in many eukaryotic species and, as expected from previous studies, find a positive, though often weak, correlation between protein age and length. ProteinHistorian is available through a web server with an intuitive interface and as a set of command line tools; this allows biologists and bioinformaticians alike to integrate these approaches into their analysis pipelines. ProteinHistorian's modular, extensible design facilitates the integration of new datasets and algorithms. The ProteinHistorian web server, source code, and pre-computed ages for 32 eukaryotic genomes are freely available under

  17. Evaluation of a Surface Exploration Traverse Analysis and Navigation Tool

    OpenAIRE

    Gilkey, Andrea L.; Galvan, Raquel Christine; Johnson, Aaron William; Kobrick, Ryan L.; Hoffman, Jeffrey A.; Melo, Paulo L.; Newman, Dava

    2011-01-01

    SEXTANT is an extravehicular activity (EVA) mission planner tool developed in MATLAB, which computes the most efficient path between waypoints across a planetary surface. The traverse efficiency can be optimized around path distance, time, or explorer energy consumption. The user can select waypoints and the time spent at each, and can visualize a 3D map of the optimal path. Once the optimal path is generated, the thermal load on suited astronauts or solar power generation of rovers is displa...

  18. Integration of management control tools. Analysis of a case study

    OpenAIRE

    Raúl Comas Rodríguez; Dianelys Nogueira Rivera; Félix Romero Bartutis; Marisdany Lumpuy Rodríguez

    2015-01-01

    The objective of this article is to design and implement a procedure that integrates process-focused management control tools to improve efficiency and efficacy. An experimental study was carried out in which a procedure, based on the Balanced Scorecard, was defined that integrates process management into strategic planning and its evaluation. As results of this work, we define the key success factors associated with the four perspectives of the Balanced Scorecard...

  19. Monitoring SOA Applications with SOOM Tools: A Competitive Analysis

    OpenAIRE

    Ivan Zoraja; Goran Trlin; Marko Matijević

    2013-01-01

    Background: Monitoring systems decouple monitoring functionality from application and infrastructure layers and provide a set of tools that can invoke operations on the application to be monitored. Objectives: Our monitoring system is a powerful yet agile solution that is able to observe and manipulate SOA (Service-oriented Architecture) applications online. The basic monitoring functionality is implemented via lightweight components inserted into SOA frameworks, thereby keeping the monitoring...

  20. The Mission Planning Lab: A Visualization and Analysis Tool

    Science.gov (United States)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

  1. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    OpenAIRE

    Tousif ur Rehman; Muhammad Naeem Ahmed Khan; Naveed Riaz

    2013-01-01

    Requirement engineering is an integral part of the software development lifecycle since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The stu...

  2. SNMP Traffic Analysis: Approaches, Tools, and First Results

    OpenAIRE

    Schönwälder, Jürgen; Pras, Aiko; Harvan, Matúš; Schippers, Jorrit; van de Meent, Remco

    2007-01-01

    The Simple Network Management Protocol (SNMP) is widely deployed to monitor, control, and configure network elements. Even though the SNMP technology is well documented and understood, it remains relatively unclear how SNMP is used in practice and what the typical SNMP usage patterns are. This paper discusses how to perform large-scale SNMP traffic measurements in order to develop a better understanding of how SNMP is used in production networks. The tools described in this paper have been ap...

  3. CeTA - A Tool for Certified Termination Analysis

    OpenAIRE

    Sternagel, Christian; Thiemann, René; Winkler, Sarah; Zankl, Harald

    2012-01-01

    Since the first termination competition in 2004 it has been of great interest whether a proof that has been automatically generated by a termination tool is indeed correct. The increasing number of termination proving techniques as well as the increasing complexity of generated proofs (e.g., combinations of several techniques, exhaustive labelings, tree automata, etc.) make certifying (i.e., checking the correctness of) such proofs more and more tedious for humans. Hence the interest in automate...

  4. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    OpenAIRE

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verificat...

  5. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    Science.gov (United States)

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-01

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also, most importantly, scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed using the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem. PMID:26709623
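
    The Bayes'-theorem core of a "combined PEP"-style approach can be sketched compactly. The following is a naive-Bayes illustration of combining posterior error probabilities (PEPs) from several engines under an independence assumption, with made-up values; it is not Ursgal's actual implementation, which is more elaborate:

        import numpy as np

        def combine_peps(peps: np.ndarray) -> np.ndarray:
            """Naive-Bayes combination of per-engine posterior error probabilities.

            peps has shape (n_psms, n_engines); each entry is the PEP that one
            engine assigns to a peptide-spectrum match.
            """
            p_wrong = np.prod(peps, axis=1)        # all engines wrong together
            p_right = np.prod(1.0 - peps, axis=1)  # all engines right together
            return p_wrong / (p_wrong + p_right)   # posterior under independence

        # Three engines agreeing on a confident hit sharpen the combined PEP:
        print(combine_peps(np.array([[0.01, 0.05, 0.02]])))  # ~1.1e-05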

  6. Matlab symbolic circuit analysis and simulation tool using PSpice netlist for circuits optimization

    OpenAIRE

    Ushie, OJ; Abbod, M; Ashigwuike, E

    2015-01-01

    This paper presents a new Matlab symbolic circuit analysis and simulation (MSCAM) tool that makes use of netlists from PSpice to generate matrices. These matrices can be used to calculate circuit parameters or for optimization. The tool can handle active and passive components such as resistors, capacitors, inductors, operational amplifiers, and transistors. Transistors are converted into their small-signal models, and operational amplifiers make use of the small-signal analysis, which can easily ...

  7. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    Directory of Open Access Journals (Sweden)

    Drechsel Marion

    2009-10-01

    Full Text Available Abstract Background: Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have easy access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. Findings: The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge and requirements for basic statistical analysis. Conclusion: Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
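
    As an illustration of the kind of conversion such an add-in performs (the actual SNP_tools VBA routines are not reproduced here), the following hypothetical Python sketch recodes textual genotypes into minor-allele counts, the numeric coding most association tests expect:

        from collections import Counter

        def encode_genotypes(genotypes: list[str]) -> list[int]:
            """Recode genotypes like "AG" as counts of the minor allele (0/1/2)."""
            alleles = Counter(a for g in genotypes for a in g)
            minor = min(alleles, key=alleles.get)  # least frequent allele
            return [g.count(minor) for g in genotypes]

        print(encode_genotypes(["AA", "AG", "GG", "GG"]))  # [2, 1, 0, 0]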

  8. EXTENSION OF SEQUENCE STRATIGRAPHIC MODEL AND SEQUENCE STRATIGRAPHC ANALYSIS IN LIMNIC DEPOSITIONAL BASIN

    Institute of Scientific and Technical Information of China (English)

    李增学; 魏久传; 王民镇; 李守春; 李青山; 金秀昆; 兰恒星

    1996-01-01

    The architectural patterns of sedimentary successions differ among depositional basins. The sedimentary architecture and geological conditions of basins such as epicontinental seas and intraplate limnic basins differ clearly from those of continental-margin basins. The classical sequence model therefore needs to be extended, complemented and refined in studies of various depositional basins. For this reason, this paper expounds the rationale, principles of sequence division, methodology and techniques of sequence stratigraphy in epicontinental and limnic basins.

  9. Analysis and control of welding deformation in Qinshan nuclear power phase II extension project

    International Nuclear Information System (INIS)

    The paper analyzes the severe deformation in the welding of the core barrel in Qinshan Nuclear Power Phase II Extension Project Reactor No. 3 unit, which nearly caused the core barrel to lose its function. Measures such as improving the welding fixture, process and parameters, and loading the counterweight were taken for the No. 4 unit to minimize the deformation, and the result shows that the weld of the No. 4 core barrel satisfies the design requirements. (authors)

  10. Agricultural extension services and gender equality: An institutional analysis of four districts in Ethiopia

    OpenAIRE

    Cohen, Marc J.; Lemma, Mamusha

    2011-01-01

    Decentralized delivery of public services has been promoted as a means to enhance citizen voice and make service provision more responsive to users. Ethiopia has undertaken two rounds of decentralization, making first the regional states and then the district governments responsible for providing key public services. This paper explores whether decentralization has improved the quality of service delivery and citizen satisfaction with the services provided, focusing on agricultural extension....

  11. Efficiency of village extension agents in Nigeria: Evidence from a data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Ibrahim Hassan I.

    2016-01-01

    Full Text Available Determining the technical efficiency of extension personnel, especially at the village level, is paramount if farm productivity is to be increased. The present study determined the technical efficiency of Village Extension Agents (VEAs) in North Central Nigeria. Data for the study were collected using a structured questionnaire administered to 81 VEAs. The findings of the study indicated that 32.1% of the VEAs were aged between 38 and 45 years, with a mean age of 41 years, while 50.6% were holders of national diploma certificates. The monthly income of a VEA ranged between N16,000 and N21,000. The average technical efficiency of VEAs was 42%, with minimum and maximum values of 0.03 and 1 respectively. There was a positive significant association between the age (P<0.10), education (P<0.10) and income (P<0.01) of VEAs and their technical efficiency levels. The results imply that prompt payment of allowances/salaries, regular promotion and training are the necessary impetus that can improve agricultural extension service delivery in Nigeria, particularly at the village level.
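
    Data envelopment analysis reduces, for each decision-making unit (DMU), to a small linear program. A minimal Python sketch of the input-oriented CCR model with made-up data (the study's actual input and output variables are not reproduced here):

        import numpy as np
        from scipy.optimize import linprog

        def dea_ccr_input(X, Y):
            """Input-oriented CCR efficiency score for each DMU.
            X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
            n = X.shape[0]
            scores = []
            for k in range(n):
                # decision variables: [theta, lambda_1 ... lambda_n]
                c = np.r_[1.0, np.zeros(n)]
                # inputs:  sum_j lambda_j * x_ij <= theta * x_ik
                A_in = np.hstack([-X[k:k + 1].T, X.T])
                # outputs: sum_j lambda_j * y_rj >= y_rk
                A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
                res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                              b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                              bounds=[(0, None)] * (n + 1))
                scores.append(res.fun)
            return np.array(scores)

        # toy data: one input (hours worked), one output (farmers reached)
        X = np.array([[40.0], [60.0], [50.0]])
        Y = np.array([[200.0], [240.0], [250.0]])
        print(dea_ccr_input(X, Y))  # [1.0, 0.8, 1.0]: the second agent is inefficient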

  12. Multiscale Analysis of Surface Topography from Single Point Incremental Forming using an Acetal Tool

    International Nuclear Information System (INIS)

    Single point incremental forming (SPIF) is a sheet metal manufacturing process that forms a part by incrementally applying point loads to the material to achieve the desired deformations and final part geometry. This paper investigates the differences in surface topography between a carbide tool and an acetal-tipped tool. Area-scale analysis is performed on the confocal areal surface measurements per ASME B46. The objective of this paper is to determine at which scales surfaces formed by two different tool materials can be differentiated. It is found that the surfaces in contact with the acetal forming tool have greater relative areas at all scales greater than 5 × 10⁴ μm² than the surfaces in contact with the carbide tools. The surfaces not in contact with the tools during forming, also referred to as the free surface, are unaffected by the tool material

  13. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.

    2002-01-01

    using new technology developed by Böhler. All three steels have the same nominal composition of alloying elements. The failure in both types of material occurs as a crack formation at a notch inside of the tool. Generally the cold forging dies constructed in third generation steels have a longer lifetime...... than the ones constructed in traditional steel, which is connected to differences in microstructure. Focus has been put on differences in the size and distribution of carbides. It is found that the third generation steel contains smaller and more finely dispersed carbides and has an increased......

  14. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.

  15. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    OpenAIRE

    Wenjie Tian; Weiguo Gao; Wenfen Chang; Yingxin Nie

    2014-01-01

    Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy im...

  16. The Gender Analysis Tools Applied in Natural Disasters Management: A Systematic Literature Review

    OpenAIRE

    Sohrabizadeh, Sanaz; Tourani, Sogand; Khankeh, Hamid Reza

    2014-01-01

    Background: Although natural disasters have caused considerable damages around the world, and gender analysis can improve community disaster preparedness or mitigation, there is little research about the gendered analytical tools and methods in communities exposed to natural disasters and hazards. These tools evaluate gender vulnerability and capacity in pre-disaster and post-disaster phases of the disaster management cycle. Objectives: Identifying the analytical gender tools and the strength...

  17. Comparative transcriptional pathway bioinformatic analysis of dietary restriction, Sir2, p53 and resveratrol life span extension in Drosophila

    OpenAIRE

    Antosh, Michael; Whitaker, Rachel; Kroll, Adam; Hosier, Suzanne; Chang, Chengyi; Bauer, Johannes; Cooper, Leon; Neretti, Nicola; HELFAND, STEPHEN L.

    2011-01-01

    A multiple comparison approach using whole genome transcriptional arrays was used to identify genes and pathways involved in calorie restriction/dietary restriction (DR) life span extension in Drosophila. A gene-centric analysis comparing the changes in common between DR and two DR-related molecular genetic life span extending manipulations, Sir2 and p53, led to a molecular confirmation of Sir2's and p53's similarity with DR and the identification of a small set of commonly regul...

  18. Symmetry Analysis and Conservation Laws to the (2+1)-Dimensional Coupled Nonlinear Extension of the Reaction-Diffusion Equation

    International Nuclear Information System (INIS)

    In this paper, a detailed Lie symmetry analysis of the (2+1)-dimensional coupled nonlinear extension of the reaction-diffusion equation is presented. The general finite transformation group is derived via a simple direct method, and is in fact equivalent to the Lie point symmetry group. Similarity reduction and some exact solutions of the original equation are obtained based on the optimal system of one-dimensional subalgebras. In addition, conservation laws are constructed by employing the new conservation theorem. (general)

  19. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  20. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for complex fluid/structure interaction phenomena is increasing as proven numerical and visualization...

  1. Hyperbolic Error Analysis and Parametric Optimization of Round Body Form Tool

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the merits of easy manufacture, long service life, and the ability to machine inside or outside form surfaces, the round body form tool is in extensive use in large-scale production. Its main demerit is the large hyperbolic error caused in the process of machining cones; however, the treatment of hyperbolic error in current books and documents has two drawbacks: (1) the error measuring plane is established on the rake face of the tool, which doesn't coincide with the actual measuring pl...

  2. Integrated structural analysis tool using the Linear Matching Method part 2 – Application and verification

    International Nuclear Information System (INIS)

    In an accompanying paper, a new integrated structural analysis tool using the Linear Matching Method framework for the assessment of design limits in plasticity including load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures was developed using Abaqus CAE plug-ins with graphical user interfaces. In the present paper, a demonstration of the use of this new Linear Matching Method analysis tool is provided. A header branch pipe in a typical advanced gas-cooled reactor power plant is analysed as a worked example of the current demonstration and verification of the Linear Matching Method tool within the context of an R5 assessment. The detailed shakedown analysis, steady state cycle and ratchet analysis are carried out for the chosen header branch pipe. The comparisons of the Linear Matching Method solutions with results based on the R5 procedure and step-by-step elastic–plastic finite element analysis verify the accuracy, convenience and efficiency of this new integrated Linear Matching Method structural analysis tool. - Highlights: • The demonstration of the use of a new LMM software tool is provided. • A header branch pipe in a typical AGR power plant is analysed as a worked example. • The verification of LMM software tool is within the context of an R5 assessment. • We include the R5 procedure and step-by-step elastic–plastic FEA for comparison. • We verify the accuracy, convenience and efficiency of this new integrated LMM tool

  3. The Photoplethismographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method; and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and also for estimating the signal's stochastic components
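
    Higuchi's time-domain method mentioned in (iii) is compact enough to sketch. In a minimal NumPy version (an illustration, not the authors' implementation), the signal is subsampled at lags k, the mean curve length L(k) is computed, and the fractal dimension is the slope of log L(k) versus log(1/k):

        import numpy as np

        def higuchi_fd(x: np.ndarray, kmax: int = 10) -> float:
            """Estimate the fractal dimension of a 1-D signal (Higuchi's method)."""
            n = len(x)
            lengths = []
            for k in range(1, kmax + 1):
                lk = []
                for m in range(k):
                    idx = np.arange(m, n, k)                 # subsampled series
                    diff = np.abs(np.diff(x[idx])).sum()
                    norm = (n - 1) / ((len(idx) - 1) * k)    # Higuchi normalisation
                    lk.append(diff * norm / k)
                lengths.append(np.mean(lk))
            ks = np.arange(1, kmax + 1)
            # FD is the slope of log L(k) against log(1/k)
            slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
            return slope

        rng = np.random.default_rng(0)
        walk = np.cumsum(rng.standard_normal(2000))
        print(higuchi_fd(walk))  # close to 1.5 for Brownian motion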

  4. Electromyographic Analysis of the Hip Extension Pattern in Visually Impaired Athletes

    Directory of Open Access Journals (Sweden)

    Halski Tomasz

    2015-12-01

    Full Text Available The objective of the study was to determine the order of muscle recruitment during active hip joint extension in particular positions in young visually impaired athletes. The average recruitment time (ART) of the gluteus maximus (GM) and the hamstring muscle group (HMG) was assessed by means of surface electromyography (sEMG). The sequence of muscle recruitment in the female and male groups was also taken into consideration. This study followed a prospective, cross-sectional, randomised design, in which 76 visually impaired athletes between the ages of 18 and 25 years were enrolled and selected according to chosen inclusion and exclusion criteria. Finally, 64 young subjects (32 men and 32 women) were included in the study (age: 21.1 ± 1.05 years; body mass: 68.4 ± 12.4 kg; body height: 1.74 ± 0.09 m; BMI: 22.20 ± 2.25 kg/m²). All subjects were analysed for the ART of the GM and HMG during active hip extension performed in two different positions, as well as for the resting and functional sEMG activity of each muscle. Between-gender differences were compared, and the correlations between the ART of the GM and HMG and their functional sEMG activity during hip extension in both positions were examined. No significant differences between the ART of the GM and HMG were found (p>0.05). Furthermore, there was no significant difference in ART between the two tested positions, in either male or female subjects (p>0.05).

  5. Electromyographic Analysis of the Hip Extension Pattern in Visually Impaired Athletes.

    Science.gov (United States)

    Halski, Tomasz; Żmijewski, Piotr; Cięszczyk, Paweł; Nowak, Barbara; Ptaszkowski, Kuba; Slupska, Lucyna; Dymarek, Robert; Taradaj, Jakub

    2015-11-22

    The objective of the study was to determine the order of muscle recruitment during active hip joint extension in particular positions in young visually impaired athletes. The average recruitment time (ART) of the gluteus maximus (GM) and the hamstring muscle group (HMG) was assessed by means of surface electromyography (sEMG). The sequence of muscle recruitment in the female and male groups was also taken into consideration. This study followed a prospective, cross-sectional, randomised design, in which 76 visually impaired athletes between the ages of 18 and 25 years were enrolled and selected according to chosen inclusion and exclusion criteria. Finally, 64 young subjects (32 men and 32 women) were included in the study (age: 21.1 ± 1.05 years; body mass: 68.4 ± 12.4 kg; body height: 1.74 ± 0.09 m; BMI: 22.20 ± 2.25 kg/m²). All subjects were analysed for the ART of the GM and HMG during active hip extension performed in two different positions, as well as for the resting and functional sEMG activity of each muscle. Between-gender differences were compared, and the correlations between the ART of the GM and HMG and their functional sEMG activity during hip extension in both positions were examined. No significant differences between the ART of the GM and HMG were found (p>0.05). Furthermore, there was no significant difference in ART between the two tested positions, in either male or female subjects (p>0.05). PMID:26834873

  6. HAWCStab2 with super element foundations: A new tool for frequency analysis of offshore wind turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Hansen, Anders Melchior; Kragh, Knud Abildgaard; Yde, Anders

    HAWCStab2 is a linear frequency domain aero-elastic tool, developed by DTU Wind Energy, suitable for frequency and stability analysis of horizontal axis 3 bladed wind turbines [1]. This tool has now been extended to also handle complex offshore foundation types, such as jacket structures and...

  7. Analysis of the extensive air showers of ultra-high energy

    International Nuclear Information System (INIS)

    In studying extensive air showers (EAS) that correlate with pulsars, we found showers without a muon component. Here we analyse the arrival directions of EAS with a poor muon component or none at all. We find that the arrival directions of these showers correlate with certain pulsars whose distribution is more isotropic. Among these, pulsars with a short rotation period are more prevalent than expected from the pulsar catalogue. In this connection the data of the world arrays are considered.

  8. Technical analysis as a tool of market timing

    International Nuclear Information System (INIS)

    Risk management teams carry a heavy burden because businesses have to compete on a global scale. Timing is everything once a decision to hedge in the futures has been made. Determining when it is a good time to buy or sell, and being certain that this information has not already been factored into the price, is tricky. There are two schools: fundamental analysis and technical analysis. Fundamental analysis examines statistics and supply-and-demand data to determine why, while technical analysis determines when by interpreting charts. With the help of charts that were displayed, the author examined the price of crude oil in 1990 and 1991 in an attempt to demonstrate technical analysis. Technical analysis is based on mathematics, but it is more art than science. It looks at market patterns that repeat themselves endlessly. Markets almost always enter a period of consolidation or distribution when they break out. Additional charts displaying the price of crude oil for various periods from 1986 to 1991 were presented and analysed. The author concluded by stating that technical analysis provides visual guidelines to hedgers and traders to assist them in making intelligent forecasts about price and risk. figs

  9. Limits, limits everywhere the tools of mathematical analysis

    CERN Document Server

    Applebaum, David

    2012-01-01

    A quantity can be made smaller and smaller without it ever vanishing. This fact has profound consequences for science, technology, and even the way we think about numbers. In this book, we will explore this idea by moving at an easy pace through an account of elementary real analysis and, in particular, will focus on numbers, sequences, and series.Almost all textbooks on introductory analysis assume some background in calculus. This book doesn't and, instead, the emphasis is on the application of analysis to number theory. The book is split into two parts. Part 1 follows a standard university

  10. ATLASWatchMan, a tool for automatized data analysis

    International Nuclear Information System (INIS)

    The ATLAS detector will soon start taking data, and many New Physics phenomena are expected. The ATLASWatchMan package has been developed following the principles of CASE (Computer Aided Software Engineering); it helps the user set up any analysis by automatically generating the actual analysis code and data files from user settings. ATLASWatchMan provides a light and transparent framework to plug in user-defined cuts and algorithms to look at as many channels as the user wants, running the analysis both locally and on the Grid. Examples of analyses run with the package using the latest release of the ATLAS software are shown

  11. A Change Oriented Extension of EOF Analysis Applied to the 1996-1997 AVHRR Sea Surface Temperature Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of orthogonal transformations to detect multivariate change in the monthly mean sea surface temperature (SST) as given by the NOAA/NASA Oceans Pathfinder data. The transforms applied include multivariate alteration detection (MAD) variates based on canonical...... correlation analysis, and maximum autocorrelation factors (MAFs). The method described can be considered as an extension to EOF analysis that is specially tailored for change detection in spatial data since it first maximises differences in the data between two points in time and then maximises...
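
    The MAD transformation itself is short: canonical correlation analysis is run on the two acquisitions, and the MAD variates are the differences of corresponding canonical variates. A compact NumPy/SciPy sketch on simulated data (an illustration, not the authors' code):

        import numpy as np
        from scipy.linalg import eigh

        def mad_variates(X, Y):
            """MAD variates: differences of canonical variates of two images.
            X, Y: (n_pixels, n_bands) acquisitions of the same scene."""
            n = X.shape[0]
            Xc, Yc = X - X.mean(0), Y - Y.mean(0)
            Sxx, Syy = Xc.T @ Xc / (n - 1), Yc.T @ Yc / (n - 1)
            Sxy = Xc.T @ Yc / (n - 1)
            # squared canonical correlations (ascending) and X-side weights A
            rho2, A = eigh(Sxy @ np.linalg.solve(Syy, Sxy.T), Sxx)
            B = np.linalg.solve(Syy, Sxy.T) @ A
            B /= np.sqrt(np.maximum(rho2, 1e-12))        # unit-variance Y variates
            B *= np.sign(np.sum(A * (Sxy @ B), axis=0))  # pair the signs
            return Xc @ A - Yc @ B  # first column shows the most change

        rng = np.random.default_rng(1)
        X = rng.standard_normal((500, 3))
        Y = X + 0.1 * rng.standard_normal((500, 3))  # small simulated change
        print(np.var(mad_variates(X, Y), axis=0))    # approx 2 * (1 - rho) per variate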

  12. Company Management Improvement Using Electronic Tools for Analysis of Employees’ Feedback

    OpenAIRE

    Toteva, Kostadinka; Gourova, Elissaveta

    2010-01-01

    The paper considers the importance of proper human capital management for the success of organisations in the knowledge society. It provides a theoretical background for human resources management and the approaches applied for ensuring employees' commitment and motivation. On this basis, a new electronic tool is proposed as an extension of existing human resource management software, with the objective of collecting the objective and subjective feedback from employees needed for proper human res...

  13. Concurrent Software Testing : A Systematic Review and an Evaluation of Static Analysis Tools

    OpenAIRE

    Mamun, Md. Abdullah al; Khanam, Aklima

    2009-01-01

    Verification and validation is one of the most important concerns in the area of software engineering towards more reliable software development. Hence it is important to overcome the challenges of testing concurrent programs. The extensive use of concurrent systems warrants more attention to concurrent software testing. For testing concurrent software, automatic tool development is getting increased focus. The first part of this study presents a systematic review that aims to explore th...

  14. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays the internet acts as a huge database in which many websites, information sources and search engines are available. But because the data in webpages are unstructured or semi-structured, extracting relevant information has become a challenging task. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because it consists of many discovered patterns and contains a lot of noise and uncertainty. In this paper, web usage mining is analysed with the help of web log data using the web log analyzer tool "Deep Log Analyzer" to extract abstract information from a particular server, to infer user behaviour, and to develop an ontology describing the relations among the relevant parts of web usage mining.
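
    As an illustration of the preprocessing that any web-log analysis performs before behaviour can be inferred (the paper relies on the off-the-shelf Deep Log Analyzer rather than custom code), the following Python sketch tallies successful page requests from an Apache-style access log; the log format and field handling are assumptions:

        import re
        from collections import Counter

        # Apache Common Log Format: host ident user [time] "request" status bytes
        CLF = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                         r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+')

        def top_pages(logfile: str, n: int = 10):
            """Count successfully served URL paths in a web server access log."""
            hits = Counter()
            with open(logfile) as fh:
                for line in fh:
                    m = CLF.match(line)
                    if m and m["status"].startswith("2"):  # 2xx responses only
                        hits[m["path"]] += 1
            return hits.most_common(n)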

  15. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of the existing requirement engineering techniques. The study also summarizes the best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. The study also highlights the importance of security requirements: although they are part of the non-functional requirements, they are naturally considered fundamental to secure software development.

  16. Integration of management control tools. Analysis of a case study

    Directory of Open Access Journals (Sweden)

    Raúl Comas Rodríguez

    2015-09-01

    Full Text Available The objective of this article is to design and implement a procedure that integrates process-focused management control tools to improve efficiency and efficacy. An experimental study was carried out in which a procedure, based on the Balanced Scorecard, was defined that integrates process management into strategic planning and its evaluation. As results of this work, we define the key success factors associated with the four perspectives of the Balanced Scorecard, linked through cause-effect relations to obtain the strategic map that allows visualizing and communicating the enterprise strategy. The indicators evaluate the key success factors, integrating the processes with the assistance of software. The implementation of the procedure in a commercialization enterprise helped to integrate the process definitions into strategic planning. The alignment was evaluated, and the efficiency and efficacy indicators improved the company's performance.

  17. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  18. Silver Solid Amalgam Electrodes - Tools of Choice in DNA Analysis

    Czech Academy of Sciences Publication Activity Database

    Fadrná, Renata; Josypčuk, Bohdan; Fojta, Miroslav

    Galway: National University of Ireland, 2004, p. 119. [International Conference on Electroanalysis /10./. 06.06.2004-10.06.2004, Galway] Keywords: silver solid amalgam * electrodes * DNA analysis Subject RIV: CG - Electrochemistry

  19. Design and Analysis Tools for Deployable Solar Array Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Large, lightweight, deployable solar array structures have been identified as a key enabling technology for NASA with analysis and design of these structures being...

  20. Dynamic testing and analysis of extension-twist-coupled composite tubular spars

    Science.gov (United States)

    Lake, Renee C.; Izadpanah, Amir P.; Baucom, Robert M.

    The results from a study aimed at improving the dynamic and aerodynamic characteristics of composite rotor blades through the use of extension-twist elastic coupling are presented. A set of extension-twist-coupled composite tubular spars, representative of the primary load carrying structure within a helicopter rotor blade, was manufactured using four plies of woven graphite/epoxy cloth 'prepreg.' These spars were non-circular in cross section design and were therefore subject to warping deformations. Three cross-sectional geometries were developed: square, D-shape, and flattened ellipse. Results from free-free vibration tests of the spars were compared with results from normal modes and frequency analyses of companion shell-finite-element models developed in MSC/NASTRAN. Five global or 'non-shell' modes were identified within the 0-2000 Hz range for each spar. The frequencies and associated mode shapes for the D-shape spar were correlated with analytical results, showing agreement within 13.8 percent. Frequencies corresponding to the five global mode shapes for the square spar agreed within 9.5 percent of the analytical results. Five global modes were similarly identified for the elliptical spar and agreed within 4.9 percent of the respective analytical results.

  1. SOCIAL SENSOR: AN ANALYSIS TOOL FOR SOCIAL MEDIA

    OpenAIRE

    Chun-Hsiao Wu; Tsai-Yen Li

    2016-01-01

    In this research, we propose a new concept for social media analysis called Social Sensor, which is an innovative design attempting to transform the concept of a physical sensor in the real world into the world of social media with three design features: manageability, modularity, and reusability. The system is a case-centered design that allows analysts to select the type of social media (such as Twitter), the target data sets, and appropriate social sensors for analysis. By adopting paramet...

  2. Practical survival analysis tools for heterogeneous cohorts and informative censoring

    OpenAIRE

    Rowley, M; Garmo, H; Van Hemelrijck, M; Wulaningsih, W.; Grundmark, B.; Zethelius, B.; Hammar, N.; Walldius, G; M. Inoue; Holmberg, L; Coolen, A. C. C.

    2015-01-01

    In heterogeneous cohorts and those where censoring by non-primary risks is informative many conventional survival analysis methods are not applicable; the proportional hazards assumption is usually violated at population level and the observed crude hazard rates are no longer estimators of what they would have been in the absence of other risks. In this paper, we develop a fully Bayesian survival analysis to determine the probabilistically optimal description of a heterogeneous cohort and we ...

  3. Markov Chains as Tools for Jazz Improvisation Analysis

    OpenAIRE

    Franz, David Matthew

    1998-01-01

    This thesis describes an exploratory application of a statistical analysis and modeling technique (Markov chains) for the modeling of jazz improvisation with the intended subobjective of providing increased insight into an improviser's style and creativity through the postulation of quantitative measures of style and creativity based on the constructed Markovian analysis techniques. Using Visual Basic programming language, Markov chains of orders one to three are created using transcriptio...
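
    The thesis above used Visual Basic; purely as an illustration of the first-order version of the technique it describes, here is a minimal Python sketch (the note sequence is an invented stand-in for a transcribed solo):

```python
from collections import Counter, defaultdict
import random

def build_markov_chain(notes):
    """Estimate first-order transition probabilities from a note sequence."""
    transitions = defaultdict(Counter)
    for current, nxt in zip(notes, notes[1:]):
        transitions[current][nxt] += 1
    # Normalize counts into probabilities per source note
    return {
        note: {nxt: c / sum(counts.values()) for nxt, c in counts.items()}
        for note, counts in transitions.items()
    }

def generate(chain, start, length, seed=0):
    """Random walk over the chain to produce a new improvised line."""
    rng = random.Random(seed)
    line = [start]
    for _ in range(length - 1):
        probs = chain.get(line[-1])
        if not probs:
            break  # dead end: no observed transition from this note
        notes, weights = zip(*probs.items())
        line.append(rng.choices(notes, weights=weights)[0])
    return line

# Hypothetical transcription fragment
solo = ["C4", "E4", "G4", "E4", "C4", "D4", "E4", "G4", "A4", "G4", "E4"]
chain = build_markov_chain(solo)
print(generate(chain, "C4", 8))
```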

  4. GEPAS, a web-based tool for microarray data analysis and interpretation

    OpenAIRE

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has continuously been updated to keep pace with the state-of-the-art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the...

  5. Interfacing interactive data analysis tools with the GRID: the PPDG CS-11 activity

    International Nuclear Information System (INIS)

    For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's 'remote access' technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics Data Grid project (www.ppdg.net) has recently embarked on an effort to 'Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services'. The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments (ALICE, ATLAS, BaBar, D0, CMS, JLab, STAR, others welcome), to determine if existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed

  6. Applying observations of work activity in designing prototype data analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than on concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both for designing improved data analysis tools and for customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but also initiated in-depth discussions about users' work, tools, technology, and requirements.

  7. Evaluation of fatigue damage in nuclear power plants: evolution and new tools of analysis

    International Nuclear Information System (INIS)

    This paper presents new fatigue mechanisms requiring analysis, the tools developed for their evaluation, and the latest trends and studies currently under way in the nuclear field, which allow facilities to properly manage this degradation mechanism.

  8. Transportation routing analysis geographic information system - tragis, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental United States. This paper outlines some of the features available in this model. (authors)

  9. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model

  10. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    Science.gov (United States)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  11. SICOMAT : a system for SImulation and COntrol analysis of MAchine Tools

    OpenAIRE

    Gautier, Maxime; Pham, Minh Tu; Khalil, Wisama; Lemoine, Philippe; Poignet, Philippe

    2001-01-01

    This paper presents a software package for the simulation and control analysis of machine tool axes. This package, called SICOMAT (SImulation and COntrol analysis of MAchine Tools), provides a large variety of toolboxes to analyze the behavior and the control of the machine. The software takes into account several elements, such as the flexibility of bodies, the interaction between several axes, the effect of numerical control, and the ability to reduce models.

  12. Development of a task analysis tool to facilitate user interface design

    Science.gov (United States)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  13. Development of web based offsite consequence analysis program and periodic risk assessments for ILRT extension

    International Nuclear Information System (INIS)

    PSA is generally used to assess the relative risks posed by various types of operations and facilities, to understand the relative importance of the risk contributors, and to obtain insights on potential safety improvements. There is usually no pass/fail criterion that must be met to allow operations to continue. A major goal of PSA is to gain insights that can be used either to minimize the probability of accidents or to minimize the consequences of accidents that might occur. According to NUREG 1493, which applies the PSA concept to ILRT interval extension, the ILRT interval was extended from three tests in ten years to one test in ten years at almost all US nuclear power plants. In addition, NUREG 1493 stated that there is an imperceptible increase in risk associated with ILRT intervals of up to twenty years. Since then, many licensees began to submit requests for one-time ILRT interval extensions to 15 years. To permit permanent 15-year ILRT intervals under the existing NRC guidance, it was necessary to develop a standard method for supporting the risk impact assessment. Thus, NEI carried out a project to develop a generic methodology for the risk impact assessment of ILRT interval extensions to 15 years using current performance data and risk-informed guidance. The risk impact assessment is generally performed with the MACCS II code, which was included in an international collaborative effort to compare predictions obtained from seven consequence codes: ARANO (Finland), CONDOR (UK), COSYMA (EU), LENA (Sweden), MACCS (United States), MECA2 (Spain), and OSCAAR (Japan). However, using the MACCS II code requires considerable manpower and effort, especially for collecting the raw data for the input files and for converting the raw data format. Accordingly, the web-based OSCAP, based on the MACCS II code, was developed with a focus on automatic processing algorithms for handling the main input files with meteorological data, population distribution data and source term data. It is considered that the web

  14. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method which can be used by researchers to obtain valuable information for decision makers who need to solve a marketing problem a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis, and the ability of Principal Component Analysis (PCA) to reduce a number of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables are used.
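
    As a hedged illustration of the process the abstract walks through, the sketch below reduces simulated collinear survey ratings (not the paper's data) to a few uncorrelated components with scikit-learn:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Simulated marketing survey: 200 respondents rate 6 collinear items,
# driven by 2 underlying latent factors plus noise (hypothetical data)
latent = rng.normal(size=(200, 2))
X = latent @ rng.normal(size=(2, 6)) + rng.normal(scale=0.3, size=(200, 6))

# Standardize first, since PCA is sensitive to variable scale
X_std = StandardScaler().fit_transform(X)

pca = PCA().fit(X_std)
print("Explained variance ratios:", pca.explained_variance_ratio_.round(3))

# Kaiser-style rule: retain components with eigenvalue > 1
n_keep = int((pca.explained_variance_ > 1).sum())
scores = pca.transform(X_std)[:, :n_keep]   # uncorrelated component scores
print(f"{X.shape[1]} collinear variables reduced to {n_keep} components, "
      f"scores shape: {scores.shape}")
```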

  15. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  16. Uraninite chemistry as forensic tool for provenance analysis

    International Nuclear Information System (INIS)

    Highlights: • Uraninite chemistry can be used as fingerprint and provenance tool. • U/Th ratio and total REE contents are good indicators of crystallisation temperature. • REE fractionation is strongly dependent on uraninite genesis. • Application to uraninite from the Witwatersrand Basin highlights its detrital nature. • Witwatersrand uraninite is derived from a variety of magmatic sources. - Abstract: Electron microprobe and laser ablation-inductively coupled plasma mass spectrometric (LA-ICPMS) analyses were carried out on individual uraninite grains from several localities worldwide, representing a variety of different U-deposit types ranging in age from Mesoarchaean to the Mesozoic. For the first time, concentration data on a comprehensive set of minor/trace elements in uraninite are presented, i.e. LA-ICPMS concentration data for Th, Si, Al, Fe, Mn, Ca, Mg, P, Ti, V, Cr, Co, Ni, Pb, Zn, As, rare earth elements (REE), Y, Zr, Nb, Mo, Ag, Ta, W, Bi, and Au. Most of these elements could be detected in significant quantities in many of the studied examples. The results obtained in this study, supplemented by previously published data on major element and REE concentrations, reveal systematic differences in uraninite composition between genetically different deposit types and also, for a given genetic type, between different locations. Low-temperature hydrothermal uraninite is marked by U/Th >1000, whereas high-temperature metamorphic and magmatic (granitic, pegmatitic) uraninite has U/Th <100. Our new data also confirm previous observations that low-temperature, hydrothermal uraninite has low total REE contents (<1 wt%) whereas higher temperature uraninite can contain as much as several percent total REE. Genetically different uraninite types can be further identified by means of different REE fractionation patterns. Systematic differences between primary uraninite from different localities could be also noted with respect to the abundances of especially

  17. SCALE 5: Powerful new criticality safety analysis tools

    International Nuclear Information System (INIS)

    Version 5 of the SCALE computer software system developed at Oak Ridge National Laboratory, scheduled for release in December 2003, contains several significant new modules and sequences for criticality safety analysis and marks the most important update to SCALE in more than a decade. This paper highlights the capabilities of these new modules and sequences, including continuous energy flux spectra for processing multigroup problem-dependent cross sections; one- and three-dimensional sensitivity and uncertainty analyses for criticality safety evaluations; two-dimensional flexible mesh discrete ordinates code; automated burnup-credit analysis sequence; and one-dimensional material distribution optimization for criticality safety. (author)

  18. AGRARIAN QUESTION, SOCIAL CONFLICTS IN THE COUNTRYSIDE AND RURAL EXTENSION: AN ANALYSIS OF CONTEMPORARY RURAL REALITY

    Directory of Open Access Journals (Sweden)

    VITOR MACHADO

    2010-06-01

    Full Text Available This article is a brief reflection on the agrarian question in Brazil, focusing mainly on the social impacts caused by the modernization of agriculture deployed by the military in the last century, in the early 60s, known as the Green Revolution. This development policy, adopted by the military, favored only large producers and rural entrepreneurs and resulted in numerous conflicts in the countryside, which extended over several regions of Brazil. In this context, the article discusses the policies adopted by the national government to contain such conflicts, among them the creation of the Rural Worker Statute, the occupation of the Amazon, and rural extension policies. At the end we emphasize the importance of the main public policies for family farming, such as Social Security and PRONAF (National Program to Strengthen Family Agriculture), which are the result of the struggle of social movements and labor from rural areas.

  19. The Hydrograph Analyst, an Arcview GIS Extension That Integrates Point, Spatial, and Temporal Data Provides A Graphical User Interface for Hydrograph Analysis

    International Nuclear Information System (INIS)

    The Hydrograph Analyst (HA) is an ArcView GIS 3.2 extension developed by the authors to analyze hydrographs from a network of ground-water wells and springs in a regional ground-water flow model. ArcView GIS integrates geographic, hydrologic, and descriptive information and provides the base functionality needed for hydrograph analysis. The HA extends ArcView's base functionality by automating data integration procedures and by adding capabilities to visualize and analyze hydrologic data. Data integration procedures were automated by adding functionality to the View document's Document Graphical User Interface (DocGUI). A menu allows the user to query a relational database and select sites which are displayed as a point theme in a View document. An "Identify One to Many" tool is provided within the View DocGUI to retrieve all hydrologic information for a selected site and display it in a simple and concise tabular format. For example, the display could contain various records from many tables storing data for one site. Another HA menu allows the user to generate a hydrograph for sites selected from the point theme. Hydrographs generated by the HA are added as hydrograph documents and accessed by the user with the Hydrograph DocGUI, which contains tools and buttons for hydrograph analysis. The Hydrograph DocGUI has a "Select By Polygon" tool used for isolating particular points on the hydrograph inside a user-drawn polygon, or the user can isolate the same points by constructing a logical expression with the ArcView GIS "Query Builder" dialog that is also accessible in the Hydrograph DocGUI. Other buttons can be selected to alter the query applied to the active hydrograph. The selected points on the active hydrograph can be attributed (or flagged) individually or as a group using the "Flag" tool found on the Hydrograph DocGUI. The "Flag" tool activates a dialog box that prompts the user to select an attribute and "methods" or "conditions" that qualify

  20. Collaboration as a Tool to Improve Career and Technical Education: A Qualitative Study of Successful Collaboration among Extension Agents and Agricultural Science Teachers

    Science.gov (United States)

    Murphrey, Theresa Pesl; Miller, Kimberley A.; Harlin, Julie; Rayfield, John

    2011-01-01

    Collaboration among Extension agents and agricultural science teachers has the potential to increase the reach of both organizations to serve clientele in obtaining critical skills and knowledge important to Career and Technical Education. However, successful collaboration requires that barriers be minimized and aspects of facilitation be…

  1. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models

    Science.gov (United States)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
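
    MetaboTools itself is a Matlab toolbox built on the COBRA framework; the sketch below illustrates only the underlying constraint-based idea, flux balance analysis, on an invented three-reaction network, using scipy rather than the toolbox's own functions:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1 (uptake) -> A, R2: A -> B, R3: B -> biomass (objective)
# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions R1..R3)
S = np.array([
    [1, -1,  0],   # A produced by R1, consumed by R2
    [0,  1, -1],   # B produced by R2, consumed by R3
])
bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake flux capped at 10 units

# Maximize flux through R3 <=> minimize -v3, subject to steady state S v = 0
c = np.array([0, 0, -1])
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("Optimal fluxes v1..v3:", res.x)   # expected: [10, 10, 10]
```

    Extracellular metabolomic measurements enter this kind of model as tightened uptake and secretion bounds, which is the integration step the protocol describes.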

  2. Decoding astrocyte heterogeneity: New tools for clonal analysis.

    Science.gov (United States)

    Bribián, A; Figueres-Oñate, M; Martín-López, E; López-Mascaraque, L

    2016-05-26

    The importance of astrocyte heterogeneity emerged as a hot topic in neuroscience over the last decades, as the development of new methodologies made it possible to demonstrate large differences in morphological, neurochemical and physiological features between astrocytes. However, although knowledge about the biology of astrocytes is increasing rapidly, one important characteristic remained unexplored until recent years: the relationship between astrocyte lineages and cell heterogeneity. To fill this gap, a new method called StarTrack was recently developed, a powerful genetic tool that allows tracking of astrocyte lineages forming cell clones. Using StarTrack, a single astrocyte progenitor and its progeny can be specifically labeled from its generation, during embryonic development, to its final fate in the adult brain. Because of this specific labeling, astrocyte clones, exhibiting heterogeneous morphologies and features, can be easily analyzed in relation to their ontogenetic origin. This review summarizes how astrocyte heterogeneity can be decoded by studying the embryonic development of astrocyte lineages and their clonal relationships. Finally, we discuss some of the challenges and opportunities emerging in this exciting area of investigation. PMID:25917835

  3. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  4. Measures of radioactivity: a tool for understanding statistical data analysis

    OpenAIRE

    Montalbano, Vera; Quattrini, Sonia

    2012-01-01

    A learning path on radioactivity in the last class of high school is presented. An introduction to radioactivity and nuclear phenomenology is followed by measurements of natural radioactivity. Background and weak sources are monitored for days or weeks. The data are analyzed in order to understand the importance of statistical analysis in modern physics.
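
    A sketch of the kind of statistical analysis such a learning path builds on, with simulated Poisson counts standing in for the background monitoring described above:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated background: one-minute counts over many sessions, true rate 12 cpm
counts = rng.poisson(lam=12.0, size=2000)

mean = counts.mean()
std = counts.std(ddof=1)
print(f"mean = {mean:.2f} cpm, sample std = {std:.2f}")

# For a Poisson process the variance equals the mean, so std ~ sqrt(mean);
# checking this is a classic classroom exercise with counting data.
print(f"sqrt(mean) = {np.sqrt(mean):.2f}  (should be close to the std)")

# A weak source is detectable when its signal exceeds background fluctuations;
# a common criterion is a signal exceeding 3 sigma of the background.
sigma_bg = np.sqrt(mean)
print(f"3-sigma detection threshold above background: {3 * sigma_bg:.1f} counts/min")
```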

  5. GATA: a graphic alignment tool for comparative sequence analysis

    OpenAIRE

    Nix David A; Eisen Michael B

    2005-01-01

    Abstract Background Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collineari...

  6. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  7. ProbFAST: Probabilistic Functional Analysis System Tool

    Directory of Open Access Journals (Sweden)

    Oliveira Thiago YK

    2010-03-01

    Full Text Available Abstract Background The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression initially designed for paired analysis. Results We proposed a web platform named ProbFAST for MDE analysis which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulated study revealed that our method gives better performance when compared to other approaches, and when applied to public expression data, we demonstrated its flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off in relation to functional information during the evolution of a tumor or tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.

  8. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
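
    As a hedged sketch of the decomposition idea (using a plain 2D Fourier transform rather than Zernike moments, on a synthetic strain field), a full-field map collapses to a handful of descriptors on which experiment and model can be compared:

```python
import numpy as np

# Synthetic full-field "strain map": smooth field plus measurement noise
ny, nx = 128, 128
y, x = np.mgrid[0:ny, 0:nx] / 128.0
noise = 0.05 * np.random.default_rng(0).normal(size=(ny, nx))
field = np.sin(2 * np.pi * x) * np.cos(np.pi * y) + noise

# Decompose with the 2D FFT and keep only the lowest-order coefficients
F = np.fft.fft2(field)
k = 4                                  # retain a k x k corner of coefficients
descriptors = F[:k, :k].flatten()      # ~16 complex descriptors vs 16384 pixels
print(f"{field.size} pixels -> {descriptors.size} descriptors")

# Two fields (e.g. experiment vs model) are then compared descriptor-wise
model = np.sin(2 * np.pi * x) * np.cos(np.pi * y)
d_model = np.fft.fft2(model)[:k, :k].flatten()
residual = np.abs(descriptors - d_model) / (np.abs(d_model) + 1e-9)
print("max relative descriptor difference:", residual.max().round(3))
```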

  9. Economic modeling for life extension decision making

    International Nuclear Information System (INIS)

    This paper presents a methodology for the economic and financial analysis of nuclear plant life extension under uncertainty and demonstrates its use in a case analysis. While the economic and financial evaluation of life extension does not require new analytical tools, such studies should be based on the following three premises. First, the methodology should examine effects at the level of the company or utility system, because the most important economic implications of life extension relate to the altered generation system expansion plan. Second, it should focus on the implications of uncertainty in order to understand the factors that most affect life extension benefits and identify risk management efforts. Third, the methodology should address multiple objectives, at a minimum, both economic and financial objectives
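
    A minimal Monte Carlo sketch of the kind of uncertainty analysis described, with all cost, price and plant figures invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000                                       # Monte Carlo trials

# Hypothetical uncertain inputs for a 20-year life extension
refurb_cost = rng.normal(500e6, 80e6, n)         # upfront refurbishment ($)
margin = rng.normal(25.0, 6.0, (n, 20))          # $/MWh operating margin per year
output = 8e6                                     # MWh/yr (roughly 1 GW at high capacity factor)
rate = 0.07                                      # discount rate

years = np.arange(1, 21)
discount = (1 + rate) ** -years
npv = (margin * output * discount).sum(axis=1) - refurb_cost

print(f"mean NPV: ${npv.mean() / 1e9:.2f}B")
print(f"P(NPV < 0): {(npv < 0).mean():.1%}")     # downside risk for the financial objective
```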

  10. Design and Analysis Tools for Concurrent Blackboard Systems

    Science.gov (United States)

    McManus, John W.

    1991-01-01

    A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them. " The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.
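
    A minimal Python sketch of the blackboard pattern as described above: knowledge sources read from and write to a shared store, and an opportunistic control loop lets the blackboard state determine which sources activate (the toy text-processing problem is invented):

```python
class KnowledgeSource:
    """A specialist that fires when its precondition holds on the blackboard."""
    def __init__(self, name, precondition, action):
        self.name, self.precondition, self.action = name, precondition, action

    def can_contribute(self, bb):
        return self.precondition(bb)

    def contribute(self, bb):
        self.action(bb)

blackboard = {"raw": "  Hello Blackboard  "}

sources = [
    KnowledgeSource("trimmer",
                    lambda bb: "raw" in bb and "trimmed" not in bb,
                    lambda bb: bb.__setitem__("trimmed", bb["raw"].strip())),
    KnowledgeSource("tokenizer",
                    lambda bb: "trimmed" in bb and "tokens" not in bb,
                    lambda bb: bb.__setitem__("tokens", bb["trimmed"].split())),
    KnowledgeSource("counter",
                    lambda bb: "tokens" in bb and "count" not in bb,
                    lambda bb: bb.__setitem__("count", len(bb["tokens"]))),
]

# Opportunistic control loop: any source that can contribute, does;
# the evolving blackboard state determines which sources activate next.
progress = True
while progress:
    progress = False
    for ks in sources:
        if ks.can_contribute(blackboard):
            ks.contribute(blackboard)
            progress = True

print(blackboard)  # trimmed text, tokens, and count written by separate sources
```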

  11. Automated simultaneous analysis phylogenetics (ASAP): an enabling tool for phylogenomics

    Directory of Open Access Journals (Sweden)

    Lee Ernest K

    2008-02-01

    Full Text Available Abstract Background The availability of sequences from whole genomes to reconstruct the tree of life has the potential to enable the development of phylogenomic hypotheses in ways that have not previously been possible. A significant bottleneck in the analysis of genomic-scale views of the tree of life is the time required for manual curation of genomic data into multi-gene phylogenetic matrices. Results To keep pace with the exponentially growing volume of molecular data in the genomic era, we have developed an automated technique, ASAP (Automated Simultaneous Analysis Phylogenetics), to assemble these multi-gene/multi-species matrices and to evaluate the significance of individual genes within the context of a given phylogenetic hypothesis. Conclusion Applications of ASAP may enable scientists to re-evaluate species relationships and to develop new phylogenomic hypotheses based on genome-scale data.

  12. GAVO Tools for the Analysis of Stars and Nebulae

    CERN Document Server

    Rauch, Thomas

    2007-01-01

    Within the framework of the German Astrophysical Virtual Observatory (GAVO), we provide synthetic spectra, simulation software for the calculation of NLTE model atmospheres, as well as the necessary atomic data. This will enable a VO user to directly compare observed and model-atmosphere spectra on three levels: The easiest and fastest way is the use of our pre-calculated flux-table grid, in which one may inter- and extrapolate. For a more precise analysis of an observation, the VO user may improve the fit to the observation by calculating individual model atmospheres with fine-tuned photospheric parameters via the WWW interface TMAW. The more experienced VO user may create their own atomic-data files for a more detailed analysis and calculate model atmospheres and flux tables with these.

  13. SYSTID - A flexible tool for the analysis of communication systems.

    Science.gov (United States)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  14. Application of Multivariate Analysis Tools to Industrial Scale Fermentation Data

    OpenAIRE

    Mears, Lisa; Nørregård, Rasmus; Stocks, Stuart M.; Albæk, Mads O.; Sin, Gürkan; Gernaey, Krist; Villez, Kris

    2015-01-01

    The analysis of batch process data can provide insight into the process operation, and there is a vast amount of historical data available for data mining. Empirical modelling utilising this data is desirable where there is a lack of understanding regarding the underlying process (Formenti et al. 2014). This may be the case for fed-batch fermentation processes, where mechanistic modelling is challenging due to non-linear dynamics, and non-steady state operation. There is also a lack of sensor...

  15. BiologicalNetworks: visualization and analysis tool for systems biology

    OpenAIRE

    Baitaluk, Michael; Sedova, Mayya; Ray, Animesh; Gupta, Amarnath

    2006-01-01

    Systems level investigation of genomic scale information requires the development of truly integrated databases dealing with heterogeneous data, which can be queried for simple properties of genes or other database objects as well as for complex network level properties, for the analysis and modelling of complex biological processes. Towards that goal, we recently constructed PathSys, a data integration platform for systems biology, which provides dynamic integration over a diverse set of dat...

  16. Soldier Station: A Tool for Dismounted Infantry Analysis

    OpenAIRE

    Pratt, Shirley; Ohman, David; Brown, Steve; Galloway, John; Pratt, David

    1997-01-01

    Soldier Station is a networked, human-in-the-loop, virtual dismounted infantryman (DI) simulator with underlying constructive model algorithms for movement, detection, engagement, and damage assessment. It is being developed by TRADOC Analysis Center - White Sands Missile Range, New Mexico, to analyze DI issues pertaining to situational awareness, command and control, and tactics, techniques and procedures. It is unique in its design to integrate virtual and constructive simulation...

  17. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    Energy Technology Data Exchange (ETDEWEB)

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
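
    A sketch of the levelized-cost calculation at the core of such a model, using the standard discounted-cost-over-discounted-output formula with invented plant parameters (not NETL figures):

```python
def lcoe(capex, om_per_year, fuel_per_mwh, mwh_per_year, rate, years):
    """Levelized cost of electricity: discounted lifetime costs / discounted output."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capex + sum((om_per_year + fuel_per_mwh * mwh_per_year) * d for d in disc)
    energy = sum(mwh_per_year * d for d in disc)
    return costs / energy   # $/MWh

# Hypothetical NGCC-like plant; sensitivity analysis amounts to re-running
# this function while varying one input at a time.
cost = lcoe(capex=700e6, om_per_year=20e6, fuel_per_mwh=30.0,
            mwh_per_year=4.2e6, rate=0.08, years=30)
print(f"LCOE: {cost:.1f} $/MWh")
```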

  18. DYNAMICS ANALYSIS OF SPECIAL STRUCTURE OF MILLING-HEAD MACHINE TOOL

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The milling-head machine tool is a sophisticated, high-quality machine tool whose spindle system is made up of a special multi-element structure. Two special mechanical configurations degrade the cutting performance of the machine tool. One is the milling-head spindle, which is supported on two sets of complex bearings. The mechanical dynamic rigidity of the milling-head structure is investigated on a digital prototype using finite element analysis (FEA) and modal synthesis analysis (MSA) to identify the weak structures. The other is the ram structure on which the milling head hangs. This structure is analyzed to obtain its dynamic cutting performance at different ram extension positions. The analysis results for the spindle and ram are used to improve the mechanical configurations and structure in the design. The machine tool built with the modified structure shows better dynamic rigidity than before.

  19. The Effects of Methylphenidate on A Functional Analysis of Disruptive Behavior: A Replication and Extension

    OpenAIRE

    DiCesare, Anthony; McAdam, David B; Toner, Amy; Varrell, James

    2005-01-01

    In the present investigation, a functional analysis of the disruptive behavior of an 18-year-old man who had been diagnosed with attention deficit hyperactivity disorder and moderate mental retardation was conducted, both when he was taking methylphenidate and when he was not taking the medication. The results of this functional analysis demonstrated that the participant's disruptive behaviors were reinforced by access to attention only when he was not taking methylphenidate.

  20. Gender Analysis and Approaches to Gender Responsive Extension to Promote Quality Protein Maize (QPM) in Ethiopia

    OpenAIRE

    Gebreselassie, Kidist; De Groote, Hugo; Friesen, Dennis

    2013-01-01

    Improved technologies are important in improving agricultural productivity and food security. The NuME project aims at improving food security among rural households through the dissemination of quality protein maize varieties. However, the project has not yet conducted a gender analysis, a gap this paper tries to address. The analysis is conducted based on literature review, key informant interviews, focus group discussions and a gender audit of the implementation partners conducted in two kebeles...

  1. Time-frequency tools of signal processing for EISCAT data analysis

    Directory of Open Access Journals (Sweden)

    J. Lilensten

    Full Text Available We demonstrate the usefulness of some signal-processing tools for EISCAT data analysis. These tools are somewhat less classical than the familiar periodogram (the squared modulus of the Fourier transform) and are therefore not as commonly used in our community. The first is a stationary analysis, Thomson's estimate of the power spectrum. The other two belong to time-frequency analysis: the short-time Fourier transform with the spectrogram, and the wavelet analysis via the scalogram. Because of the highly non-stationary character of our geophysical signals, the latter two tools are better suited for this analysis. Their results are compared on both a synthetic signal and EISCAT ion-velocity measurements. We show that they help to discriminate patterns such as gravity waves from noise.
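
    A sketch of the short-time Fourier view named above, applied to a synthetic non-stationary signal (a noisy chirp standing in for an ion-velocity series) with scipy; a wavelet scalogram could be computed analogously with, e.g., PyWavelets:

```python
import numpy as np
from scipy.signal import spectrogram, chirp

fs = 1000.0
t = np.arange(0, 4, 1 / fs)
# Non-stationary test signal: frequency sweeps from 20 Hz to 120 Hz, plus noise
x = chirp(t, f0=20, f1=120, t1=4, method="linear")
x += 0.2 * np.random.default_rng(0).normal(size=t.size)

# Short-time Fourier transform -> spectrogram (power vs time and frequency)
f, tt, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)

# The dominant frequency per time slice tracks the chirp, which a single
# whole-record periodogram would smear out
ridge = f[Sxx.argmax(axis=0)]
print(f"Dominant frequency: {ridge[0]:.0f} Hz (start) -> {ridge[-1]:.0f} Hz (end)")
```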

  2. Extension of monodimensional fuel performance codes to finite strain analysis using a Lagrangian logarithmic strain framework

    International Nuclear Information System (INIS)

    Highlights: • A simple extension of standard monodimensional fuel performance codes to finite strain is proposed. • Efficiency and reliability are demonstrated. • The logarithmic strain framework proposed by Miehe et al. is introduced and discussed. - Abstract: This paper shows how the Lagrangian logarithmic strain framework proposed by Miehe et al. can be used to extend monodimensional fuel performance codes, written in the framework of the infinitesimal strain theory, to be able to cope with large deformation of the cladding, such as the ones observed in reactivity initiated accidents (RIA) or loss-of-coolant accidents (LOCA). We demonstrate that the changes only concern the mechanical behaviour integration step by a straightforward modification of the strains (inputs) and the stress (result). The proposed procedure has been implemented in the open-source MFront code generator developed within the PLEIADES platform to handle mechanical behaviours. Using the Alcyone performance code, we apply this procedure to a simulation case proposed within the framework of a recent benchmark on fuel performance codes by the OECD/NEA
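
    Purely as an illustration of the logarithmic strain measure underlying the framework (not the MFront implementation itself), the Lagrangian Hencky strain can be computed from a deformation gradient F as E = ½ log(FᵀF):

```python
import numpy as np
from scipy.linalg import logm

def hencky_strain(F):
    """Lagrangian logarithmic (Hencky) strain E = 0.5 * log(F^T F)."""
    C = F.T @ F                  # right Cauchy-Green tensor
    return 0.5 * logm(C).real    # matrix logarithm; real part for symmetric C

# Uniaxial stretch of 1.5 with transverse contraction (hypothetical values)
F = np.diag([1.5, 0.85, 0.85])
E = hencky_strain(F)
print(np.round(E, 4))

# Diagonal terms equal ln(stretch); for small strains they reduce to the
# infinitesimal strains, which is why the extension of a small-strain code
# reduces to transforming the strain inputs and the stress result.
print("ln(1.5) =", round(np.log(1.5), 4))
```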

  3. An enhanced single base extension technique for the analysis of complex viral populations.

    Directory of Open Access Journals (Sweden)

    Dale R Webster

    Full Text Available Many techniques for the study of complex populations provide either specific information on a small number of variants or general information on the entire population. Here we describe a powerful new technique for elucidating mutation frequencies at each genomic position in a complex population. This single base extension (SBE)-based microarray platform was designed and optimized using poliovirus as the target genotype, but can be easily adapted to assay populations derived from any organism. The sensitivity of the method was demonstrated by accurate and consistent readouts from a controlled population of mutant genotypes. We subsequently deployed the technique to investigate the effects of the nucleotide analog ribavirin on a typical poliovirus population through two rounds of passage. Our results show that this economical platform can be used to investigate dynamic changes occurring at frequencies below 1% within a complex nucleic acid population. Given that many key aspects of the study and treatment of disease are intimately linked to population-level genomic diversity, our SBE-based technique provides a scalable and cost-effective complement to both traditional and next generation sequencing methodologies.
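
    A sketch of the per-position frequency readout such a platform produces, with invented base-call counts; the point is that minor-variant frequencies below 1% remain resolvable:

```python
import numpy as np

bases = np.array(list("ACGT"))
# Hypothetical base-call counts at 5 genomic positions (rows: positions)
counts = np.array([
    [9902,   60,   20,   18],
    [  45, 9800,  105,   50],
    [  12,    8, 9950,   30],
    [9700,  150,   90,   60],
    [  25,   40,   15, 9920],
], dtype=float)

# Normalize each row to per-position base frequencies
freqs = counts / counts.sum(axis=1, keepdims=True)
consensus = freqs.argmax(axis=1)

for i, row in enumerate(freqs):
    minor = 1.0 - row[consensus[i]]   # total frequency of non-consensus bases
    print(f"pos {i + 1}: consensus {bases[consensus[i]]}, "
          f"minor-variant frequency {minor:.3%}")
```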

  4. Whole Genome Analysis of 132 Clinical Saccharomyces cerevisiae Strains Reveals Extensive Ploidy Variation

    Science.gov (United States)

    Zhu, Yuan O.; Sherlock, Gavin; Petrov, Dmitri A.

    2016-01-01

    Budding yeast has undergone several independent transitions from commercial to clinical lifestyles. The frequency of such transitions suggests that clinical yeast strains are derived from environmentally available yeast populations, including commercial sources. However, despite their important role in adaptive evolution, the prevalence of polyploidy and aneuploidy has not been extensively analyzed in clinical strains. In this study, we have looked for patterns governing the transition to clinical invasion in the largest screen of clinical yeast isolates to date. In particular, we have focused on the hypothesis that ploidy changes have influenced adaptive processes. We sequenced 144 yeast strains, 132 of which are clinical isolates. We found pervasive large-scale genomic variation in both overall ploidy (34% of strains identified as 3n/4n) and individual chromosomal copy numbers (36% of strains identified as aneuploid). We also found evidence for the highly dynamic nature of yeast genomes, with 35 strains showing partial chromosomal copy number changes and eight strains showing multiple independent chromosomal events. Intriguingly, a lineage identified as baker's/commercial-derived, with a unique damaging mutation in NDC80, was particularly prone to polyploidy, with 83% of its members being triploid or tetraploid. Polyploidy was in turn associated with a >2× increase in aneuploidy rates as compared to other lineages. This dataset provides a rich source of information on the genomics of clinical yeast strains and highlights the potential importance of large-scale genomic copy variation in yeast adaptation. PMID:27317778

  5. Extension of monodimensional fuel performance codes to finite strain analysis using a Lagrangian logarithmic strain framework

    Energy Technology Data Exchange (ETDEWEB)

    Helfer, Thomas

    2015-07-15

    Highlights: • A simple extension of standard monodimensional fuel performance codes to finite strain is proposed. • Efficiency and reliability are demonstrated. • The logarithmic strain framework proposed by Miehe et al. is introduced and discussed. - Abstract: This paper shows how the Lagrangian logarithmic strain framework proposed by Miehe et al. can be used to extend monodimensional fuel performance codes, written in the framework of the infinitesimal strain theory, to be able to cope with large deformation of the cladding, such as the ones observed in reactivity initiated accidents (RIA) or loss-of-coolant accidents (LOCA). We demonstrate that the changes only concern the mechanical behaviour integration step by a straightforward modification of the strains (inputs) and the stress (result). The proposed procedure has been implemented in the open-source MFront code generator developed within the PLEIADES platform to handle mechanical behaviours. Using the Alcyone performance code, we apply this procedure to a simulation case proposed within the framework of a recent benchmark on fuel performance codes by the OECD/NEA.

  6. Materiality, Description and Comparison as Tools for Cultural Difference Analysis

    OpenAIRE

    Zimmermann, Basile

    2013-01-01

    Working in a Chinese studies department based in Europe, I am often confronted with the challenges not only of working with cultural difference, but also of working with the concept of “culture” in itself – one of the most famously difficult concepts in the social sciences and humanities. Further, recent socioeconomic changes in China—and the new media dynamics of the “Chinese Internet”—have produced new situations requiring socio-cultural analysis, but lacking a clear theoretical or methodol...

  7. Stakeholder analysis: a useful tool for biobank planning.

    Science.gov (United States)

    Bjugn, Roger; Casati, Bettina

    2012-06-01

    Stakeholders are individuals, groups, or organizations that are affected by or can affect a particular action undertaken by others. Biobanks relate to a number of donors, researchers, research institutions, regulatory bodies, funders, and others. These stakeholders can potentially have a strong influence upon the organization and operation of a biobank. A sound strategy for stakeholder engagement is considered essential in project management and organization theory. In this article, we review relevant stakeholder theory and demonstrate how a stakeholder analysis was undertaken in the early stage of a planned research biobank at a public hospital in Norway. PMID:24835062

  8. SCit: web tools for protein side chain conformation analysis.

    Science.gov (United States)

    Gautier, R; Camproux, A-C; Tufféry, P

    2004-07-01

    SCit is a web server providing services for protein side chain conformation analysis and side chain positioning. Specific services use the dependence of the side chain conformations on the local backbone conformation, which is described using a structural alphabet that describes the conformation of fragments of four-residue length in a limited library of structural prototypes. Based on this concept, SCit uses sets of rotameric conformations dependent on the local backbone conformation of each protein for side chain positioning and the identification of side chains with unlikely conformations. The SCit web server is accessible at http://bioserv.rpbs.jussieu.fr/SCit. PMID:15215438

  9. Comparative analysis on arthroscopic sutures of large and extensive rotator cuff injuries in relation to the degree of osteopenia

    Science.gov (United States)

    Almeida, Alexandre; Atti, Vinícius; Agostini, Daniel Cecconi; Valin, Márcio Rangel; de Almeida, Nayvaldo Couto; Agostini, Ana Paula

    2015-01-01

    Objective To analyze the results from arthroscopic suturing of large and extensive rotator cuff injuries, according to the patient's degree of osteopenia. Method 138 patients who underwent arthroscopic suturing of large and extensive rotator cuff injuries between 2003 and 2011 were analyzed. Those operated from October 2008 onwards formed a prospective cohort, while the remainder formed a retrospective cohort. Also from October 2008 onwards, bone densitometry evaluation was requested at the time of the surgical treatment. For the patients operated before this date, densitometry examinations performed up to two years before or after the surgical treatment were investigated. The patients were divided into three groups. Those with osteoporosis formed group 1 (n = 16); those with osteopenia, group 2 (n = 33); and normal individuals, group 3 (n = 55). Results In analyzing the University of California at Los Angeles (UCLA) scores of group 3 and comparing them with group 2, no statistically significant difference was seen (p = 0.070). Analysis on group 3 in comparison with group 1 showed a statistically significant difference (p = 0.027). Conclusion The results from arthroscopic suturing of large and extensive rotator cuff injuries seem to be influenced by the patient's bone mineral density, as assessed using bone densitometry. PMID:26229899

  10. Comparative analysis on arthroscopic sutures of large and extensive rotator cuff injuries in relation to the degree of osteopenia

    Directory of Open Access Journals (Sweden)

    Alexandre Almeida

    2015-02-01

    Full Text Available OBJECTIVE: To analyze the results from arthroscopic suturing of large and extensive rotator cuff injuries, according to the patient's degree of osteopenia. METHOD: 138 patients who underwent arthroscopic suturing of large and extensive rotator cuff injuries between 2003 and 2011 were analyzed. Those operated from October 2008 onwards formed a prospective cohort, while the remainder formed a retrospective cohort. Also from October 2008 onwards, bone densitometry evaluation was requested at the time of the surgical treatment. For the patients operated before this date, densitometry examinations performed up to two years before or after the surgical treatment were investigated. The patients were divided into three groups. Those with osteoporosis formed group 1 (n = 16); those with osteopenia, group 2 (n = 33); and normal individuals, group 3 (n = 55). RESULTS: In analyzing the University of California at Los Angeles (UCLA) scores of group 3 and comparing them with group 2, no statistically significant difference was seen (p = 0.070). Analysis on group 3 in comparison with group 1 showed a statistically significant difference (p = 0.027). CONCLUSION: The results from arthroscopic suturing of large and extensive rotator cuff injuries seem to be influenced by the patient's bone mineral density, as assessed using bone densitometry.

  11. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for developing the LMM into a tool that engineers can use on a regular basis are discussed. After re-structuring the LMM subroutines for parallel execution on multiple central processing units (CPUs), the LMM software tool for the assessment of design limits in plasticity is implemented as an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool, including practical application and verification, is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multi-CPU solution is conducted. • The software tool is implemented as an Abaqus CAE plug-in with a GUI.

  12. Neutron activation analysis: a powerful tool in provenance investigations

    International Nuclear Information System (INIS)

    It is well known that neutron activation analysis (NAA), both instrumental and destructive, allows the simultaneous determination of a number of elements, mostly trace elements, with high precision and accuracy. These properties make NAA very useful in provenance studies, i.e. in identifying the origin of the raw materials from which artifacts were manufactured in ancient times. Data reduction by statistical procedures, especially multivariate analysis techniques, provides a statistical 'fingerprint' of the investigated materials, both raw materials and archaeological artifacts, whose comparison allows the provenance of the raw materials used in artifact manufacturing to be identified. Information can thus be obtained on the exploitation of quarries and raw material sources in antiquity, on the technological processing of raw materials, on trade routes, and on the circulation of fakes. In the present paper two case studies are reported. The first deals with identifying the provenance of the clay used to make ceramic materials, mostly bricks and tiles, recovered from the excavation of a Roman 'villa' in Lomello (Roman name Laumellum) and of Roman settlements in Casteggio (Roman name Clastidium). Both sites are located in the Province of Pavia, in areas called Lomellina and Oltrepo respectively. The second investigates the origin of the white marble used to build medieval arks of the Carolingian age, located in the church of San Felice, now property of the University of Pavia. The experimental set-up, analytical results and data reduction procedures are presented and discussed. (author)
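
    The statistical "fingerprinting" step can be sketched as follows: trace-element concentrations for a set of samples are standardized and projected onto principal components, so that samples of common provenance cluster together. The concentration matrix below is synthetic, and PCA is only one of several multivariate techniques such studies employ.

```python
# Minimal sketch: PCA on synthetic trace-element concentrations so that
# samples from the same (hypothetical) quarry cluster in PC space.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Rows: samples (first 5 from "quarry A", last 5 from "quarry B");
# columns: trace-element concentrations in ppm (synthetic).
quarry_a = rng.normal([12.0, 3.5, 40.0, 0.8], 0.3, size=(5, 4))
quarry_b = rng.normal([18.0, 2.1, 25.0, 1.6], 0.3, size=(5, 4))
X = np.vstack([quarry_a, quarry_b])

# Standardize, then project onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for i, (pc1, pc2) in enumerate(scores):
    label = "A" if i < 5 else "B"
    print(f"sample {i:2d} (quarry {label}): PC1={pc1:+.2f}, PC2={pc2:+.2f}")
```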

  13. Nested sampling as a tool for LISA data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gair, Jonathan R [Institute of Astronomy, Madingley Road, CB3 0HA, Cambridge (United Kingdom); Feroz, Farhan; Graff, Philip; Hobson, Michael P [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Babak, Stanislav; Petiteau, Antoine [Max-Planck-Institut fuer Gravitationsphysik, Am Muehlenberg 1, 14476, Potsdam (Germany); Porter, Edward K, E-mail: jgair@ast.cam.ac.uk [APC, UMR 7164, Universite Paris 7 Denis Diderot, 10, rue Alice Domon et Leonie Duquet, 75205 Paris Cedex 13 (France)

    2010-05-01

    Nested sampling is a technique for efficiently computing the probability of a data set under a particular hypothesis, also called the Bayesian Evidence or Marginal Likelihood, and for evaluating the posterior. MULTINEST is a multi-modal nested sampling algorithm which has been designed to efficiently explore and characterize posterior probability surfaces containing multiple secondary solutions. We have applied the MULTINEST algorithm to a number of problems in gravitational wave data analysis. In this article, we describe the algorithm and present results for several applications of the algorithm to analysis of mock LISA data. We summarise recently published results for a test case in which we searched for two non-spinning black hole binary merger signals in simulated LISA data. We also describe results obtained with MULTINEST in the most recent round of the Mock LISA Data Challenge (MLDC), in which the algorithm was used to search for and characterise both spinning supermassive black hole binary inspirals and bursts from cosmic string cusps. In all these applications, the algorithm found the correct number of signals and efficiently recovered the posterior probability distribution. Moreover, in most cases the waveform corresponding to the best a-posteriori parameters had an overlap in excess of 99% with the true signal.
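
    The core idea of nested sampling can be illustrated on a toy one-dimensional problem: a uniform prior and a Gaussian likelihood, for which the evidence is known analytically. The sketch below uses naive rejection sampling to replace the worst live point; MULTINEST's multi-modal ellipsoidal sampling is far more sophisticated.

```python
# Toy nested sampling run: uniform prior on [-10, 10], standard normal
# likelihood. The analytic evidence is ~1/20, i.e. log Z ~ -3.0.
import numpy as np

rng = np.random.default_rng(42)
LO, HI = -10.0, 10.0                      # support of the uniform prior

def log_likelihood(theta):
    return -0.5 * theta ** 2 - 0.5 * np.log(2.0 * np.pi)

n_live, n_iter = 200, 1000
live = rng.uniform(LO, HI, n_live)        # live points drawn from the prior
live_logl = log_likelihood(live)

log_z = -np.inf                           # accumulated log-evidence
for i in range(n_iter):
    worst = int(np.argmin(live_logl))
    # Deterministic approximation of the prior-volume shrinkage X_i = e^(-i/N).
    x_prev, x_new = np.exp(-i / n_live), np.exp(-(i + 1) / n_live)
    log_z = np.logaddexp(log_z, live_logl[worst] + np.log(x_prev - x_new))
    # Replace the worst live point with a prior draw of higher likelihood
    # (naive rejection; MULTINEST uses ellipsoidal decomposition instead).
    threshold = live_logl[worst]
    candidate = rng.uniform(LO, HI)
    while log_likelihood(candidate) <= threshold:
        candidate = rng.uniform(LO, HI)
    live[worst], live_logl[worst] = candidate, log_likelihood(candidate)

# Contribution of the remaining live points at the final prior volume.
log_z = np.logaddexp(log_z, -n_iter / n_live + np.log(np.mean(np.exp(live_logl))))
print(f"estimated log-evidence: {log_z:.2f}")          # should be near -3.0
print(f"analytic  log-evidence: {np.log(1.0 / (HI - LO)):.2f}")
```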

  14. Nested sampling as a tool for LISA data analysis

    International Nuclear Information System (INIS)

    Nested sampling is a technique for efficiently computing the probability of a data set under a particular hypothesis, also called the Bayesian Evidence or Marginal Likelihood, and for evaluating the posterior. MULTINEST is a multi-modal nested sampling algorithm which has been designed to efficiently explore and characterize posterior probability surfaces containing multiple secondary solutions. We have applied the MULTINEST algorithm to a number of problems in gravitational wave data analysis. In this article, we describe the algorithm and present results for several applications of the algorithm to analysis of mock LISA data. We summarise recently published results for a test case in which we searched for two non-spinning black hole binary merger signals in simulated LISA data. We also describe results obtained with MULTINEST in the most recent round of the Mock LISA Data Challenge (MLDC), in which the algorithm was used to search for and characterise both spinning supermassive black hole binary inspirals and bursts from cosmic string cusps. In all these applications, the algorithm found the correct number of signals and efficiently recovered the posterior probability distribution. Moreover, in most cases the waveform corresponding to the best a-posteriori parameters had an overlap in excess of 99% with the true signal.

  15. MATING DESIGNS: HELPFUL TOOL FOR QUANTITATIVE PLANT BREEDING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Athanase Nduwumuremyi

    2013-12-01

    Full Text Available The selection of parental materials and of a good mating design are key to a successful conventional plant breeding programme. Several factors, however, affect the choice of mating design. A mating design is the procedure by which progenies are produced; plant breeders and geneticists use different mating designs and crossing arrangements, both in theory and in practice, for targeted purposes. The choice of a mating design for estimating genetic variances should be dictated by the objectives of the study, time, space, cost and other biological limitations. In all mating designs, individuals are taken at random and crossed to produce progenies that are related to each other as half-sibs or full-sibs. A form of multivariate analysis or the analysis of variance can be adopted to estimate the variance components, as sketched below. This review therefore highlights the mating designs most used in plant breeding and genetics studies, providing quick insight into the different designs and the statistical components relevant to successful plant breeding.
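
    As an illustration of that variance-component estimation, the sketch below simulates a half-sib design (s sires, n progeny each) and recovers the sire variance component from a one-way analysis of variance. The expected-mean-squares relations and the factor of four linking the sire component to additive variance are standard quantitative-genetics results; the data themselves are simulated.

```python
# Minimal sketch: variance components and narrow-sense heritability from a
# balanced half-sib design analysed by one-way ANOVA.
import numpy as np

rng = np.random.default_rng(1)
s, n = 30, 10                             # sires, progeny per sire
sigma2_sire, sigma2_within = 2.0, 10.0    # true (simulated) components

sire_effects = rng.normal(0.0, np.sqrt(sigma2_sire), s)
y = sire_effects[:, None] + rng.normal(0.0, np.sqrt(sigma2_within), (s, n))

# One-way ANOVA mean squares.
family_means = y.mean(axis=1)
grand_mean = y.mean()
ms_between = n * np.sum((family_means - grand_mean) ** 2) / (s - 1)
ms_within = np.sum((y - family_means[:, None]) ** 2) / (s * (n - 1))

# Expected mean squares: E[MSB] = sigma2_w + n * sigma2_s, E[MSW] = sigma2_w.
var_sire = (ms_between - ms_within) / n
var_within = ms_within
# The half-sib family variance estimates 1/4 of the additive genetic variance.
h2 = 4.0 * var_sire / (var_sire + var_within)
print(f"sire variance ~ {var_sire:.2f}, narrow-sense h^2 ~ {h2:.2f}")
```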

  16. IBIXFIT: A Tool For The Analysis Of Microcalorimeter PIXE Spectra

    International Nuclear Information System (INIS)

    PIXE analysis software has long been tuned mainly to the analysis of Si(Li) detector spectra and to quantification methods based on Kα or Lα X-ray lines. However, recent evidence from studies of relative line intensities and new developments in detection equipment, namely the emergence of commercial microcalorimeter-based X-ray detectors, suggest that in the near future PIXE will become more than just major-line quantification. One issue this makes evident is the need to fit PIXE spectra without prior knowledge of relative line intensities. In view of these developments it may be necessary to generalize PIXE to a wider notion of ion beam induced X-ray (IBIX) emission, including the quantification of processes such as Radiative Auger Emission. To answer this need, the IBIXFIT code was created, drawing heavily on the Bayesian inference and simulated annealing routines implemented in the Datafurnace code [1]. In this presentation, IBIXFIT is used to fit a microcalorimeter spectrum of a BaxSr(1-x)TiO3 thin film sample, and the possibility of selecting between fixed and free line ratios, combined with other features of the IBIXFIT algorithm, is shown to be essential to overcoming the problems faced.
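
    The simulated annealing idea underlying such fits can be sketched on a toy spectrum: two Gaussian lines at known positions whose amplitudes are adjusted to minimize a chi-square cost, accepting uphill moves with a Boltzmann probability. Everything below (spectrum, line shapes, cooling schedule) is synthetic and much simpler than IBIXFIT's actual algorithm.

```python
# Minimal sketch: fit two peak amplitudes to a noisy spectrum by simulated
# annealing with a geometric cooling schedule.
import numpy as np

rng = np.random.default_rng(7)
channels = np.arange(400)
centres, width = [120.0, 250.0], 8.0            # two known line positions

def model(amplitudes):
    lines = [a * np.exp(-0.5 * ((channels - c) / width) ** 2)
             for a, c in zip(amplitudes, centres)]
    return np.sum(lines, axis=0)

true = np.array([500.0, 200.0])
spectrum = rng.poisson(model(true) + 5.0)       # counting noise + flat background

def chi2(amplitudes):
    return np.sum((spectrum - model(amplitudes) - 5.0) ** 2)

params = np.array([100.0, 100.0])               # crude starting guess
cost, temperature = chi2(params), 1e5
for step in range(20000):
    trial = params + rng.normal(0.0, 5.0, 2)    # random move in amplitude space
    trial_cost = chi2(trial)
    # Accept downhill moves always, uphill moves with Boltzmann probability.
    if trial_cost < cost or rng.random() < np.exp((cost - trial_cost) / temperature):
        params, cost = trial, trial_cost
    temperature *= 0.9995                       # geometric cooling

print(f"fitted amplitudes: {params.round(1)} (true: {true})")
```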

  17. Model for nuclear proliferation resistance analysis using decision making tools

    International Nuclear Information System (INIS)

    The proliferation risk of nuclear fuel cycles is considered one of the most important factors in assessing advanced and innovative nuclear systems in the GEN IV and INPRO programs, which have been seeking an appropriate and reasonable method to evaluate several nuclear energy system alternatives quantitatively. No satisfactory methodology for integrated analysis of proliferation resistance, however, has yet emerged. In this study, several decision-making methods that have been used in multi-objective settings are described in order to assess whether they can appropriately be applied to proliferation resistance evaluation. In particular, the analytic hierarchy process (AHP) model for quantitatively evaluating proliferation resistance is dealt with in more detail: the theoretical principle of the method and some examples for the proliferation resistance problem are described. For more efficient application, a simple computer program for the AHP model is developed, and its usage is introduced here in detail. We hope that the program developed in this study will be useful for quantitative analysis of proliferation resistance involving multiple conflicting criteria.
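
    The AHP weighting step can be sketched as follows: a reciprocal pairwise-comparison matrix over a few criteria is reduced to priority weights via its principal eigenvector, with Saaty's consistency ratio as a sanity check. The three criteria and the judgments below are hypothetical, not those of the study.

```python
# Minimal sketch: AHP priority weights from the principal eigenvector of a
# reciprocal pairwise-comparison matrix, plus a consistency check.
import numpy as np

criteria = ["material quality", "detectability", "accessibility"]  # hypothetical
# A[i, j] = judged importance of criterion i relative to criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal (Perron) eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalised priorities

lambda_max = eigvals.real[k]
n = A.shape[0]
ci = (lambda_max - n) / (n - 1)             # consistency index
cr = ci / 0.58                              # Saaty's random index RI = 0.58 for n = 3
for name, w in zip(criteria, weights):
    print(f"{name:18s} weight = {w:.3f}")
print(f"consistency ratio = {cr:.3f} (acceptable if < 0.1)")
```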

  18. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    OpenAIRE

    ASLANTAŞ, Kubilay

    2003-01-01

    Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improvements in cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage in cutt...

  19. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    Directory of Open Access Journals (Sweden)

    Peter J.A. Cock

    2013-09-01

    Full Text Available The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu.

  20. Assessment of transport parameters in a karst system under various flow periods through extensive analysis of artificial tracer tests

    Science.gov (United States)

    Doummar, J.; Margane, A.; Sauter, M.; Geyer, T.

    2012-04-01

    Understanding the vulnerability of a catchment or spring to contamination is essential for securing sustainable water resource management in karst aquifers. Artificial tracer tests have proven to be excellent tools for simulating contaminant transport within an aquifer before its arrival at a karst spring, as they provide information about transit times and dispersivities, and therefore insight into the vulnerability of a water body to contamination (Geyer et al. 2007). For this purpose, an extensive analysis of artificial tracer tests was undertaken in the present work in order to acquire conservative transport parameters along fast and slow pathways in a mature karst system under various flow conditions. In the framework of the project "Protection of Jeita Spring" (BGR), about 30 tracer tests were conducted in the catchment area of the Jeita spring in Lebanon (Q = 1 to 20 m3/s) under various flow conditions and with different injection points (dolines, sinkholes, subsurface, and an underground channel). Tracer breakthrough curves (TBCs) observed at karst springs and in the conduit system were analyzed using the two-region non-equilibrium approach (2NREM) (Toride & van Genuchten 1999). This approach accounts for the long tailing (skewness) of the TBCs, which cannot be described with one-dimensional advective-dispersive transport models (Geyer et al. 2007). Relationships between the model parameters estimated from the TBCs were established for various flow periods. Rating curves for velocity and discharge show that flow velocity increases with spring discharge. The calibrated portion of the immobile region in the conduit system is relatively small. Estimated longitudinal dispersivities in the conduit system range between 7 and 10 m in high flow periods and decrease linearly with increasing flow. In low flow periods this relationship does not hold, as longitudinal dispersivities range randomly between 4 and 7 m. The longitudinal dispersivity