WorldWideScience

Sample records for analysis tool applied

  1. Applied regression analysis: a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
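
    A minimal sketch of the ordinary least squares fit the book is built around, using NumPy on invented data; the numbers are purely illustrative.

```python
# Fit y = b0 + b1*x by least squares; the data below are made up for demonstration.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix with an intercept column; solve min ||X b - y||^2.
X = np.column_stack([np.ones_like(x), x])
beta, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print(f"intercept = {beta[0]:.3f}, slope = {beta[1]:.3f}")
```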

  2. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools that analyze incoming events for indications of particular situations and predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  3. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    CERN Document Server

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

    This book presents a critical review of the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one- or multidimensional cases. During the past few decades there has been substantial development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art review of numerical simulation tools applied to building physics, (b) the importance of boundary conditions, (c) the material properties, namely, experimental methods for the measuremen...

  4. Coating-substrate-simulations applied to HFQ® forming tools

    Directory of Open Access Journals (Sweden)

    Leopold Jürgen

    2015-01-01

    In this paper a comparative analysis of coating-substrate simulations applied to HFQ® forming tools is presented. When using the solution heat treatment, cold die forming and quenching process, known as HFQ®, for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, are finally presented.

  5. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
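
    The patent claim above describes a pipeline: component library, building model, baseline energy model, energy conservation measures (ECMs), recommendations. The sketch below mirrors that flow with hypothetical names and a placeholder engine; it is not the patented implementation.

```python
# Schematic sketch (names hypothetical) of the described pipeline:
# component library -> building model -> baseline -> ECMs -> recommendation.
from dataclasses import dataclass, field

@dataclass
class EnergyModel:
    annual_kwh: float

@dataclass
class BuildingModel:
    components: list = field(default_factory=list)

def baseline(model: BuildingModel) -> EnergyModel:
    # Placeholder: a real analysis engine would simulate the building.
    return EnergyModel(annual_kwh=100_000.0)

def apply_ecm(base: EnergyModel, saving_fraction: float) -> EnergyModel:
    # Each ECM is reduced here to a simple fractional saving.
    return EnergyModel(annual_kwh=base.annual_kwh * (1.0 - saving_fraction))

base = baseline(BuildingModel())
optimized = [apply_ecm(base, f) for f in (0.05, 0.12)]  # two candidate ECMs
best = min(optimized, key=lambda m: m.annual_kwh)       # recommendation step
print(f"baseline {base.annual_kwh:.0f} kWh -> best candidate {best.annual_kwh:.0f} kWh")
```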

  6. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools, how to use them, when they should be used (and not used), and the intended purposes for their use. In addition, the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate their application in health care settings.
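
    FMEA, one of the proactive tools named above, ranks failure modes by a Risk Priority Number, RPN = severity x occurrence x detection, each typically scored 1-10. A minimal sketch with invented radiotherapy failure modes and scores:

```python
# Rank failure modes by RPN = severity * occurrence * detection (scores invented).
failure_modes = {
    "wrong patient setup":  (9, 3, 4),
    "dose miscalculation":  (10, 2, 3),
    "missed QA check":      (6, 4, 5),
}
ranked = sorted(failure_modes.items(),
                key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
for mode, (s, o, d) in ranked:
    print(f"{mode:22s} RPN = {s * o * d}")
```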

  7. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Model and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  8. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and the use of antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web-based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize, along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  9. Strategic decision analysis applied to borehole seismology

    International Nuclear Information System (INIS)

    Menke, M.M.; Paulsson, B.N.P.

    1994-01-01

    Strategic Decision Analysis (SDA) is the evolving body of knowledge on how to achieve high quality in the decisions that shape an organization's future. SDA comprises philosophy, process concepts, methodology, and tools for making good decisions. It specifically incorporates many concepts and tools from economic evaluation and risk analysis. Chevron Petroleum Technology Company (CPTC) has applied SDA to evaluate and prioritize a number of its most important and most uncertain R and D projects, including borehole seismology. Before SDA, there were significant issues and concerns about the value to CPTC of continuing to work on borehole seismology. The SDA process created a cross-functional team of experts to structure and evaluate this project. A credible economic model was developed, discrete risks and continuous uncertainties were assessed, and an extensive sensitivity analysis was performed. The results, even applied to a very restricted drilling program for a few years, were good enough to demonstrate the value of continuing the project. This paper explains the SDA philosophy, concepts, and process, and demonstrates the methodology and tools using the borehole seismology project example. SDA is useful in the upstream industry not just in R and D/technology decisions, but also in major exploration and production decisions. Since a major challenge for upstream companies today is to create and realize value, the SDA approach should have very broad applicability
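
    A toy illustration of the kind of economic evaluation and sensitivity analysis SDA incorporates: an expected-value calculation swept over the assessed probability of technical success. All figures are invented, not CPTC's.

```python
# Expected value of continuing an R&D project, with a one-way sensitivity sweep
# over the assessed probability of success. All numbers are hypothetical.
def expected_value(p_success: float, value_success: float, cost: float) -> float:
    return p_success * value_success - cost

for p in (0.2, 0.4, 0.6, 0.8):  # sweep the discrete risk assessment
    ev = expected_value(p, value_success=50.0, cost=12.0)  # $M, invented
    print(f"P(success) = {p:.1f} -> expected value = {ev:5.1f} $M")
```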

  10. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust the biological responses are with respect to changes in biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
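
    A minimal sketch of local sensitivity analysis as described: each parameter is perturbed by 1% and the normalised change in the output is recorded. A Michaelis-Menten rate law stands in for a systems biology model; values are illustrative.

```python
# Local sensitivity via finite differences on a toy Michaelis-Menten rate.
def model(vmax: float, km: float, s: float = 2.0) -> float:
    return vmax * s / (km + s)

params = {"vmax": 1.0, "km": 0.5}
base = model(**params)
for name, value in params.items():
    perturbed = dict(params, **{name: value * 1.01})  # +1% perturbation
    # Normalised sensitivity coefficient: (dy/y) / (dp/p)
    coeff = ((model(**perturbed) - base) / base) / 0.01
    print(f"sensitivity of rate to {name}: {coeff:+.3f}")
```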

  11. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  12. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Science.gov (United States)

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  13. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  14. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    One of the key goals in the SEMI industry is to improve equipment throughput and ensure maximization of equipment production efficiency. This paper is based on SEMI standards in semiconductor equipment control, defines the transaction rules between different tool states, and presents a TEA system model that analyses tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully; it obtained the parameter values used to measure equipment performance, along with suggestions for improvement.
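
    A minimal finite-state-machine sketch in the spirit of the TEA model, with SEMI E10-style equipment states. The transition rules and event log are invented for illustration, not the paper's actual rule set.

```python
# Legal transitions between (simplified) SEMI E10-style tool states.
ALLOWED = {
    "standby":          {"productive", "scheduled_down", "unscheduled_down"},
    "productive":       {"standby", "unscheduled_down"},
    "scheduled_down":   {"standby"},
    "unscheduled_down": {"standby"},
}

def run(events, state="standby"):
    """events: (next_state, hours spent in the current state before switching)."""
    durations = {}
    for next_state, hours in events:
        if next_state not in ALLOWED[state]:
            raise ValueError(f"illegal transition {state} -> {next_state}")
        durations[state] = durations.get(state, 0.0) + hours
        state = next_state
    return durations

log = [("productive", 6.0), ("unscheduled_down", 1.5), ("standby", 0.5)]
print(run(log))  # hours accumulated per state, a basis for efficiency metrics
```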

  15. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. The expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved
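
    Once a fault tree is constructed, its logic model can be evaluated bottom-up. A minimal sketch assuming independent basic events, with an invented tree and probabilities:

```python
# Bottom-up fault tree evaluation for independent basic events:
# P(AND) = product of inputs, P(OR) = 1 - prod(1 - p). The tree is invented.
from math import prod

def AND(*ps): return prod(ps)
def OR(*ps):  return 1.0 - prod(1.0 - p for p in ps)

pump_fails, valve_fails, power_fails = 1e-3, 5e-4, 2e-4
cooling_fails = OR(AND(pump_fails, valve_fails), power_fails)  # top event
print(f"P(top event) = {cooling_fails:.2e}")
```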

  16. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction. Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  17. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level tools such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages, which are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed
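
    A miniature of the final display-and-fit stage: histogram simulated "reconstructed" values and fit a Gaussian peak with SciPy. The data here are simulated, standing in for the output of the earlier stages.

```python
# Display-and-fit stage in miniature: histogram toy data and fit a Gaussian.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
masses = rng.normal(loc=91.2, scale=2.5, size=5000)  # toy "reconstructed" masses

counts, edges = np.histogram(masses, bins=60)
centers = 0.5 * (edges[:-1] + edges[1:])

def gauss(x, amplitude, mean, sigma):
    return amplitude * np.exp(-0.5 * ((x - mean) / sigma) ** 2)

popt, _ = curve_fit(gauss, centers, counts, p0=[counts.max(), 91.0, 2.0])
print(f"fitted mean = {popt[1]:.2f}, width = {popt[2]:.2f}")
```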

  18. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  19. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  20. Investigation of Micro Square Structure Fabrication by Applying Textured Cutting Tool in WEDM

    Directory of Open Access Journals (Sweden)

    Jianguo Zhang

    2015-09-01

    This paper studies micro structure fabrication by means of a textured tool cutting edge, which is manufactured by applying wire cut electrical discharge machining (WEDM). The machining performance of square structure fabrication on the tool cutting edge is investigated in the WEDM process, and the machining accuracy is explored in experimental analyses. In this proposed method, undesired overcut comes from the discharge between the processing debris and the side wall of the target structure. Furthermore, by applying the textured cutting tool, the target square structure is directly fabricated on the alumina workpiece with just a simple turning process, which verifies the feasibility of the proposed textured tool cutting edge method using WEDM. This technology is expected to become a potential method for the mass production of micro structure surfaces in the future.

  1. Big Data Tools as Applied to ATLAS Event Data

    Science.gov (United States)

    Vukotic, I.; Gardner, R. W.; Bryant, L. A.

    2017-10-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge, the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still to be developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs, having containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of standard model W/Z datasets. We compare the disk space efficiency of this approach to that of standard ROOT files, the performance in simple cut flow type of data analysis, and will present preliminary results on its scaling
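
    A sketch of what event-level indexing and a cut-flow style query could look like with the elasticsearch-py client (8.x API). The index name, document fields, and cluster URL are hypothetical; the paper's actual xAOD document mapping is not reproduced here.

```python
# Index one toy event document, then run a cut-flow style range query.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical analytics cluster

# The real mapping would carry xAOD containers/variables; these fields are invented.
event = {"run_number": 284500, "event_number": 187652,
         "n_jets": 3, "met_gev": 42.7, "lead_lep_pt_gev": 55.1}
es.index(index="xaod-events", document=event)

# A simple selection cut expressed as a range query.
result = es.search(index="xaod-events",
                   query={"range": {"met_gev": {"gte": 40.0}}})
print(result["hits"]["total"])
```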

  2. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how model parameter space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  3. Tools and methodologies applied to eLearning

    OpenAIRE

    Seoane Pardo, Antonio M.; García-Peñalvo, Francisco José

    2006-01-01

    The aim of this paper is to show how eLearning technologies and methodologies can be useful for teaching and researching Logic. Firstly, a definition and explanation of eLearning and its main modalities will be given. Then, the most important elements and tools of eLearning activities will be shown. Finally, we will give three suggestions to improve the learning experience with eLearning applied to Logic. Various eLearning technologies and methodologies useful in teaching and...

  4. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
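
    A minimal sketch of the series-system reliability idea: the part is acceptable only if every operation succeeds, so the process reliability is the product of the operation reliabilities. The Weibull parameters below are invented, not the paper's fitted values.

```python
# Series system: R_sys(t) = R_turning(t) * R_drilling(t), with Weibull survival
# functions for tool wear. All parameters are invented for illustration.
import math

def weibull_r(t: float, beta: float, eta: float) -> float:
    """Weibull reliability (survival) function."""
    return math.exp(-((t / eta) ** beta))

cut_time = 12.0  # accumulated cutting time, minutes
r_turning = weibull_r(cut_time, beta=2.0, eta=45.0)
r_drilling = weibull_r(cut_time, beta=3.0, eta=30.0)
r_system = r_turning * r_drilling

print(f"R_sys({cut_time} min) = {r_system:.4f}")
# A tool-change rule could trigger when R_sys drops below a target, e.g. 0.90.
```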

  5. Reliability concepts applied to cutting tool change time

    International Nuclear Information System (INIS)

    Patino Rodriguez, Carmen Elena; Francisco Martha de Souza, Gilberto

    2010-01-01

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.

  6. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).

  7. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
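
    Two of the five evaluations, sketched with SciPy/NumPy rather than MATRIXx: a Bode frequency response and closed-loop eigenvalues under state feedback. The plant and gains are arbitrary examples.

```python
# Bode response of a toy plant and closed-loop eigenvalues for u = -K x.
import numpy as np
from scipy import signal

plant = signal.TransferFunction([1.0], [1.0, 2.0, 1.0])  # 1/(s^2 + 2s + 1)
w, mag_db, phase_deg = signal.bode(plant)
print(f"gain at {w[0]:.2f} rad/s: {mag_db[0]:.1f} dB")

# Closed-loop eigenvalues of xdot = (A - B K) x.
A = np.array([[0.0, 1.0], [-1.0, -2.0]])
B = np.array([[0.0], [1.0]])
K = np.array([[3.0, 2.0]])
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```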

  8. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    Science.gov (United States)

    2016-09-01

    The Applied Anomaly Detection Tool (AADT) has been developed for several platforms: Android, iOS, and Windows. The Windows version has been developed as a web server that allows the tool to be used as a web service. This report provides setup instructions for the AADT web server under Microsoft Windows. Subject terms: Applied Anomaly Detection Tool, AADT, Windows, server, web service, installation.

  9. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP) data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternate method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report on the necessity of including in these tools statistical methods designed for pathway-based analysis of SNP data, expressly aiming to define features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variation data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  10. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models which may be applied as tools for the analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. The study then characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  11. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  12. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  13. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    The polishing process is one of the most critical manufacturing processes during metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with lots of rejected parts, scrap generation and time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters, in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling industry, aeronautics or automotive. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. A smaller amount of material can be removed in controlled areas of a three-dimensional workpiece.

  14. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Use these economic and financial analysis tools from NREL's energy analysis program. The Job and Economic Development Impact (JEDI) Model comprises easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants at the

  15. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    In the last decades, critical systems have increasingly been developed using computers and software, even in the space area, where the project approach is usually very conservative. In projects involving rockets, satellites and their facilities, such as ground support systems, simulators, and other critical operations for the space mission, a hazard analysis must be applied. The ELICERE process was created to perform a hazard analysis mainly over computer critical systems, in order to define or evaluate their safety and dependability requirements, strongly based on the Hazards and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or understand the potential hazards of existing systems, improving their functions related to functional or non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. The process was initially designed to be carried out manually, in a gradual way. A software tool called PRO-ELICERE has now been developed to facilitate the analysis process and store the results for reuse in another system analysis. To show how ELICERE and its tool work, a small space case study was carried out, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  16. Possibilities of Applying Video Surveillance and other ICT Tools and Services in the Production Process

    Directory of Open Access Journals (Sweden)

    Adis Rahmanović

    2018-02-01

    The paper presents the possibilities of applying video surveillance and other ICT tools and services in the production process. The first part of the paper presents the video surveillance control system and the opportunity to apply video surveillance for the security of employees and assets. In the second part of the paper, an analysis of the production control system is given, followed by video surveillance of a work excavator. The next part of the paper presents the integration of video surveillance and the accompanying tools. At the end of the paper, suggestions are also given for further work in the field of data protection and cryptography in video surveillance use.

  17. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in ...

  18. Anthology of the Development of Radiation Transport Tools as Applied to Single Event Effects

    Science.gov (United States)

    Reed, R. A.; Weller, R. A.; Akkerman, A.; Barak, J.; Culpepper, W.; Duzellier, S.; Foster, C.; Gaillardin, M.; Hubert, G.; Jordan, T.; Jun, I.; Koontz, S.; Lei, F.; McNulty, P.; Mendenhall, M. H.; Murat, M.; Nieminen, P.; O'Neill, P.; Raine, M.; Reddell, B.; Saigné, F.; Santin, G.; Sihver, L.; Tang, H. H. K.; Truscott, P. R.; Wrobel, F.

    2013-06-01

    This anthology contains contributions from eleven different groups, each developing and/or applying Monte Carlo-based radiation transport tools to simulate a variety of effects that result from energy transferred to a semiconductor material by a single particle event. The topics span from basic mechanisms for single-particle induced failures to applied tasks like developing websites to predict on-orbit single event failure rates using Monte Carlo radiation transport tools.

  19. Simple Strategic Analysis Tools at SMEs in Ecuador

    Directory of Open Access Journals (Sweden)

    Diego H. Álvarez Peralta

    2015-06-01

    This article explores the possible applications of Strategic Analysis Tools (SAT) in SMEs located in emerging countries such as Ecuador (where there are no formal studies on the subject). It is intended to analyze whether or not it is feasible to effectively apply a set of proposed tools to guide the mental-map decisions of executives when decisions on strategy have to be made. Through an in-depth review of the state of the art regarding SAT, and interviews performed with main participants such as chambers and executives of different firms, the feasibility of their application is shown. This analysis is complemented with specialists' interviews to deepen our insights and obtain valid conclusions. Our conclusion is that SMEs can smoothly develop and apply an appropriate set of SAT when opting for very relevant choices. However, there are some inconveniences to be solved, which are connected with resources (such as people's abilities and technology) and behavioral (cultural) factors and methodological processes. Once these barriers are knocked down, it would be more likely to enrich current approaches to make strategic decisions even more effective. This is a qualitative investigation and the research design is not experimental (among other things, it is transversal, as it relates to a specific moment in time).

  20. Anthology of the development of radiation transport tools as applied to single event effects

    International Nuclear Information System (INIS)

    Akkerman, A.; Barak, J.; Murat, M.; Duzellier, S.; Hubert, G.; Gaillardin, M.; Raine, M.; Jordan, T.; Jun, I.; Koontz, S.; Reddell, B.; O'Neill, P.; Foster, C.; Culpepper, W.; Lei, F.; McNulty, P.; Nieminen, P.; Saigne, F.; Wrobel, F.; Santin, G.; Sihver, L.; Tang, H.H.K.; Truscott, P.R.

    2013-01-01

    This anthology contains contributions from eleven different groups, each developing and/or applying Monte Carlo-based radiation transport tools to simulate a variety of effects that result from energy transferred to a semiconductor material by a single particle event. The topics span from basic mechanisms for single-particle induced failures to applied tasks like developing web sites to predict on-orbit single event failure rates using Monte Carlo radiation transport tools. (authors)

  1. Operations management tools to be applied for textile

    Science.gov (United States)

    Maralcan, A.; Ilhan, I.

    2017-10-01

    In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost and utilization are illustrated first. The effect of a bottleneck on the results of a business is especially emphasized. In the next section, tools for productivity measurement are introduced and exemplified: the KPI (Key Performance Indicators) tree, OEE (Overall Equipment Effectiveness) and takt time. A KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to customer demand. The KPI tree is studied through the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, the quality tools six sigma, control charts and jidoka are introduced. Six sigma is a tool to measure process capability and thereby the probability of a defect. A control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop and alert) is about alerting people that there is a problem in the process.
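
    A worked sketch of the OEE and takt time calculations described above, with invented shop-floor numbers for a textile line.

```python
# OEE = availability * performance * quality; takt = available time / demand.
planned_min = 480.0          # one shift
downtime_min = 60.0
ideal_rate = 2.0             # metres of fabric per minute at nameplate speed
produced_m = 700.0
defective_m = 35.0

availability = (planned_min - downtime_min) / planned_min
performance = produced_m / (ideal_rate * (planned_min - downtime_min))
quality = (produced_m - defective_m) / produced_m
oee = availability * performance * quality
print(f"OEE = {oee:.1%}")

takt = planned_min / 240.0   # customer demand of 240 units this shift
print(f"takt time = {takt:.1f} min per unit")
```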

  2. Applying open source data visualization tools to standard based medical data.

    Science.gov (United States)

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The different backgrounds of patients, especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The use of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.

  3. Analysis of Brick Masonry Wall using Applied Element Method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the Finite Element Method (FEM). In AEM, however, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. A brick masonry wall can be effectively analyzed in the frame of AEM. The composite nature of a masonry wall can be easily modelled using springs: the brick springs and mortar springs are assumed to be connected in series. The brick masonry wall is analyzed and the failure load is determined for different loading cases. The results were used to find the best aspect ratio of brick for strengthening a brick masonry wall.
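
    The series connection of brick and mortar springs gives an equivalent stiffness k_eq = k_b·k_m/(k_b + k_m), with each axial spring stiffness k = EA/L. A minimal sketch with typical illustrative values, not the paper's data:

```python
# Equivalent normal stiffness of a brick spring and mortar spring in series.
def series_stiffness(k1: float, k2: float) -> float:
    return (k1 * k2) / (k1 + k2)

def spring_k(E: float, area_mm2: float, length_mm: float) -> float:
    """Axial stiffness k = E*A/L in N/mm (E in MPa, A in mm^2, L in mm)."""
    return E * area_mm2 / length_mm

k_brick = spring_k(E=5_000.0, area_mm2=100.0, length_mm=55.0)   # illustrative
k_mortar = spring_k(E=1_000.0, area_mm2=100.0, length_mm=10.0)  # illustrative
print(f"equivalent stiffness = {series_stiffness(k_brick, k_mortar):,.0f} N/mm")
```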

  4. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  5. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  6. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    .... This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  7. The 7 basic tools of quality applied to radiological safety

    International Nuclear Information System (INIS)

    Gonzalez F, J.A.

    1991-01-01

    This work seeks to establish a series of correspondences between the pursuit of quality and the optimization of the doses received by occupationally exposed personnel. The seven basic statistical tools of quality are treated: the Pareto technique, cause-effect diagrams, stratification, verification sheets, histograms, dispersion diagrams, and graphs and control charts, applied to radiological safety.
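
    A minimal sketch of the Pareto technique applied to radiological safety: rank dose contributors and accumulate their share of the collective dose to expose the "vital few". Task names and doses are invented.

```python
# Pareto analysis of collective dose by task (all figures invented).
doses_person_mSv = {
    "fuel handling": 12.4, "valve maintenance": 6.1, "inspections": 2.3,
    "waste transfer": 1.1, "decontamination": 0.6,
}
total = sum(doses_person_mSv.values())
cumulative = 0.0
for task, dose in sorted(doses_person_mSv.items(), key=lambda kv: -kv[1]):
    cumulative += dose
    print(f"{task:18s} {dose:5.1f} person-mSv  cumulative {cumulative / total:6.1%}")
```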

  8. Applied mediation analyses

    DEFF Research Database (Denmark)

    Lange, Theis; Hansen, Kim Wadt; Sørensen, Rikke

    2017-01-01

    In recent years, mediation analysis has emerged as a powerful tool to disentangle causal pathways from an exposure/treatment to clinically relevant outcomes. Mediation analysis has been applied in scientific fields as diverse as labour market relations and randomized clinical trials of heart...... disease treatments. In parallel to these applications, the underlying mathematical theory and computer tools have been refined. This combined review and tutorial will introduce the reader to modern mediation analysis including: the mathematical framework; required assumptions; and software implementation...

  9. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas,

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  10. Channel CAT: A Tactical Link Analysis Tool

    Science.gov (United States)

    1997-09-01

    Naval Postgraduate School master's thesis (Michael Glenn Coleman, Monterey, California, September 1997). This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software

  11. Apply Functional Modelling to Consequence Analysis in Supervision Systems

    DEFF Research Database (Denmark)

    Zhang, Xinxin; Lind, Morten; Gola, Giulio

    2013-01-01

    This paper will first present the purpose and goals of applying functional modelling approach to consequence analysis by adopting Multilevel Flow Modelling (MFM). MFM Models describe a complex system in multiple abstraction levels in both means-end dimension and whole-part dimension. It contains...... consequence analysis to practical or online applications in supervision systems. It will also suggest a multiagent solution as the integration architecture for developing tools to facilitate the utilization results of functional consequence analysis. Finally a prototype of the multiagent reasoning system...... causal relations between functions and goals. A rule base system can be developed to trace the causal relations and perform consequence propagations. This paper will illustrate how to use MFM for consequence reasoning by using rule base technology and describe the challenges for integrating functional...

  12. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, together with the results of the testing executed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  13. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    Science.gov (United States)

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  14. Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology

    Directory of Open Access Journals (Sweden)

    Parsuram Nayak

    2018-01-01

    Full Text Available There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. Advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, also have an important place in the analysis of disease dynamics. Disease forecasting by simulation models for plant diseases has great potential in practical disease control strategies. Common mathematical models such as the monomolecular, exponential, logistic, Gompertz and linked differential equations are central to growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through the construction of box and whisker plots has been suggested. Probable applications of recently advanced linear and non-linear mixed models, such as the linear mixed model, generalized linear model, and generalized linear mixed models, have been presented. The most recent technologies, such as microarray analysis, though cost effective, provide estimates of gene expression for thousands of genes simultaneously and need attention by molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. The rice research scientists should take advantage of these new opportunities adequately in
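
    As a worked illustration of the growth curve models mentioned above, the sketch below fits a logistic disease progress curve to hypothetical severity data; the data points and parameter values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical disease severity (proportion of tissue affected) over time.
    t = np.array([0, 7, 14, 21, 28, 35], dtype=float)   # days after onset
    y = np.array([0.02, 0.05, 0.15, 0.40, 0.70, 0.88])  # severity

    def logistic(t, K, r, t0):
        # Logistic disease progress curve: y(t) = K / (1 + exp(-r (t - t0))).
        return K / (1.0 + np.exp(-r * (t - t0)))

    (K, r, t0), _ = curve_fit(logistic, t, y, p0=[1.0, 0.2, 20.0])
    print(f"upper asymptote K = {K:.2f}, rate r = {r:.3f}/day, inflection at {t0:.1f} days")
    ```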

  15. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  16. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  17. Analysis tools for discovering strong parity violation at hadron colliders

    Science.gov (United States)

    Backović, Mihailo; Ralston, John P.

    2011-07-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.

  18. Analysis tools for discovering strong parity violation at hadron colliders

    International Nuclear Information System (INIS)

    Backovic, Mihailo; Ralston, John P.

    2011-01-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or 'azimuthal flow'. Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single 'reaction plane'. Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of 'event-shape sorting' to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.

  19. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  20. Applied linear algebra and matrix analysis

    CERN Document Server

    Shores, Thomas S

    2018-01-01

    In its second edition, this textbook offers a fresh approach to matrix and linear algebra. Its blend of theory, computational exercises, and analytical writing projects is designed to highlight the interplay between these aspects of an application. This approach places special emphasis on linear algebra as an experimental science that provides tools for solving concrete problems. The second edition’s revised text discusses applications of linear algebra like graph theory and network modeling methods used in Google’s PageRank algorithm. Other new materials include modeling examples of diffusive processes, linear programming, image processing, digital signal processing, and Fourier analysis. These topics are woven into the core material of Gaussian elimination and other matrix operations; eigenvalues, eigenvectors, and discrete dynamical systems; and the geometrical aspects of vector spaces. Intended for a one-semester undergraduate course without a strict calculus prerequisite, Applied Linear Algebra and M...

  1. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    Science.gov (United States)

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for

  2. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    Directory of Open Access Journals (Sweden)

    Carlos Alejandro Robles-Rubio

    Full Text Available Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential

  3. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  4. Applying HAZOP analysis in assessing remote handling compatibility of ITER port plugs

    International Nuclear Information System (INIS)

    Duisings, L.P.M.; Til, S. van; Magielsen, A.J.; Ronden, D.M.S.; Elzendoorn, B.S.Q.; Heemskerk, C.J.M.

    2013-01-01

    Highlights: ► We applied HAZOP analysis to assess the criticality of remote handling maintenance activities on port plugs in the ITER Hot Cell facility. ► We identified several weak points in the general upper port plug maintenance concept. ► We made clear recommendations on redesign in port plug design, operational sequence and Hot Cell equipment. ► The use of a HAZOP approach for the ECH UL port can also be applied to ITER port plugs in general. -- Abstract: This paper describes the application of a Hazard and Operability Analysis (HAZOP) methodology in assessing the criticality of remote handling maintenance activities on port plugs in the ITER Hot Cell facility. As part of the ECHUL consortium, the remote handling team at the DIFFER Institute is developing maintenance tools and procedures for critical components of the ECH Upper Launcher (UL). Based on NRG's experience with nuclear risk analysis and Hot Cell procedures, early versions of these tool concepts and maintenance procedures were subjected to a HAZOP analysis. The analysis identified several weak points in the general upper port plug maintenance concept and led to clear recommendations on redesigns in port plug design, the operational sequence and ITER Hot Cell equipment. The paper describes the HAZOP methodology and illustrates its application with specific procedures: the replacement of the Steering Mirror Assembly (SMA) and the exchange of the Mid Shield Optics (MSO) in the ECH UL. A selection of recommended changes to the launcher design, associated with the accessibility, maintainability and manageability of replaceable components, is presented.

  5. RankProdIt: A web-interactive Rank Products analysis tool

    Directory of Open Access Journals (Sweden)

    Laing Emma

    2010-08-01

    Full Text Available Abstract Background The first objective of a DNA microarray experiment is typically to generate a list of genes or probes that are found to be differentially expressed or represented (in the case of comparative genomic hybridizations and/or copy number variation) between two conditions or strains. Rank Products analysis comprises a robust algorithm for deriving such lists from microarray experiments that comprise small numbers of replicates, for example, less than the number required for the commonly used t-test. Currently, users wishing to apply Rank Products analysis to their own microarray data sets have been restricted to the use of command line-based software, which can limit its usage within the biological community. Findings Here we have developed a web interface to existing Rank Products analysis tools allowing users to quickly process their data in an intuitive and step-wise manner to obtain the respective Rank Product or Rank Sum, probability of false prediction and p-values in a downloadable file. Conclusions The online interactive Rank Products analysis tool RankProdIt, for analysis of any data set containing measurements for multiple replicated conditions, is available at: http://strep-microarray.sbs.surrey.ac.uk/RankProducts
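
    The core computation is easy to sketch: each replicate is ranked separately and the ranks are combined as a geometric mean, so consistently extreme genes float to the top. The sketch below uses simulated data and omits the permutation step that the full method uses to estimate the probability of false prediction.

    ```python
    import numpy as np
    from scipy.stats import rankdata

    # Simulated log-ratios: 1000 genes x 3 replicates, with 5 spiked "up" genes.
    rng = np.random.default_rng(1)
    expr = rng.normal(size=(1000, 3))
    expr[:5] += 2.0

    # Rank each replicate separately (rank 1 = most up-regulated) ...
    ranks = np.column_stack([rankdata(-expr[:, j]) for j in range(expr.shape[1])])
    # ... and combine as the geometric mean of ranks: the Rank Product.
    rank_product = np.exp(np.log(ranks).mean(axis=1))

    print("top candidates (gene indices):", np.argsort(rank_product)[:5])
    ```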

  6. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  7. A Development of the Calibration Tool Applied on Analog I/O Modules for Safety-related Controller

    International Nuclear Information System (INIS)

    Kim, Jong-Kyun; Yun, Dong-Hwa; Lee, Myeong-Kyun; Yoo, Kwan-Woo

    2016-01-01

    The purpose of this paper is to develop a calibration tool for analog input/output (I/O) modules. These modules are components of POSAFE-Q, a programmable logic controller (PLC) that has been developed for safety-related evaluation. In this paper, a performance improvement of analog I/O modules is presented, achieved by developing and applying the calibration tool to each channel in the analog I/O modules. With this tool, the input signal to an analog input module and the output signal from an analog output module can be made to satisfy the reference value of the sensor type and the required accuracy of all modules. Using RS-232 communication, the manual calibration tool was developed for analog I/O modules of the existing and up-to-date versions of the POSAFE-Q PLC. As a result of applying this tool, the converted values meet the requirements for each type of input sensor and the accuracy of the analog I/O modules

  8. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  9. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  10. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool, a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system’s...... of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur........ The list of such variables and functional relations constitutes the system’s structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes...

  11. Development of a Method for Tool Wear Analysis Using 3D Scanning

    Directory of Open Access Journals (Sweden)

    Hawryluk Marek

    2017-12-01

    Full Text Available The paper deals with the evaluation of a 3D scanning method elaborated by the authors, by applying it to the analysis of the wear of forging tools. The 3D scanning method consists, in the first place, in the application of scanning to the analysis of changes in the geometry of a forging tool by comparing the images of a worn tool with a CAD model or an image of a new tool. The method was evaluated in the context of the important measurement problems resulting from the extreme conditions present in industrial hot forging processes. The method was used to evaluate the wear of tools with an increasing wear degree, which made it possible to determine the wear characteristics as a function of the number of produced forgings. The following stage was its use for direct control of the quality and geometry changes of forging tools (without their disassembly) by way of a direct measurement of the geometry of periodically collected forgings (an indirect method based on forgings). The final part of the study points to the advantages and disadvantages of the elaborated method as well as the potential directions of its further development.

  12. Reliability of Lactation Assessment Tools Applied to Overweight and Obese Women.

    Science.gov (United States)

    Chapman, Donna J; Doughty, Katherine; Mullin, Elizabeth M; Pérez-Escamilla, Rafael

    2016-05-01

    The interrater reliability of lactation assessment tools has not been evaluated in overweight/obese women. This study aimed to compare the interrater reliability of 4 lactation assessment tools in this population. A convenience sample of 45 women (body mass index > 27.0) was videotaped while breastfeeding (twice daily on days 2, 4, and 7 postpartum). Three International Board Certified Lactation Consultants independently rated each videotaped session using 4 tools (Infant Breastfeeding Assessment Tool [IBFAT], modified LATCH [mLATCH], modified Via Christi [mVC], and Riordan's Tool [RT]). For each day and tool, we evaluated interrater reliability with 1-way repeated-measures analyses of variance, intraclass correlation coefficients (ICCs), and percentage absolute agreement between raters. Analyses of variance showed significant differences between raters' scores on day 2 (all scales) and day 7 (RT). Intraclass correlation coefficient values reflected good (mLATCH) to excellent reliability (IBFAT, mVC, and RT) on days 2 and 7. All day 4 ICCs reflected good reliability. The ICC for mLATCH was significantly lower than all others on day 2 and was significantly lower than IBFAT (day 7). Percentage absolute interrater agreement for scale components ranged from 31% (day 2: observable swallowing, RT) to 92% (day 7: IBFAT, fixing; and mVC, latch time). Swallowing scores on all scales had the lowest levels of interrater agreement (31%-64%). We demonstrated differences in the interrater reliability of 4 lactation assessment tools when applied to overweight/obese women, with the lowest values observed on day 4. Swallowing assessment was particularly unreliable. Researchers and clinicians using these scales should be aware of the differences in their psychometric behavior. © The Author(s) 2015.
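
    For readers unfamiliar with the reliability statistic used here, the sketch below computes a two-way random-effects, single-rater intraclass correlation coefficient (Shrout-Fleiss ICC(2,1)) from scratch; the rating matrix is a hypothetical stand-in for the videotaped-session scores.

    ```python
    import numpy as np

    # Hypothetical ratings: 5 subjects (rows) scored by 3 raters (columns).
    ratings = np.array([[7, 8, 7],
                        [5, 5, 6],
                        [9, 9, 8],
                        [4, 5, 4],
                        [8, 7, 8]], dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()

    MSR = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    MSC = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    SST = ((ratings - grand) ** 2).sum()
    MSE = (SST - (n - 1) * MSR - (k - 1) * MSC) / ((n - 1) * (k - 1))

    # Shrout-Fleiss ICC(2,1): two-way random effects, absolute agreement.
    icc21 = (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
    print(f"ICC(2,1) = {icc21:.3f}")
    ```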

  13. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE produced by 24 children with phonological delay or phonological disorder was recorded, transcribed, and then inserted into the APAT. Reliability and validity of APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis, which allowed the analysis of different corpora.

  14. Structural analysis of ITER sub-assembly tools

    International Nuclear Information System (INIS)

    Nam, K.O.; Park, H.K.; Kim, D.J.; Ahn, H.J.; Lee, J.H.; Kim, K.K.; Im, K.; Shaw, R.

    2011-01-01

    The ITER Tokamak assembly tools are purpose-built tools for assembling the ITER Tokamak machine, which includes the cryostat and the components contained therein. The sector sub-assembly tools described in this paper are the main assembly tools for assembling the vacuum vessel, thermal shield and toroidal field coils into a complete 40° sector. The 40° sector sub-assembly tools comprise the sector sub-assembly tool itself, including the radial beam, the vacuum vessel supports and the mid-plane brace tools. These tools must have sufficient strength to transport and handle the heavy components of the ITER Tokamak machine, whose weight reaches several hundred tons. Therefore these tools should be designed and analyzed to confirm both their strength and structural stability, even under conservative assumptions. To verify the structural stability of the sector sub-assembly tools in terms of strength and deflection, the ANSYS code was used for linear static analysis. The results of the analysis show that these tools are designed with sufficient strength and stiffness. The conceptual designs of these tools are also briefly described in this paper.

  15. The RUBA Watchdog Video Analysis Tool

    DEFF Research Database (Denmark)

    Bahnsen, Chris Holmberg; Madsen, Tanja Kidholm Osmann; Jensen, Morten Bornø

    We have developed a watchdog video analysis tool called RUBA (Road User Behaviour Analysis) to use for processing of traffic video. This report provides an overview of the functions of RUBA and gives a brief introduction into how analyses can be made in RUBA.

  16. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents 3 different tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load in multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle tracing code for modeling transport of dust particulates and molecules. It uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  17. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  18. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for the morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
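
    The core idea, flagging abrupt slope breaks along a stream's longitudinal profile, can be sketched in a few lines. This is an illustration of the concept rather than the published algorithm; the synthetic profile and threshold rule are assumptions.

    ```python
    import numpy as np

    # Synthetic longitudinal profile: distance downstream (m) vs. elevation (m),
    # with a deliberately steeper reach starting at 3000 m.
    dist = np.linspace(0.0, 5000.0, 51)
    elev = 800.0 - 0.05 * dist
    elev[dist > 3000] -= 0.10 * (dist[dist > 3000] - 3000.0)

    slope = np.gradient(elev, dist)            # local channel slope
    dslope = np.abs(np.gradient(slope, dist))  # rate of change of slope

    # Flag abrupt slope breaks as candidate knickpoints.
    threshold = dslope.mean() + 3.0 * dslope.std()
    print("candidate knickpoints near (m):", dist[dslope > threshold])
    ```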

  19. Applied Behavior Analysis

    Science.gov (United States)

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  20. Enrichr: interactive and collaborative HTML5 gene list enrichment analysis tool.

    Science.gov (United States)

    Chen, Edward Y; Tan, Christopher M; Kou, Yan; Duan, Qiaonan; Wang, Zichen; Meirelles, Gabriela Vaz; Clark, Neil R; Ma'ayan, Avi

    2013-04-15

    System-wide profiling of genes and proteins in mammalian cells produces lists of differentially expressed genes/proteins that need to be further analyzed for their collective functions in order to extract new knowledge. Once unbiased lists of genes or proteins are generated from such experiments, these lists are used as input for computing enrichment with existing lists created from prior knowledge organized into gene-set libraries. While many enrichment analysis tools and gene-set library databases have been developed, there is still room for improvement. Here, we present Enrichr, an integrative web-based and mobile software application that includes new gene-set libraries, an alternative approach to rank enriched terms, and various interactive visualization approaches to display enrichment results using the JavaScript library Data Driven Documents (D3). The software can also be embedded into any tool that performs gene list analysis. We applied Enrichr to analyze nine cancer cell lines by comparing their enrichment signatures to the enrichment signatures of matched normal tissues. We observed a common pattern of upregulation of the Polycomb group PRC2 and enrichment for the histone mark H3K27me3 in many cancer cell lines, as well as alterations in Toll-like receptor and interleukin signaling in K562 cells when compared with normal myeloid CD33+ cells. Such analyses provide global visualization of critical differences between normal tissues and cancer cell lines but can be applied to many other scenarios. Enrichr is an easy to use, intuitive enrichment analysis web-based tool providing various types of visualization summaries of the collective functions of gene lists. Enrichr is open source and freely available online at: http://amp.pharm.mssm.edu/Enrichr.
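
    At the heart of any such tool is a per-term enrichment test. The sketch below applies Fisher's exact test to one hypothetical 2x2 contingency table; note that Enrichr itself also ranks terms with a combined score rather than the p-value alone, and all counts here are made up.

    ```python
    from scipy.stats import fisher_exact

    # 2x2 table for one gene-set library term (all counts hypothetical):
    #                      in term   not in term
    # user's gene list          40           160
    # background genes         300         19500
    odds_ratio, p_value = fisher_exact([[40, 160], [300, 19500]],
                                       alternative="greater")
    print(f"odds ratio = {odds_ratio:.1f}, enrichment p = {p_value:.2e}")
    ```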

  1. Estimation of the Tool Condition by Applying the Wavelet Transform to Acoustic Emission Signals

    International Nuclear Information System (INIS)

    Gomez, M. P.; Piotrkowski, R.; Ruzzante, J. E.; D'Attellis, C. E.

    2007-01-01

    This work continues the search for parameters with which to evaluate the tool condition in machining processes. The selected sensing technique is acoustic emission, and it is applied to a turning process on steel samples. The obtained signals are studied using the wavelet transform. The tool wear level is quantified as a percentage of the final wear specified by the Standard ISO 3685. The amplitude and the relevant scale obtained from the acoustic emission signals could be related to the wear level
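
    A minimal sketch of this kind of feature extraction, assuming the PyWavelets package and a simulated burst in place of a real acoustic emission record:

    ```python
    import numpy as np
    import pywt  # PyWavelets

    # Simulated acoustic emission burst standing in for a real turning record.
    fs = 1_000_000                                   # 1 MHz sampling
    t = np.arange(4096) / fs
    signal = np.exp(-2e4 * t) * np.sin(2 * np.pi * 150e3 * t)
    signal += 0.05 * np.random.default_rng(2).normal(size=t.size)

    # Multilevel discrete wavelet decomposition; the energy per scale is a
    # candidate feature to correlate with the ISO 3685 wear percentage.
    coeffs = pywt.wavedec(signal, "db4", level=5)
    energies = [float(np.sum(c ** 2)) for c in coeffs]
    print("energy per band (approximation first):", energies)
    ```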

  2. Measuring the Effects of Trade Liberalization: Multilevel Analysis Tool for Agriculture

    OpenAIRE

    Gerard, Francoise; Marty, Isabelle; Lancon, Frederic; Versapuech, Marion

    1998-01-01

    This book is the product of the projects "Farmers' Strategies Regarding Agricultural Diversification" (1993-1995) and "Agricultural Diversification and Food Crop Trade: Their Implications to Agricultural Policies in Southeast Asia" (1994-1996), both supported by CIRAD and the Government of France. It describes the first project attempting to apply the MATA methodology in a country like Indonesia. As constructed, the Multilevel Analysis Tool for Agriculture is able to answer various policy que...

  3. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM(TM)) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team involving PAR representatives and Tandem consultants was established to carry out this four month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near and far reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. The final insights and results were used to populate the final strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)
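
    The essence of such a stochastic valuation can be sketched as a Monte Carlo simulation; the distributions, parameter values and single-period value formula below are illustrative assumptions only, not the model described in the case history.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Stochastic drivers of a single strategy (all distributions hypothetical).
    volume = rng.lognormal(mean=np.log(1.0e6), sigma=0.3, size=n)  # bbl recovered
    price = rng.normal(60.0, 10.0, size=n)                         # $/bbl netback
    capex = rng.triangular(20e6, 25e6, 35e6, size=n)               # $ invested

    value = volume * price - capex   # one-period value, ignoring discounting
    print(f"mean value = {value.mean() / 1e6:6.1f} M$")
    print(f"P10 / P90  = {np.percentile(value, 10) / 1e6:6.1f} "
          f"/ {np.percentile(value, 90) / 1e6:6.1f} M$")
    ```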

  4. A Method to Optimize Geometric Errors of Machine Tool based on SNR Quality Loss Function and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Cai Ligang

    2017-01-01

    Full Text Available Rather than blindly improving machine tool accuracy by increasing the precision of key components during production, this work adopts a method that combines an SNR quality loss function with correlation analysis of machine tool geometric errors to optimize the geometric errors of a five-axis machine tool. Firstly, the homogeneous transformation matrix method is used to build the geometric error model of the five-axis machine tool. Secondly, the SNR quality loss function is used for cost modeling. Then, the machine tool accuracy optimization objective function is established based on the correlation analysis. Finally, ISIGHT combined with MATLAB is applied to optimize each error. The results show that this method is reasonable and appropriate for relaxing the range of tolerance values, so as to reduce the manufacturing cost of machine tools.
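
    A minimal sketch of the homogeneous-transformation-matrix error model for a chain of two axes, assuming small-angle approximations and made-up error values:

    ```python
    import numpy as np

    def htm(dx, dy, dz, ex, ey, ez):
        # 4x4 homogeneous transform for small positional errors (dx, dy, dz, m)
        # and small angular errors (ex, ey, ez, rad), small-angle approximation.
        return np.array([[1.0, -ez,  ey,  dx],
                         [ ez, 1.0, -ex,  dy],
                         [-ey,  ex, 1.0,  dz],
                         [0.0, 0.0, 0.0, 1.0]])

    # Chain the error transforms of two axes (made-up error values) and see
    # how far the nominal tool tip is displaced.
    T = htm(5e-6, 2e-6, 0.0, 1e-5, 0.0, 2e-5) @ htm(0.0, 3e-6, 1e-6, 0.0, 1e-5, 0.0)
    tip = np.array([0.0, 0.0, 0.2, 1.0])  # nominal tool tip, metres
    print("volumetric error at tip (m):", (T @ tip - tip)[:3])
    ```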

  5. Applied analysis and differential equations

    CERN Document Server

    Cârj, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics, including the most exciting recent developments. A broad range of topics of recent interest are treated: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  6. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different possible network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
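
    As a hedged illustration of the centrality metrics such books discuss, the sketch below builds a small hypothetical passing network with the networkx package and computes two common prominence measures; the players and pass counts are invented.

    ```python
    import networkx as nx

    # Hypothetical passing network: (player, player, number of passes).
    passes = [("GK", "DF1", 12), ("DF1", "MF1", 18), ("MF1", "FW1", 9),
              ("MF1", "MF2", 22), ("MF2", "FW1", 14), ("DF1", "MF2", 7)]
    G = nx.Graph()
    G.add_weighted_edges_from(passes)

    # Degree centrality: share of teammates a player exchanges passes with.
    print("degree centrality:", nx.degree_centrality(G))
    # Betweenness (unweighted here; pass counts would need inverting to act
    # as distances) highlights players who bridge the build-up play.
    print("betweenness:", nx.betweenness_centrality(G))
    ```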

  7. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to address such scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
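
    The classical MFA estimate underlying such toolboxes can be sketched generically (this is not the PFA Toolbox API): at steady state the stoichiometric matrix S satisfies S v = 0, so unknown fluxes can be solved from the measured ones by least squares. The toy network and measured values below are assumptions.

    ```python
    import numpy as np

    # Toy network at steady state: S v = 0, fluxes v1..v4 as columns.
    S = np.array([[1, -1,  0,  0],   # metabolite A: v1 in, v2 out
                  [0,  1, -1, -1]])  # metabolite B: v2 in, v3 and v4 out
    measured = {0: 10.0, 3: 4.0}     # v1 and v4 measured (index: value)

    unknown = [j for j in range(S.shape[1]) if j not in measured]
    rhs = -S[:, list(measured)] @ np.array(list(measured.values()))
    v_unknown, *_ = np.linalg.lstsq(S[:, unknown], rhs, rcond=None)
    print(dict(zip(unknown, np.round(v_unknown, 3))))  # expect v2 = 10, v3 = 6
    ```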

  8. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    Science.gov (United States)

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools and reliable support for decision making processes. In this paper some indicators, such as number of residents, population age, urban life expectancy and total municipal solid waste, were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition, considering Iasi, Romania as a case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated, and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction. Copyright © 2016 Elsevier Ltd. All rights reserved.
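
    An S-curve trend of the kind selected here can be fitted in a few lines; the yearly tonnages below are invented for illustration, not the Iasi data, and the Gompertz form is one common choice of S-curve.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Invented yearly MSW totals (kt) for a growing city approaching saturation.
    years = np.arange(2006, 2016)
    msw = np.array([61, 64, 68, 73, 79, 84, 88, 91, 93, 95], dtype=float)

    def gompertz(t, a, b, c):
        # Gompertz S-curve: saturation a, displacement b, growth rate c.
        return a * np.exp(-b * np.exp(-c * t))

    t = years - years[0]
    (a, b, c), _ = curve_fit(gompertz, t, msw, p0=[100.0, 0.5, 0.3])
    print(f"saturation level ~ {a:.0f} kt; forecast 2020 ~ {gompertz(14, a, b, c):.0f} kt")
    ```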

  9. New QC 7 tools

    International Nuclear Information System (INIS)

    1982-03-01

    This book describes the new seven QC tools and their relation to TQC: what they are for and how to think with the QC method; the individual tools, including the KJ (affinity diagram) method, the PDPC (process decision program chart) method, the arrow diagram method, the matrix diagram method, and matrix data analysis; the fields in which the new QC seven tools can be applied, including their application to policy management; the method of the new QC seven tools, including related regulations; and education in and introduction of the new QC seven tools.

  10. Analysis of logging data from nuclear borehole tools

    International Nuclear Information System (INIS)

    Hovgaard, J.; Oelgaard, P.L.

    1989-12-01

    The processing procedure for logging data from a borehole of the Stenlille project of Dansk Naturgas A/S has been analysed. The tools considered in the analysis were an integral natural-gamma tool, a neutron porosity tool, a gamma density tool and a caliper tool. It is believed that in most cases the processing procedure used by the logging company in the interpretation of the raw data is fully understood. An exception is the epithermal part of the neutron porosity tool, where not all the data needed for an interpretation were available. The analysis has shown that some parts of the interpretation procedure may not be consistent with the physical principles of the tools. (author)

  11. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor.... In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type

  12. The CANDU alarm analysis tool (CAAT)

    Energy Technology Data Exchange (ETDEWEB)

    Davey, E C; Feher, M P; Lupton, L R [Control Centre Technology Branch, ON (Canada)

    1997-09-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers, based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs.

  13. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers, based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  14. Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care.

    Science.gov (United States)

    Bowie, Paul; McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan

    2016-01-01

    Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested "guiding tools" based on human factors principles. Mixed-methods development of guiding tools (a Personal Booklet, to help with emotional demands and apply a human factors analysis at the individual level; a Desk Pad, to guide a team-based systems analysis; and a written Report Format) by a multiprofessional "expert" group, with testing by Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through questionnaires, telephone interviews, and thematic analysis of SEA reports. Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention. Many reported that the tools helped them consider systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement.

  15. Affordances of agricultural systems analysis tools

    NARCIS (Netherlands)

    Ditzler, Lenora; Klerkx, Laurens; Chan-Dentoni, Jacqueline; Posthumus, Helena; Krupnik, Timothy J.; Ridaura, Santiago López; Andersson, Jens A.; Baudron, Frédéric; Groot, Jeroen C.J.

    2018-01-01

    The increasingly complex challenges facing agricultural systems require problem-solving processes and systems analysis (SA) tools that engage multiple actors across disciplines. In this article, we employ the theory of affordances to unravel what tools may furnish users, and how those affordances

  16. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  17. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information. The filtering can be made more informative by merging previous data with data obtained from public databases, resulting in an accurate list of proteins which meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  18. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...
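
    Survival curve estimation in the presence of censoring, one of the book's core topics, can be sketched compactly. The book itself works in R; the analogous minimal example below uses the Python lifelines package with made-up follow-up times:

    ```python
    from lifelines import KaplanMeierFitter

    # Made-up follow-up times (months) and event indicators (1 = event, 0 = censored).
    durations = [6, 13, 13, 25, 32, 32, 37, 42, 42, 46]
    observed = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]

    kmf = KaplanMeierFitter()
    kmf.fit(durations, event_observed=observed, label="all subjects")

    print(kmf.survival_function_)     # Kaplan-Meier estimate of S(t)
    print(kmf.median_survival_time_)  # median survival time, if reached
    ```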

  19. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool. It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
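
    The cascading-outage logic that such a tool automates can be conveyed with a deliberately simplified, self-contained toy model. The sketch below (illustrative numbers only) trips an initial line, redistributes part of its flow onto survivors, and lets any newly overloaded line trip in turn; the real DCAT replaces this crude redistribution with full dynamic and steady-state power system simulations through PSS/E:

    ```python
    # Toy cascading-overload loop: a tripped line shifts flow onto the
    # remaining lines; any line pushed past its limit trips next.
    def cascade(flows, limits, initial_outage, shift=0.5):
        """flows/limits: dicts keyed by line name (MW); shift: fraction of a
        tripped line's flow spread equally over the surviving lines."""
        flows = dict(flows)
        tripped, queue = [], [initial_outage]
        while queue:
            line = queue.pop(0)
            share = shift * flows.pop(line) / max(len(flows), 1)
            tripped.append(line)
            for other in flows:
                flows[other] += share
            queue.extend(l for l in flows if flows[l] > limits[l] and l not in queue)
        return tripped

    flows = {"L1": 80.0, "L2": 70.0, "L3": 40.0}
    limits = {"L1": 100.0, "L2": 85.0, "L3": 60.0}
    print(cascade(flows, limits, initial_outage="L1"))  # ['L1', 'L2', 'L3']
    ```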

  20. Analysis of the concept of nursing educational technology applied to the patient

    Directory of Open Access Journals (Sweden)

    Aline Cruz Esmeraldo Áfio

    2014-04-01

    This study aims to analyze the concept of educational technology, as produced by nursing and applied to the patient. Rodgers' Evolutionary Method of Concept Analysis was used, identifying antecedents, attributes, and consequences. Thirteen articles were selected for analysis, in which the following antecedents were identified: knowledge deficiency, shortage of nursing professionals' time, the need to optimize nursing work, and the need to achieve patients' goals. Attributes: tool, strategy, innovative approach, pedagogical approach, mediator of knowledge, creative way to encourage the acquisition of skills, health production instrument. Consequences: improved quality of life, encouragement of healthy behavior, empowerment, reflection, and bonding. The study emphasizes the importance of educational technologies for nursing care and for boosting health education activities.

  1. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance of complex workflows running over a large number of nodes with multiple parallel task executions, where the workflow data or the measurement data of the executions can reach terabytes or petabytes. To help identify performance bottlenecks and debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods to the performance data. It utilizes an efficient data processing engine that allows users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from a genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and big data workflows.
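
    As a small flavor of the feature-extraction step such a framework performs, the sketch below derives per-task durations from hypothetical job-log records using Python/pandas (the real framework uses distributed big data engines and far richer log schemas):

    ```python
    import pandas as pd

    # Hypothetical job-log records: one start and one end event per task.
    log = pd.DataFrame({
        "task": ["t1", "t1", "t2", "t2", "t3", "t3"],
        "event": ["start", "end"] * 3,
        "time": pd.to_datetime([
            "2016-09-17 10:00:00", "2016-09-17 10:12:30",
            "2016-09-17 10:01:00", "2016-09-17 10:45:10",
            "2016-09-17 10:02:00", "2016-09-17 10:03:05",
        ]),
    })

    wide = log.pivot(index="task", columns="event", values="time")
    wide["duration_s"] = (wide["end"] - wide["start"]).dt.total_seconds()

    # Simple performance features: the distribution of task durations,
    # and the longest-running tasks as bottleneck candidates.
    print(wide["duration_s"].describe())
    print(wide.nlargest(2, "duration_s"))
    ```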

  2. AQME: A forensic mitochondrial DNA analysis tool for next-generation sequencing data.

    Science.gov (United States)

    Sturk-Andreaggi, Kimberly; Peck, Michelle A; Boysen, Cecilie; Dekker, Patrick; McMahon, Timothy P; Marshall, Charla K

    2017-11-01

    The feasibility of generating mitochondrial DNA (mtDNA) data has expanded considerably with the advent of next-generation sequencing (NGS), specifically in the generation of entire mtDNA genome (mitogenome) sequences. However, the analysis of these data has emerged as the greatest challenge to implementation in forensics. To address this need, a custom toolkit for use in the CLC Genomics Workbench (QIAGEN, Hilden, Germany) was developed through a collaborative effort between the Armed Forces Medical Examiner System - Armed Forces DNA Identification Laboratory (AFMES-AFDIL) and QIAGEN Bioinformatics. The AFDIL-QIAGEN mtDNA Expert, or AQME, generates an editable mtDNA profile that employs forensic conventions and includes the interpretation range required for mtDNA data reporting. AQME also integrates an mtDNA haplogroup estimate into the analysis workflow, which provides the analyst with phylogenetic nomenclature guidance and a profile quality check without the use of an external tool. Supplemental AQME outputs such as nucleotide-per-position metrics, configurable export files, and an audit trail are produced to assist the analyst during review. AQME is applied to standard CLC outputs and thus can be incorporated into any mtDNA bioinformatics pipeline within CLC regardless of sample type, library preparation or NGS platform. An evaluation of AQME was performed to demonstrate its functionality and reliability for the analysis of mitogenome NGS data. The study analyzed Illumina mitogenome data from 21 samples (including associated controls) of varying quality and sample preparations with the AQME toolkit. A total of 211 tool edits were automatically applied to 130 of the 698 total variants reported in an effort to adhere to forensic nomenclature. Although additional manual edits were required for three samples, supplemental tools such as mtDNA haplogroup estimation assisted in identifying and guiding these necessary modifications to the AQME-generated profile. Along

  3. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    Science.gov (United States)

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery place ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements (one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis) are reviewed, and their execution in a combined data flow/visualization environment is outlined. © 2016 Society for Laboratory Automation and Screening.

  4. Applied Meteorology Unit (AMU)

    Science.gov (United States)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark

    2010-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the first quarter of Fiscal Year 2010 (October - December 2009). A detailed project schedule is included in the Appendix. Included tasks are: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Upgrade Summer Severe Weather Tool in Meteorological Interactive Data Display System (MIDDS), (5) Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) Update and Maintainability, (6) Verify 12-km resolution North American Model (MesoNAM) Performance, and (7) Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) Graphical User Interface.

  5. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools, and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
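
    The connector idea can be made concrete with a short sketch. The hypothetical Python interface below (all names invented, not from the paper) couples data access with an explicit transformation rule that maps source fields onto shared, ontology-aligned terms:

    ```python
    from abc import ABC, abstractmethod

    class Connector(ABC):
        """Hypothetical software connector: fetches records from one source
        and transforms them into a shared, ontology-aligned representation."""

        @abstractmethod
        def fetch(self, query: str) -> list[dict]: ...

        @abstractmethod
        def transform(self, record: dict) -> dict: ...

        def run(self, query: str) -> list[dict]:
            return [self.transform(r) for r in self.fetch(query)]

    class MicroarrayConnector(Connector):
        def fetch(self, query: str) -> list[dict]:
            # Stand-in for a real call to a data source.
            return [{"probe": "AFFX-1", "signal": 523.1}]

        def transform(self, record: dict) -> dict:
            # Transformation rule: map source fields onto shared ontology terms.
            return {"gene_expression:probe_id": record["probe"],
                    "gene_expression:intensity": record["signal"]}

    print(MicroarrayConnector().run("GSE-demo"))
    ```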

  6. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.

  7. The physics analysis tools project for the ATLAS experiment

    International Nuclear Information System (INIS)

    Lenzi, Bruno

    2012-01-01

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≅1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, and cover a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis tool-kits (frameworks) whose goal is to aid physicists in performing their analyses while hiding the details of the ATHENA framework. (authors)

  8. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned ... on applied CA, the application of basic CA's principles, methods, and findings to the study of social domains and practices that are interactionally constituted. We consider three strands (foundational, social problem oriented, and institutional applied CA) before turning to recent developments in CA research on learning and development. In conclusion, we address some emerging themes in the relationship of CA and applied linguistics, including the role of multilingualism, standard social science methods as research objects, CA's potential for direct social intervention, and increasing efforts to complement CA ...

  9. Digital Elevation Profile: A Complex Tool for the Spatial Analysis of Hiking

    Directory of Open Access Journals (Sweden)

    Laura TÎRLĂ

    2014-11-01

    One of the current tasks of mountain geomorphology is to provide information for tourism purposes, such as the spatial analysis of hiking trails, and geomorphic tools are therefore indispensable for terrain analysis. The elevation profile is one of the most suitable tools for assessing the morphometric patterns of hiking trails. In this study we tested several applications for managing raw data, creating profile graphs, and deriving the morphometric parameters of five hiking trails in the Căpățânii Mountains (South Carpathians, Romania). Different levels of data complexity were explored: distance, elevation, cumulative gain or loss, slope, etc. Furthermore, a comparative morphometric analysis was performed to emphasize the multiple possibilities provided by the elevation profile. Results show that GPS Visualizer, Geocontext, and to some extent Google Earth are the most suitable applications, providing high-quality elevation profiles and detailed data, with multiple additional functions according to users' needs. The applied tools and techniques are very useful for mountain route planning, elaborating mountain guides, enhancing knowledge about specific trails or routes, and assessing the landscape and tourism value of a mountain area.
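
    The core computation behind such profiles, cumulative distance plus elevation gain and loss along a track, is small enough to show in full. A self-contained Python example with made-up track points; the haversine formula supplies the along-track distance:

    ```python
    from math import radians, sin, cos, asin, sqrt

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two points."""
        R = 6371000.0
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * R * asin(sqrt(a))

    # Made-up trail points: (lat, lon, elevation in meters).
    track = [(45.30, 24.00, 1450.0), (45.31, 24.01, 1510.0),
             (45.32, 24.01, 1495.0), (45.33, 24.02, 1580.0)]

    dist = gain = loss = 0.0
    for (la1, lo1, e1), (la2, lo2, e2) in zip(track, track[1:]):
        dist += haversine_m(la1, lo1, la2, lo2)
        dz = e2 - e1
        gain += max(dz, 0.0)
        loss += max(-dz, 0.0)

    print(f"distance: {dist/1000:.2f} km, gain: {gain:.0f} m, loss: {loss:.0f} m")
    ```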

  10. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising the re-use and maintainability of each component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA
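
    The decoupling goal can be sketched with minimal abstract interfaces. The hypothetical Python rendering below is only illustrative (the actual AIDA interfaces were defined in Java and C++); the point is that the plotter depends only on the abstract histogram, so either side can be replaced independently:

    ```python
    from abc import ABC, abstractmethod

    class IHistogram1D(ABC):
        @abstractmethod
        def fill(self, x: float, weight: float = 1.0) -> None: ...
        @abstractmethod
        def entries(self) -> int: ...

    class IPlotter(ABC):
        # Depends only on the abstract histogram, not on any implementation.
        @abstractmethod
        def plot(self, hist: IHistogram1D) -> None: ...

    class ListHistogram(IHistogram1D):
        def __init__(self) -> None:
            self._values: list[float] = []
        def fill(self, x: float, weight: float = 1.0) -> None:
            self._values.append(x)
        def entries(self) -> int:
            return len(self._values)

    class TextPlotter(IPlotter):
        def plot(self, hist: IHistogram1D) -> None:
            print(f"histogram with {hist.entries()} entries")

    h = ListHistogram()
    for x in (1.0, 2.5, 2.7):
        h.fill(x)
    TextPlotter().plot(h)
    ```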

  11. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    The article describes the general situation of the power tool market, both in Russia and worldwide. It presents a comparative analysis of competitors, an analysis of the structure of the power tool market, and an assessment of the competitiveness of some major product lines. It also analyzes the promotion methods used by companies selling tools, with a competitive analysis of the product range of Bosch, the leader in its segment of the power tool market in Russia.

  12. Vehicle Technology Simulation and Analysis Tools

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and find cost-competitive solutions, including the ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and

  13. Applying reliability analysis to design electric power systems for More-electric aircraft

    Science.gov (United States)

    Zhang, Baozhu

    The More-Electric Aircraft (MEA) is a type of aircraft that replaces conventional hydraulic and pneumatic systems with electrically powered components. These changes have significantly challenged the aircraft electric power system design. This thesis investigates how reliability analysis can be applied to automatically generate system topologies for the MEA electric power system. We first use a traditional method of reliability block diagrams to analyze the reliability level on different system topologies. We next propose a new methodology in which system topologies, constrained by a set reliability level, are automatically generated. The path-set method is used for analysis. Finally, we interface these sets of system topologies with control synthesis tools to automatically create correct-by-construction control logic for the electric power system.
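
    The reliability-block-diagram arithmetic underlying such an analysis is compact: series blocks multiply component reliabilities, while parallel (redundant) blocks combine as one minus the product of failure probabilities. A self-contained Python sketch with made-up component reliabilities:

    ```python
    from math import prod

    def series(*r):
        """All components must work: R = r1 * r2 * ..."""
        return prod(r)

    def parallel(*r):
        """Any one component suffices: R = 1 - (1 - r1)(1 - r2)..."""
        return 1 - prod(1 - x for x in r)

    # Made-up example: two redundant generators feeding one distribution bus.
    r_gen, r_bus = 0.95, 0.999
    r_system = series(parallel(r_gen, r_gen), r_bus)
    print(f"system reliability: {r_system:.6f}")  # 0.996503
    ```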

  14. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

  15. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    In recent years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.

  16. HYDROLOGIC AND FEATURE-BASED SURFACE ANALYSIS FOR TOOL MARK INVESTIGATION ON ARCHAEOLOGICAL FINDS

    Directory of Open Access Journals (Sweden)

    K. Kovács

    2012-07-01

    The improvement of detailed surface documentation methods provides unique opportunities for tool-mark study in archaeological research. One of these data collection techniques is short-range laser scanning, which creates a digital copy of an object's morphological characteristics from high-resolution datasets. The aim of our work was the accurate documentation of a Bronze Age sluice box from Mitterberg, Austria, at a spatial resolution of 0.2 mm, together with the investigation of the entirely preserved tool marks on the surface of this archaeological find. The methodology of this tool-mark study can be summarized as follows. First, a local hydrologic analysis was applied to separate the various tool patterns on the find's surface; the XYZ coordinates of the special points representing the edge lines of the sliding tool marks were calculated by buffer operations in a GIS environment. Second, these edge points were used to manually clip the triangle meshes of these patterns in reverse engineering software. Finally, circle features were generated and analysed to determine the different sections along these sliding tool marks. In conclusion, the movement of the hand tool could be reproduced by spatial analysis of the created features, since the horizontal and vertical positions of the fitted circle centre points indicated the various phases of the movement. This research demonstrates an exact workflow for determining fine morphological structures on the surface of an archaeological find.
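
    The circle-feature step lends itself to a small worked example. Below is a self-contained least-squares circle fit (the algebraic Kasa method) in Python/NumPy on made-up cross-section points; the fitted centre and radius are the kinds of quantities used here to trace the tool's movement:

    ```python
    import numpy as np

    def fit_circle(x, y):
        """Algebraic (Kasa) least-squares circle fit.

        Solves x^2 + y^2 = 2a*x + 2b*y + c for (a, b, c);
        centre = (a, b), radius = sqrt(c + a^2 + b^2).
        """
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        rhs = x**2 + y**2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return (a, b), np.sqrt(c + a**2 + b**2)

    # Made-up points near a circle of radius 2 centred at (1, -0.5).
    theta = np.linspace(0.2, 2.0, 15)
    x = 1.0 + 2.0 * np.cos(theta) + np.random.normal(0, 0.01, theta.size)
    y = -0.5 + 2.0 * np.sin(theta) + np.random.normal(0, 0.01, theta.size)

    centre, radius = fit_circle(x, y)
    print(centre, radius)  # approximately (1.0, -0.5) and 2.0
    ```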

  17. A risk assessment tool applied to the study of shale gas resources

    Energy Technology Data Exchange (ETDEWEB)

    Veiguela, Miguel [Mining, Energy and Materials Engineering School, University of Oviedo (Spain); Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando [Environment Department, CIEMAT, Madrid (Spain); Roqueñi, Nieves [Mining, Energy and Materials Engineering School, University of Oviedo (Spain); Loredo, Jorge, E-mail: jloredo@uniovi.es [Mining, Energy and Materials Engineering School, University of Oviedo (Spain)

    2016-11-15

    The implementation of a risk assessment tool with the capacity to evaluate the risks to health, safety, and the environment (HSE) from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting the development and progress of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to the risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO{sub 2}) storage sites. Two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, have been evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain), with three different technological options used to test the approach. - Highlights: • The proposed methodology is a useful risk assessment tool for shale gas projects. • The tool is addressed to the early stages of decision making processes. • The risk assessment of a site is made through a qualitative estimation. • Different weights are assigned to each specific natural and technological property. • The uncertainty associated with the current knowledge is considered.

  18. A risk assessment tool applied to the study of shale gas resources

    International Nuclear Information System (INIS)

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-01-01

    The implementation of a risk assessment tool with the capacity to evaluate the risks to health, safety, and the environment (HSE) from the extraction of non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can be useful for boosting the development and progress of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to the risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO_2) storage sites. Two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, have been evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. In order to carry out an individual evaluation of each of the characteristics and the elements of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain), with three different technological options used to test the approach. - Highlights: • The proposed methodology is a useful risk assessment tool for shale gas projects. • The tool is addressed to the early stages of decision making processes. • The risk assessment of a site is made through a qualitative estimation. • Different weights are assigned to each specific natural and technological property. • The uncertainty associated with the current knowledge is considered.

  19. Applying Triz for Production Quality Improvement

    Directory of Open Access Journals (Sweden)

    Swee Nikalus Shu Luing

    2017-01-01

    This paper provides a thorough analysis of the application of TRIZ to improving quality in canned food production. TRIZ tools such as engineering systems analysis, function analysis, cause-and-effect chain analysis, the By-separation model, and the 40 Inventive Principles are applied in order to discover feasible and elegant solutions to the problem. Findings revealed that although the vision system is able to detect faulty printing on a canned product, a rejected can is difficult to isolate because it is picked up together with good cans: all cans are lined up on the conveyor belt very close to one another, with no gaps between them. This is the main root cause. Conversely, the cans on the conveyor belt are required to be very close to each other to avoid collisions that may damage them. The root cause is resolved by applying function analysis, the By-separation tool, and the Inventive Principles. It can therefore be concluded that TRIZ is a powerful tool for inventive problem solving.

  20. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions?' To help clinicians make proper judgments, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools, such as the decision matrix, ROC analysis, and Bayesian analysis. The article also explains the introductory concept of the ORAD program.
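
    The Bayesian part of such decision making reduces to a short calculation: combining pretest probability (prevalence) with test sensitivity and specificity to obtain a post-test probability. A minimal Python sketch with made-up numbers:

    ```python
    def post_test_probability(prevalence, sensitivity, specificity):
        """P(disease | positive test) via Bayes' theorem."""
        p_pos_given_d = sensitivity
        p_pos_given_nd = 1 - specificity
        p_pos = prevalence * p_pos_given_d + (1 - prevalence) * p_pos_given_nd
        return prevalence * p_pos_given_d / p_pos

    # Made-up example: 10% pretest probability of a lesion,
    # test with 90% sensitivity and 80% specificity.
    print(round(post_test_probability(0.10, 0.90, 0.80), 3))  # 0.333
    ```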

  1. First GIS analysis of modern stone tools used by wild chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa.

    Science.gov (United States)

    Benito-Calvo, Alfonso; Carvalho, Susana; Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio

    2015-01-01

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, precluding comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of use-wear maps over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested with GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, making it possible to discriminate between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record.

  2. Nutrition screening tools: an analysis of the evidence.

    Science.gov (United States)

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools received a grade II: the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST). The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  3. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  4. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  5. Concept analysis of culture applied to nursing.

    Science.gov (United States)

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  6. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    International Nuclear Information System (INIS)

    Wingle, W.L.; Poeter, E.P.; McKenna, S.A.

    1999-01-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis, and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging, and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data, and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes, and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry, and geophysics, to name a few disciplines. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)
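
    Of the interpolation methods listed, inverse-distance gridding is the simplest to demonstrate. The self-contained NumPy sketch below (made-up well data, power parameter 2) shows the idea; it is not UNCERT's own code:

    ```python
    import numpy as np

    def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
        """Inverse-distance-weighted estimate at each query point."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
        w = 1.0 / (d + eps) ** power
        return (w * z_known).sum(axis=1) / w.sum(axis=1)

    # Made-up head measurements at four wells: (x, y) -> z.
    xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    z = np.array([5.0, 6.5, 4.8, 7.2])

    grid = np.array([[2.0, 3.0], [5.0, 5.0], [9.0, 9.0]])
    print(idw(xy, z, grid))  # interpolated values at the three query points
    ```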

  7. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  8. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab®-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, to analyze single and multiple fault scenarios, and to automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are presented.

  9. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics promises to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of identified features. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality-control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modulated modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
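
    As an illustration of one of the building blocks named above, here is a generic principal component analysis of a small sample-by-feature intensity matrix using scikit-learn; this shows the underlying operation only, not the SECIMTools interface:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Made-up metabolomics matrix: 6 samples x 4 features (peak intensities).
    X = np.array([
        [10.2, 0.5, 3.3, 7.1],
        [ 9.8, 0.7, 3.1, 6.9],
        [10.1, 0.6, 3.4, 7.3],
        [ 4.2, 2.9, 8.8, 1.2],
        [ 4.5, 3.1, 9.1, 1.0],
        [ 4.1, 3.0, 8.7, 1.3],
    ])

    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)          # sample coordinates on PC1/PC2
    print(scores.round(2))                 # two groups separate on PC1
    print(pca.explained_variance_ratio_)   # variance captured per component
    ```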

  10. Post-Flight Data Analysis Tool

    Science.gov (United States)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  11. Applying CASE Tools for On-Board Software Development

    Science.gov (United States)

    Brammer, U.; Hönle, A.

    For many space projects, software development faces great pressure with respect to quality, costs, and schedule. One way to cope with these challenges is the application of CASE tools for the automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix), featuring UML, and ISG (BSSE), which provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  12. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  13. The Arabidopsis co-expression tool (act): a WWW-based tool and database for microarray-based gene expression analysis

    DEFF Research Database (Denmark)

    Jen, C. H.; Manfield, I. W.; Michalopoulos, D. W.

    2006-01-01

    We present a new WWW-based tool for plant gene analysis, the Arabidopsis Co-Expression Tool (ACT), based on a large Arabidopsis thaliana microarray data set obtained from the Nottingham Arabidopsis Stock Centre. The co-expression analysis tool allows users to identify genes whose expression is correlated, and the resulting lists can be examined using the novel clique finder tool to determine the sets of genes most likely to be regulated in a similar manner. In combination, these tools offer three levels of analysis: creation of correlation lists of co-expressed genes, refinement of these lists using two-dimensional scatter plots, and identification of co-regulated gene sets with the clique finder.

  14. A developmental screening tool for toddlers with multiple domains based on Rasch analysis.

    Science.gov (United States)

    Hwang, Ai-Wen; Chou, Yeh-Tai; Hsieh, Ching-Lin; Hsieh, Wu-Shiun; Liao, Hua-Fang; Wong, Alice May-Kuen

    2015-01-01

    Using multidomain developmental screening tools is a feasible way for pediatric health care professionals to identify children at risk of developmental problems in several domains simultaneously. The purpose of this study was to develop a Rasch-based tool for Multidimensional Screening in Child Development (MuSiC) for children aged 0-3 years. MuSiC was developed by constructing an item bank based on three commonly used screening tools and validating it against developmental status (at risk for delay or not) in five developmental domains. Parents of a convenience sample of 632 children (aged 3-35.5 months) with and without developmental delays responded to items from the three screening tools funded by health authorities in Taiwan. The item bank was determined by Rasch item fit for each of the five developmental domains (cognitive skills, language skills, gross motor skills, fine motor skills, and socioadaptive skills). Children's performance scores, in logits derived from the Rasch analysis, were validated against developmental status for each domain using areas under receiver operating characteristic curves. MuSiC, a 75-item developmental screening tool covering five domains, was derived. Diagnostic validity was acceptable for all five domains at all stages of development except the infant stage (≤11 months and 15 days). MuSiC can be applied in well-child care visits as a universal screening tool for children aged 1-3 years across multiple domains. Items with sound validity for infants need to be further developed. Copyright © 2014. Published by Elsevier B.V.
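
    The dichotomous Rasch model at the heart of such an analysis is a one-line formula: the probability that a child of ability theta passes an item of difficulty b is exp(theta - b) / (1 + exp(theta - b)), with both quantities on the same logit scale. A tiny Python sketch with made-up values:

    ```python
    from math import exp

    def rasch_p(theta, b):
        """Dichotomous Rasch model: P(pass | ability theta, item difficulty b)."""
        return exp(theta - b) / (1 + exp(theta - b))

    # Made-up abilities (logits) for three children and difficulties for two items.
    for theta in (-1.0, 0.0, 1.5):
        print([round(rasch_p(theta, b), 2) for b in (-0.5, 1.0)])
    # Higher-ability children have higher pass probabilities on every item.
    ```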

  15. Assimilation of tourism satellite accounts and applied general equilibrium models to inform tourism policy analysis

    OpenAIRE

    Rossouw, Riaan; Saayman, Melville

    2011-01-01

    Historically, tourism policy analysis in South Africa has posed challenges to accurate measurement. The primary reason for this is that tourism is not designated as an 'industry' in standard economic accounts. This paper therefore demonstrates the relevance and need for applied general equilibrium (AGE) models to be completed and extended through an integration with tourism satellite accounts (TSAs) as a tool for policy makers (especially tourism policy makers) in South Africa. The paper sets...

  16. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...

  17. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL), in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS), to develop a frequency response analysis tool (FRAT). The document provides the details of the methodology and the main features of the FRAT. The tool manages a database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and a balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
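
    The central quantity such a tool computes can be sketched in a few lines. Below is a hedged illustration of a frequency-response calculation in the commonly used MW per 0.1 Hz form, with made-up pre- and post-event values; the actual BAL-003-1 procedure prescribes specific measurement windows and conventions beyond this sketch:

    ```python
    def frequency_response_mw_per_0p1hz(p_pre_mw, p_post_mw, f_pre_hz, f_post_hz):
        """Frequency response as MW of response per 0.1 Hz of frequency change."""
        delta_p = p_post_mw - p_pre_mw    # change in net output/interchange (MW)
        delta_f = f_post_hz - f_pre_hz    # frequency deviation (Hz)
        return delta_p / (delta_f * 10.0) # scale Hz to 0.1-Hz units

    # Made-up under-frequency event: frequency drops from 60.000 to 59.950 Hz
    # while the balancing authority's net output rises by 30 MW.
    print(frequency_response_mw_per_0p1hz(1000.0, 1030.0, 60.000, 59.950))  # -60.0
    ```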

  18. Electronic tools for health information exchange: an evidence-based analysis.

    Science.gov (United States)

    2013-01-01

    As patients experience transitions in care, there is a need to share information between care providers in an accurate and timely manner. With the push towards electronic medical records and other electronic tools (eTools) for health information exchange, and away from paper-based health records, there remains uncertainty around the impact of eTools as a form of communication. To examine the impact of eTools for health information exchange in the context of care coordination for individuals with chronic disease in the community. A literature search was performed on April 26, 2012, using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database, for studies published until April 26, 2012 (no start date limit was applied). A systematic literature search was conducted, with meta-analysis conducted where appropriate. Outcomes of interest fell into 4 categories: health services utilization, disease-specific clinical outcomes, process-of-care indicators, and measures of efficiency. The quality of the evidence was assessed individually for each outcome. Expert panels were assembled for stakeholder engagement and contextualization. Eleven articles were identified (4 randomized controlled trials and 7 observational studies). There was moderate-quality evidence of a reduction in hospitalizations, hospital length of stay, and emergency department visits following the implementation of an electronically generated laboratory report with recommendations based on clinical guidelines. The evidence showed no difference in disease-specific outcomes; there was no evidence of a positive impact on process-of-care indicators or measures of efficiency. A limited body of research specifically examined eTools for health information exchange in the population and setting of interest. This evidence included a

  19. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
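
    The path-finding step can be illustrated with standard graph tooling. The sketch below uses the networkx library on a made-up attack graph whose edge weights represent attacker effort; the cheapest source-to-goal path is the kind of high-risk path the method flags:

    ```python
    import networkx as nx

    # Made-up attack graph: nodes are attack states, weights are attacker effort.
    g = nx.DiGraph()
    g.add_weighted_edges_from([
        ("outside", "dmz_web", 2.0),      # exploit web server
        ("outside", "vpn", 7.0),          # phish VPN credentials
        ("dmz_web", "app_server", 3.0),   # pivot via app vulnerability
        ("vpn", "app_server", 1.0),
        ("app_server", "database", 4.0),  # escalate to the database (goal)
    ])

    path = nx.shortest_path(g, "outside", "database", weight="weight")
    cost = nx.shortest_path_length(g, "outside", "database", weight="weight")
    print(path, cost)  # ['outside', 'dmz_web', 'app_server', 'database'] 9.0
    ```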

  20. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
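
    As a flavor of this programmatic access, the snippet below retrieves a sequence record through the dbfetch service's URL pattern (the accession and parameter values are illustrative; consult the EMBL-EBI documentation for current databases and formats):

    ```python
    import urllib.request

    # Fetch a UniProtKB entry in FASTA format via the EBI dbfetch service.
    url = ("https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
           "?db=uniprotkb&id=P12345&format=fasta&style=raw")
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8"))
    ```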

  1. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers.

  2. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers.

  3. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  4. First GIS analysis of modern stone tools used by wild chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa.

    Directory of Open Access Journals (Sweden)

    Alfonso Benito-Calvo

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested with GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record.

  5. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise. It claims that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication-based curriculum contexts, they can be used as an analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse. This will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  6. Two-dimensional gap analysis: a tool for efficient conservation planning and biodiversity policy implementation.

    Science.gov (United States)

    Angelstam, Per; Mikusiński, Grzegorz; Rönnbäck, Britt-Inger; Ostman, Anders; Lazdinis, Marius; Roberge, Jean-Michel; Arnberg, Wolter; Olsson, Jan

    2003-12-01

    The maintenance of biodiversity by securing representative and well-connected habitat networks in managed landscapes requires a wise combination of protection, management, and restoration of habitats at several scales. We suggest that the integration of natural and social sciences in the form of "Two-dimensional gap analysis" is an efficient tool for the implementation of biodiversity policies. The tool links biologically relevant "horizontal" ecological issues with "vertical" issues related to institutions and other societal issues. Using forest biodiversity as an example, we illustrate how one can combine ecological and institutional aspects of biodiversity conservation, thus facilitating environmentally sustainable regional development. In particular, we use regional gap analysis to identify focal forest types and habitat modelling to ascertain the functional connectivity of "green infrastructures" as tools for the horizontal gap analysis. For the vertical dimension we suggest how the social sciences can be used for assessing the success in the implementation of biodiversity policies in real landscapes by identifying institutional obstacles to policy implementation. We argue that this interdisciplinary approach could be applied in a whole range of other environments including other terrestrial biota and aquatic ecosystems where functional habitat connectivity, nonlinear response to habitat loss and a multitude of economic and social interests co-occur in the same landscape.

  7. Applying the accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Barbalat, Oscar

    1989-12-15

    Originally developed as tools for frontier physics, particle accelerators provide valuable spinoff benefits in applied research and technology. These accelerator applications are the subject of a biennial meeting in Denton, Texas, but the increasing activity in this field resulted this year (5-9 September) in the first European Conference on Accelerators in Applied Research and Technology, organized by K. Bethge of Frankfurt's Goethe University. The meeting reflected a wide range of applications - ion beam analysis, exploitation of nuclear microbeams, accelerator mass spectrometry, applications of photonuclear reactions, ion beam processing, synchrotron radiation for semiconductor technology, specialized technology.

  8. Applying the accelerator

    International Nuclear Information System (INIS)

    Barbalat, Oscar

    1989-01-01

    Originally developed as tools for frontier physics, particle accelerators provide valuable spinoff benefits in applied research and technology. These accelerator applications are the subject of a biennial meeting in Denton, Texas, but the increasing activity in this field resulted this year (5-9 September) in the first European Conference on Accelerators in Applied Research and Technology, organized by K. Bethge of Frankfurt's Goethe University. The meeting reflected a wide range of applications - ion beam analysis, exploitation of nuclear microbeams, accelerator mass spectrometry, applications of photonuclear reactions, ion beam processing, synchrotron radiation for semiconductor technology, specialized technology.

  9. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in the Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
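
    As a sketch of the kind of measurement such a tool automates, the snippet below estimates total harmonic distortion (THD) of a toy soft-clipping "device" from its spectrum; the test signal, nonlinearity and parameters are invented, and this is not the paper's Matlab code.

```python
# Drive a memoryless nonlinearity with a sine and measure the energy in
# the harmonics relative to the fundamental (a classic THD estimate).
import numpy as np

fs, f0, n = 48000, 1000.0, 48000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
y = np.tanh(3 * x)  # stand-in for a distorting device: soft clipping

spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
bin0 = int(round(f0 * n / fs))                 # bin of the fundamental
fundamental = spectrum[bin0]
harmonics = [spectrum[k * bin0] for k in range(2, 10)]
thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental
print(f"THD = {100 * thd:.2f}%")   # odd harmonics dominate for tanh clipping
```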

  10. Rock models at Zielona Gora, Poland applied to the semi-empirical neutron tool calibration

    International Nuclear Information System (INIS)

    Czubek, J.A.; Ossowski, A.; Zorski, T.; Massalski, T.

    1995-01-01

    The semi-empirical calibration method applied to the neutron porosity tool is presented in this paper. It was used with the 70 mm diameter ODSN-102 tool, equipped with an Am-Be neutron source, at the calibration facility of Zielona Gora, Poland, inside natural and artificial rocks: four sandstone, four limestone and one dolomite block with borehole diameters of 143 and 216 mm, and three artificial ceramic blocks with borehole diameters of 90 and 180 mm. All blocks were saturated with fresh water, and fresh water was also inside all boreholes. In five blocks mineralized water (200,000 ppm NaCl) was introduced inside the boreholes. All neutron characteristics of the calibration blocks are given in this paper. The semi-empirical method of calibration correlates the tool readings observed experimentally with the general neutron parameter (GNP). This results in a general calibration curve, where the tool readings (TR) vs GNP fall on a single curve irrespective of their origin, i.e. of the formation lithology, borehole diameter, tool stand-off, brine salinity, etc. The n and m power coefficients are obtained experimentally during the calibration procedure. The apparent neutron parameters are defined as those sensed by a neutron tool situated inside the borehole and in real environmental conditions. When they are known, the GNP parameter can be computed analytically for the whole range of porosity at any kind of borehole diameter, formation lithology (including variable rock matrix absorption cross-section and density), borehole and formation salinity, tool stand-off and drilling fluid physical parameters. By this approach all porosity corrections with respect to the standard (e.g. limestone) calibration curve can be generated. (author)
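
    A hedged sketch of the final calibration step: assuming the general calibration curve can be approximated by a power law TR = a·GNP^n (the paper's exact functional form, coefficients and data are not reproduced here), the coefficients follow from a log-log least-squares fit.

```python
# Fit one calibration curve of tool readings (TR) against the general
# neutron parameter (GNP); all data points are hypothetical.
import numpy as np

gnp = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # assumed GNP values
tr = np.array([120.0, 85.0, 60.0, 43.0, 30.0])   # assumed tool readings

# log(TR) = log(a) + n * log(GNP): a single line for all lithologies,
# borehole diameters and salinities, as the method intends.
n, log_a = np.polyfit(np.log(gnp), np.log(tr), 1)
print(f"TR ~ {np.exp(log_a):.1f} * GNP^({n:.2f})")
```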

  11. Building an applied activation analysis centre

    International Nuclear Information System (INIS)

    Bartosek, J.; Kasparec, I.; Masek, J.

    1972-01-01

    Requirements are defined and all available background material is reported and discussed for the building up of a centre of applied activation analysis in Czechoslovakia. A detailed analysis of potential users and the centre's envisaged availability is also presented as part of the submitted study. A brief economic analysis is annexed. The study covers the situation up to the end of 1972. (J.K.)

  12. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  13. Caldwell University's Department of Applied Behavior Analysis.

    Science.gov (United States)

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis.

  14. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  15. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of developing new systems in these fields, improved design management techniques and a vast array of computer-aided techniques are applied during the design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure modes and effects analysis (FMEA) of hydraulic systems. The paper explains the underlying…
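
    To make the FMEA bookkeeping concrete, here is a minimal sketch of the classical risk priority number computation (RPN = severity x occurrence x detection) over invented hydraulic failure modes; the paper's tool automates far richer models than this.

```python
# Rank invented hydraulic failure modes by risk priority number (RPN),
# with severity, occurrence and detection each rated on a 1-10 scale.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (certain detection) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("pump seal leak", 7, 4, 3),
    FailureMode("relief valve stuck closed", 9, 2, 6),
    FailureMode("hose abrasion", 5, 6, 2),
]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name:26s} RPN = {m.rpn}")
```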

  16. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
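
    The snippet below sketches the discounted-cash-flow arithmetic at the core of any such financial analysis; the capital cost, revenues and discount rate are invented and do not come from the H2FAST model.

```python
# Net present value of a hypothetical fueling station: a year-0 capital
# outlay followed by 15 years of constant net revenue.
cash_flows = [-1_500_000] + [220_000] * 15
discount_rate = 0.08

npv = sum(cf / (1 + discount_rate) ** year
          for year, cf in enumerate(cash_flows))
print(f"NPV = ${npv:,.0f}")   # positive NPV suggests a viable investment
```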

  17. Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning

    Science.gov (United States)

    Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.

    2012-08-01

    …of an estimate, and techniques and/or methods to attain representative and justifiable cost estimates are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for development of a non-commercial, low cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step toward achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.

  18. Analysis of the interaction between experimental and applied behavior analysis.

    Science.gov (United States)

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.
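
    A minimal sketch of the coauthorship-network idea: authors are nodes, joint papers are edges, and the dual authors bridging the two journals should stand out on betweenness centrality. The author labels and edges are invented, and networkx stands in for the study's social-network software.

```python
# Two journal "communities" joined by dual authors D and E; betweenness
# centrality highlights the bridging role the study attributes to them.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("A", "D")])  # JABA coauthorships
G.add_edges_from([("E", "F"), ("F", "G")])              # JEAB coauthorships
G.add_edge("D", "E")                                    # dual-author bridge

centrality = nx.betweenness_centrality(G)
for author, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(author, round(score, 3))   # dual authors rank at or near the top
```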

  19. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  20. Lithology and mineralogy recognition from geochemical logging tool data using multivariate statistical analysis.

    Science.gov (United States)

    Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques

    2017-10-01

    The availability of a deep well that penetrates deep into the Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study the metamorphic rocks. One such borehole is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of Eastern China, from the Chinese Continental Scientific Drilling Main hole. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ, gamma ray spectroscopy measurements of major and trace elements in the borehole. Dry weight percent oxide concentration logs obtained for this study were SiO2, K2O, TiO2, H2O, CO2, Na2O, Fe2O3, FeO, CaO, MnO, MgO, P2O5 and Al2O3. Cross plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description, respectively. Cross plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data are sufficiently accurate and complete to be highly useful in the analysis of UHP metamorphic rocks. Copyright © 2017 Elsevier Ltd. All rights reserved.
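
    A sketch of the dimensionality-reduction step described above: standardize a small (invented) table of oxide logs and inspect the first two principal components; real logs would carry one row per depth sample rather than four.

```python
# PCA on a toy oxide-log table; loadings on PC1/PC2 hint at which oxide
# groups (e.g. feldspar- vs hydrous-mineral-related) drive the variance.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

oxides = ["SiO2", "K2O", "Na2O", "Fe2O3", "MgO", "H2O"]
X = np.array([
    [72.1, 4.2, 3.5, 1.8, 0.4, 0.6],
    [68.4, 3.9, 3.1, 2.9, 1.1, 1.2],
    [55.2, 1.1, 2.0, 7.8, 5.6, 2.4],
    [51.7, 0.8, 1.9, 9.1, 6.9, 3.0],
])

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print(pca.explained_variance_ratio_)                    # variance captured
print(dict(zip(oxides, pca.components_[0].round(2))))   # PC1 loadings
```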

  1. Multilayers quantitative X-ray fluorescence analysis applied to easel paintings.

    Science.gov (United States)

    de Viguerie, Laurence; Sole, V Armando; Walter, Philippe

    2009-12-01

    X-ray fluorescence spectrometry (XRF) allows a rapid and simple determination of the elemental composition of a material. As a non-destructive tool, it has been extensively used for analysis in art and archaeology since the early 1970s. Whereas it is commonly used for qualitative analysis, recent efforts have been made to develop quantitative treatment even with portable systems. However, the interpretation of the results obtained with this technique can turn out to be problematic in the case of layered structures such as easel paintings. The use of differential X-ray attenuation enables modelling of the various layers: indeed, the absorption of X-rays through different layers will result in modification of the intensity ratio between the different characteristic lines. This work focuses on the possibility of using XRF with the fundamental parameters method to reconstruct the composition and thickness of the layers. This method was tested on several multilayer standards and gives a maximum error of 15% for thicknesses and of 10% for concentrations. On a painting test sample that was rather inhomogeneous, the XRF analysis provides an average value. This method was applied in situ to estimate the thickness of the layers of a painting by Marco d'Oggiono, a pupil of Leonardo da Vinci.
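
    The differential-attenuation idea can be reduced to Beer-Lambert bookkeeping: two characteristic lines of a buried pigment are absorbed differently by the covering layer, so their measured intensity ratio encodes its thickness. All attenuation coefficients, densities and intensities below are invented illustration values, not the paper's fundamental-parameters model.

```python
# Solve r_measured = r0 * exp(-(mu_low - mu_high) * rho * t) for the
# covering-layer thickness t (Beer-Lambert, two line energies).
import numpy as np

mu_low, mu_high = 45.0, 12.0  # assumed mass attenuation coeffs, cm^2/g
rho = 1.5                     # assumed layer density, g/cm^3
r0 = 2.0                      # assumed emitted (unattenuated) line ratio
r_measured = 1.2              # assumed observed ratio through the layer

t = np.log(r0 / r_measured) / ((mu_low - mu_high) * rho)
print(f"estimated layer thickness = {1e4 * t:.0f} um")   # ~103 um here
```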

  2. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existent link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects and designers.

  3. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1990-01-01

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of Personal Computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent to symbolic programming languages, but, just as significantly, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production.

  4. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.

  5. Laser induced breakdown spectroscopy (LIBS) as a rapid tool for material analysis

    International Nuclear Information System (INIS)

    Hussain, T; Gondal, M A

    2013-01-01

    Laser induced breakdown spectroscopy (LIBS) is a novel technique for elemental analysis based on laser-generated plasma. In this technique, laser pulses are applied for ablation of the sample, resulting in the vaporization and ionization of the sample in a hot plasma, which is finally analyzed by the spectrometer. The elements are identified by their unique spectral signatures. A LIBS system was developed for elemental analysis of solid and liquid samples. The developed system was applied for qualitative as well as quantitative measurement of the elemental concentrations present in iron slag and open pit ore samples. The plasma was generated by focusing a pulsed Nd:YAG laser at 1064 nm on test samples to study the capabilities of LIBS as a rapid tool for material analysis. The concentrations of various elements of environmental significance such as cadmium, calcium, magnesium, chromium, manganese, titanium, barium, phosphorus, copper, iron, zinc, etc., in these samples were determined. Optimal experimental conditions were evaluated for improving the sensitivity of the developed LIBS system through a parametric dependence study. The LIBS results were compared with the results obtained using a standard analytical technique, inductively coupled plasma emission spectroscopy (ICP). Limits of detection (LOD) of our LIBS system were also estimated for the above mentioned elements. This study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag and open pit waste.

  6. Laser induced breakdown spectroscopy (LIBS) as a rapid tool for material analysis

    Science.gov (United States)

    Hussain, T.; Gondal, M. A.

    2013-06-01

    Laser induced breakdown spectroscopy (LIBS) is a novel technique for elemental analysis based on laser-generated plasma. In this technique, laser pulses are applied for ablation of the sample, resulting in the vaporization and ionization of the sample in a hot plasma, which is finally analyzed by the spectrometer. The elements are identified by their unique spectral signatures. A LIBS system was developed for elemental analysis of solid and liquid samples. The developed system was applied for qualitative as well as quantitative measurement of the elemental concentrations present in iron slag and open pit ore samples. The plasma was generated by focusing a pulsed Nd:YAG laser at 1064 nm on test samples to study the capabilities of LIBS as a rapid tool for material analysis. The concentrations of various elements of environmental significance such as cadmium, calcium, magnesium, chromium, manganese, titanium, barium, phosphorus, copper, iron, zinc, etc., in these samples were determined. Optimal experimental conditions were evaluated for improving the sensitivity of the developed LIBS system through a parametric dependence study. The LIBS results were compared with the results obtained using a standard analytical technique, inductively coupled plasma emission spectroscopy (ICP). Limits of detection (LOD) of our LIBS system were also estimated for the above mentioned elements. This study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag and open pit waste.
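
    As a sketch of how such detection limits are typically estimated, the snippet applies the standard 3-sigma criterion to an invented calibration line: LOD = 3 * sigma_blank / slope. The standards, intensities and blank noise are illustrative values, not the paper's measurements.

```python
# Limit of detection from a linear calibration curve (3-sigma criterion).
import numpy as np

conc = np.array([0.0, 10.0, 20.0, 50.0, 100.0])      # ppm, assumed standards
signal = np.array([5.0, 52.0, 101.0, 248.0, 500.0])  # line intensity (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)
sigma_blank = 3.2     # assumed std. dev. of repeated blank measurements
lod = 3 * sigma_blank / slope
print(f"LOD = {lod:.1f} ppm")
```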

  7. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention preserving stochastic branching…

  8. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that have great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, a visualization of the drawer masters and the core configuration is necessary for minimizing human error during input processing. For this purpose, visualization tools for ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools is described and various visualization samples for both drawer masters and ZPPR-15 cores are demonstrated. Visualization tools for drawer masters and the core configuration were successfully developed for ZPPR-15 analysis. The visualization tools are expected to be useful for understanding the ZPPR-15 experiments and for finding deterministic models of ZPPR-15. It turned out that generating VTK files is straightforward, and the resulting files become a powerful aid when explored with the VisIt program.

  9. Physics analysis tools for beauty physics in ATLAS

    International Nuclear Information System (INIS)

    Anastopoulos, C; Bouhova-Thacker, E; Catmore, J; Mora, L de; Dallison, S; Derue, F; Epp, B; Jussel, P; Kaczmarska, A; Radziewski, H v; Stahl, T; Reznicek, P

    2008-01-01

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, the association of Monte Carlo truth information, used to process simulation data during the software validation that is an important part of the development of the physics analysis tools.

  10. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  11. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    International Nuclear Information System (INIS)

    Brignon, Jean-Marc

    2011-01-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key for the understanding of how a nanomaterial 'socially' performs in comparison with its alternatives. 'Industrial economics' methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCA is increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a 'pragmatic regulatory impact analysis' that uses a cost/benefit framework but remains open to other disciplines than economics, and open to the participation of stakeholders in the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is 'pragmatic' in the sense that it is driven by the purpose of assessing 'what happens' with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality for handling the novelty of…

  12. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    Science.gov (United States)

    Brignon, Jean-Marc

    2011-07-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key for the understanding of how a nanomaterial "socially" performs in comparison with its alternatives. "Industrial economics" methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCA is increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a "pragmatic regulatory impact analysis" that uses a cost/benefit framework but remains open to other disciplines than economics, and open to the participation of stakeholders in the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is "pragmatic" in the sense that it is driven by the purpose of assessing "what happens" with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality for handling the novelty of…

  13. atBioNet– an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

    Background: Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results: atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion: atBioNet is a free web-based network analysis tool that provides systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm

  14. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
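
    A sketch of the enrichment-assessment step that follows module detection: a hypergeometric test of the overlap between a module's genes and a pathway's gene set. All counts are invented, and this is the generic statistic rather than atBioNet's internal code.

```python
# Probability of seeing at least k pathway genes in an n-gene module by
# chance, drawing without replacement from an N-gene background.
from scipy.stats import hypergeom

N = 20000  # genes in the background network (assumed)
K = 150    # genes annotated to the pathway (assumed)
n = 40     # genes in the detected module (assumed)
k = 12     # module genes that fall in the pathway (assumed)

p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p = {p_value:.2e}")
```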

  15. Risk analysis tools for force protection and infrastructure/asset protection

    International Nuclear Information System (INIS)

    Jaeger, C.D.; Duggan, R.A.; Paulus, W.K.

    1998-01-01

    The Security Systems and Technology Center at Sandia National Laboratories has for many years been involved in the development and use of vulnerability assessment and risk analysis tools. In particular, two of these tools, ASSESS and JTS, have been used extensively for Department of Energy facilities. Increasingly, Sandia has been called upon to evaluate critical assets and infrastructures, support DoD force protection activities and assist in the protection of facilities from terrorist attacks using weapons of mass destruction. Sandia is involved in many different activities related to security and force protection and is expanding its capabilities by developing new risk analysis tools to support a variety of users. One tool, in the very early stages of development, is EnSURE, Engineered Surety Using the Risk Equation. EnSURE addresses all components of the risk equation and integrates them into a single, tool-supported process to help determine the most cost-effective ways to reduce risk. This paper will briefly discuss some of these risk analysis tools within the EnSURE framework.
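
    The "risk equation" referenced in the tool's name is commonly written in the security-assessment literature as R = P_A * (1 - P_E) * C, combining likelihood of attack, system ineffectiveness and consequence; the sketch below uses invented numbers to compare two hypothetical upgrade options and is not EnSURE itself.

```python
# Compare baseline and upgraded security postures with the common
# security risk equation; all probabilities and consequences are invented.
def risk(p_attack: float, p_effectiveness: float, consequence: float) -> float:
    return p_attack * (1.0 - p_effectiveness) * consequence

baseline = risk(p_attack=0.3, p_effectiveness=0.60, consequence=100.0)
upgraded = risk(p_attack=0.3, p_effectiveness=0.85, consequence=100.0)
print(f"{baseline:.1f} -> {upgraded:.1f}")   # 12.0 -> 4.5: the risk reduction
```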

  16. Management of specific and excessive posturing behavior in a hyacinth macaw (Anodorhynchus hyacinthinus) by using applied behavior analysis.

    Science.gov (United States)

    Clayton, Leigh Ann; Friedman, Susan G; Evans, Liz A

    2012-06-01

    Applied behavior analysis was used in a female hyacinth macaw (Anodorhynchus hyacinthinus) to reduce specific, excessive mating-type posturing that had become disruptive due to increased frequency, duration, and intensity. A functional assessment and intervention design worksheet was used to evaluate behavior-environment relations and to develop an individualized behavior-change plan. The functional assessment indicated that human attention was maintaining the behavior. The intervention, differential reinforcement of incompatible behavior, was implemented to increase attention for standing upright and to remove attention for posturing. Within 1 month, posturing decreased to acceptable levels and was replaced with an upright posture. Problem behaviors that appear "reproductive" may be responsive to behavior management alone. Applied behavior analysis and a functional assessment and intervention design are ideal tools to address problem behavior in avian patients.

  17. Surface Enhanced Raman Spectroscopy (SERS) and multivariate analysis as a screening tool for detecting Sudan I dye in culinary spices

    Science.gov (United States)

    Di Anibal, Carolina V.; Marsal, Lluís F.; Callao, M. Pilar; Ruisánchez, Itziar

    2012-02-01

    Raman spectroscopy combined with multivariate analysis was evaluated as a tool for detecting Sudan I dye in culinary spices. Three Raman modalities were studied: normal Raman, FT-Raman and SERS. The results show that SERS is the most appropriate modality, capable of providing a proper Raman signal when a complex matrix is analyzed. To get rid of the spectral noise and background, Savitzky-Golay smoothing with polynomial baseline correction and wavelet transform were applied. Finally, to check whether unadulterated samples can be differentiated from samples adulterated with Sudan I dye, an exploratory analysis such as principal component analysis (PCA) was applied to raw data and data processed with the two mentioned strategies. The results obtained by PCA show that Raman spectra need to be properly treated if useful information is to be obtained, and both spectra treatments are appropriate for processing the Raman signal. The proposed methodology shows that SERS combined with appropriate spectra treatment can be used as a practical screening tool to distinguish samples suspected of being adulterated with Sudan I dye.
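
    A sketch of the preprocessing-plus-PCA route described above: Savitzky-Golay smoothing, a crude polynomial baseline removal, then PCA scores. The spectra are synthetic stand-ins, and the band positions are invented rather than taken from Sudan I.

```python
# Synthetic "spectra": a common matrix band, plus an extra band in the
# adulterated half; smooth, de-baseline, then look at PC1.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
shift = np.linspace(400, 1800, 700)                  # Raman shift, 1/cm

def spectrum(adulterated: bool) -> np.ndarray:
    y = np.exp(-0.5 * ((shift - 1390) / 8) ** 2)     # matrix band
    if adulterated:                                  # invented dye-like band
        y = y + 0.8 * np.exp(-0.5 * ((shift - 1596) / 8) ** 2)
    return y + 1e-7 * (shift - 400) ** 2 + 0.02 * rng.standard_normal(shift.size)

X = np.array([spectrum(i >= 10) for i in range(20)])          # 10 pure, 10 spiked
X = savgol_filter(X, window_length=11, polyorder=3, axis=1)   # smoothing
for i in range(len(X)):                                       # baseline removal
    X[i] -= np.polyval(np.polyfit(shift, X[i], 2), shift)

scores = PCA(n_components=2).fit_transform(X)
print(scores[:, 0].round(2))   # PC1 separates pure from adulterated samples
```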

  18. Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process

    Science.gov (United States)

    Ray, Paul S.

    2002-01-01

    Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools in the center are Failure Modes and Effects Analysis (FMEA), Hazard Analysis (HA), Fault Tree Analysis (FTA), and Probabilistic Risk Analysis (PRA). There are some guidelines for selecting the type of risk assessment tool during the project formulation phase, but there is not enough guidance on how to apply these tools in the CRM process. Yet the way safety and risk assessment tools are used makes a significant difference in the effectiveness of the risk management function. Decisions regarding what events are to be included in the analysis and to what level of detail the analysis should be continued make a significant difference in the effectiveness of a risk management program. The choice of risk analysis tool also depends on the phase of a project; e.g., at the initial phase, when not much data are available on hardware, a standard FMEA cannot be applied and a functional FMEA may be appropriate instead. This study attempted to provide some directives to alleviate the difficulty of applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was not included in the scope of the study due to the short duration of the summer research project.

  19. Data-base tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    The authors use a commercial data-base software package to create several data-base products that enhance the ability of experimental physicists to analyze data from the TMX-U experiment. This software resides on a DEC-20 computer in M-Division's user service center (USC), where data can be analyzed separately from the main acquisition computers. When these data-base tools are combined with interactive data analysis programs, physicists can perform automated (batch-style) processing or interactive data analysis on the computers in the USC or on the supercomputers of the NMFECC, in addition to the normal processing done on the acquisition system. One data-base tool provides highly reduced data for searching and correlation analysis of several diagnostic signals for a single shot or many shots. A second data-base tool provides retrieval and storage of unreduced data for detailed analysis of one or more diagnostic signals. The authors report how these data-base tools form the core of an evolving off-line data-analysis environment on the USC computers.

  20. THE CASE STUDY TASKS AS A BASIS FOR THE FUND OF THE ASSESSMENT TOOLS AT THE MATHEMATICAL ANALYSIS FOR THE DIRECTION 01.03.02 APPLIED MATHEMATICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Dina Aleksandrovna Kirillova

    2015-12-01

    The modern reform of Russian higher education involves the implementation of a competence-based approach, the main idea of which is the practical orientation of education. Mathematics is a universal language for the description, modelling and study of phenomena and processes of different nature. Therefore, creating a fund of assessment tools for mathematical disciplines based on applied problems is a current task. The case method is the most appropriate means of monitoring learning outcomes, as it is aimed at bridging the gap between theory and practice. The aim of the research is the development of methodical materials for creating a fund of assessment tools based on case studies for mathematical analysis for the direction «Applied Mathematics and Computer Science». The aim follows from the contradiction between the need to introduce the case method into the educational process in higher education and the lack of study of the theoretical foundations of using this method as applied to mathematical disciplines, as well as the insufficient theoretical basis and description of the process of creating case problems for use in monitoring learning outcomes.

  1. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
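
    The geometric core can be sketched with SciPy's subspace_angles: the principal angle between a gene-set subspace and a differential-expression direction yields an enrichment-style score. The random data and the cos²-based scoring are illustrative assumptions, not the published PAEA algorithm.

```python
# Principal angle between a 3-dimensional "gene set" subspace and one
# expression-change direction in a 1000-gene space.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(1)
A = np.linalg.qr(rng.standard_normal((1000, 3)))[0]     # orthonormal basis
b = A[:, [0]] + 0.02 * rng.standard_normal((1000, 1))   # direction near span(A)

angle = subspace_angles(A, b)[0]   # a single angle, since b is one vector
print(f"angle = {np.degrees(angle):.1f} deg, cos^2 = {np.cos(angle) ** 2:.3f}")
```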

  2. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Science.gov (United States)

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on a theory of planned behaviour approach, this research investigates the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction of search engine, search engines as an information…

  3. Lessons learned in applying function analysis

    International Nuclear Information System (INIS)

    Mitchel, G.R.; Davey, E.; Basso, R.

    2001-01-01

    This paper summarizes the lessons learned in undertaking and applying function analysis based on the recent experience of utility, AECL and international design and assessment projects. Function analysis is an analytical technique that can be used to characterize and assess the functions of a system and is widely recognized as an essential component of a 'systematic' approach to design, one that integrates operational and user requirements into the standard design process. (author)

  4. Applying homotopy analysis method for solving differential-difference equation

    International Nuclear Information System (INIS)

    Wang Zhen; Zou Li; Zhang Hongqing

    2007-01-01

    In this Letter, we apply the homotopy analysis method to solving differential-difference equations. A simple but typical example is used to illustrate the validity and the great potential of the generalized homotopy analysis method in solving differential-difference equations. Comparisons are made between the results of the proposed method and exact solutions. The results show that the homotopy analysis method is an attractive method for solving differential-difference equations.
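
    For readers unfamiliar with the method, the generic zeroth-order deformation equation of the homotopy analysis method, as commonly written in the HAM literature (the Letter's specific operators and equations are not reproduced here), reads:

```latex
% q is the embedding parameter, \hbar the convergence-control parameter,
% \mathcal{L} an auxiliary linear operator, \mathcal{N} the nonlinear
% operator of the problem, and u_n^{(0)} the initial guess at lattice site n.
(1-q)\,\mathcal{L}\!\left[\phi_n(t;q) - u_n^{(0)}(t)\right]
    = q\,\hbar\,\mathcal{N}\!\left[\phi_n(t;q)\right], \qquad q \in [0,1].
```

    At q = 0 the solution coincides with the initial guess, at q = 1 it solves the original equation, and expanding phi_n in powers of q generates the solution series term by term; the index n carries the discrete (difference) variable.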

  5. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  6. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  7. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.
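
    To illustrate the kind of code such a visual tool emits, here is a minimal Nipype pipeline of the sort Porcupine targets; the FSL interfaces are standard Nipype, but the input file name and two-step layout are invented, and this listing was not generated by Porcupine itself.

```python
# Skull-strip a T1 image, then smooth the result; requires FSL installed.
from nipype import Node, Workflow
from nipype.interfaces.fsl import BET, IsotropicSmooth

skullstrip = Node(BET(in_file="sub-01_T1w.nii.gz", mask=True),
                  name="skullstrip")
smooth = Node(IsotropicSmooth(fwhm=4), name="smooth")

wf = Workflow(name="preprocess", base_dir="work")
wf.connect(skullstrip, "out_file", smooth, "in_file")
wf.run()
```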

  8. Applying the Case Management CourTools: Finding from an Urban Trial Court

    Directory of Open Access Journals (Sweden)

    Collins E. Ijoma

    2012-06-01

    Full Text Available The National Center for State Courts (NCSC) recently promulgated 10 trial court performance measures, referred to as CourTools. Measures 2, 3, 4, and 5 provide a methodology by which court managers can examine their management and processing of cases. The measures include clearance rate (measure 2), time to disposition (measure 3), age of active pending caseload (measure 4), and trial date certainty (measure 5). The objective of this research was threefold. The first aim was to assess the viability of using the case management measures to examine case processing trends in a New Jersey (NJ) urban trial court. Each measure was reviewed to determine the tool’s applicability to the criminal division of the court. The second objective (pursued as a parallel to the first) was to present the findings in the same context as the CourTools’ framework to determine its practicality. The final goal was to serve as a platform for other courts on the national and international level that do not yet use performance measures. These courts, diverse as they are, may use the methodologies and findings of this case study as a reference and guide to develop their own program to measure the court’s productivity and efficiency. To that end, this case study sought to answer the following questions in determining the applicability of the CourTools to the selected court and, by extension, its potential for more universal application to other court systems. First, what is the relevance of measurements to the courts and why is it important, if at all? Second, what are the CourTools? Third, can the measurement model be applied to an actual court and if so, how is it executed and illustrated in practice? Finally, what are the implications of the findings for the court in question, as well as other courts that seek to incorporate the CourTools to measure performance?
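
    A back-of-the-envelope sketch of how measures 2 through 4 reduce to simple arithmetic over docket records; the case data here are invented for illustration:

        # hedged sketch of CourTools measures 2-4 on hypothetical docket entries
        from datetime import date

        cases = [
            {"filed": date(2011, 1, 10), "disposed": date(2011, 6, 2)},
            {"filed": date(2011, 3, 5),  "disposed": date(2011, 9, 20)},
            {"filed": date(2011, 11, 1), "disposed": None},  # still pending
        ]
        today = date(2012, 1, 1)

        incoming = len(cases)
        outgoing = sum(1 for c in cases if c["disposed"])
        clearance_rate = outgoing / incoming                      # measure 2

        days_to_disposition = [(c["disposed"] - c["filed"]).days
                               for c in cases if c["disposed"]]   # measure 3
        pending_age = [(today - c["filed"]).days
                       for c in cases if not c["disposed"]]       # measure 4

        print(f"clearance rate: {clearance_rate:.0%}")
        print(f"mean days to disposition: {sum(days_to_disposition) / len(days_to_disposition):.0f}")
        print(f"mean age of pending cases: {sum(pending_age) / len(pending_age):.0f} days")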

  9. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  10. Applying Different Independent Component Analysis Algorithms and Support Vector Regression for IT Chain Store Sales Forecasting

    Science.gov (United States)

    Dai, Wensheng

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating a feature extraction method and a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting. PMID:25165740
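
    The scheme is straightforward to prototype; the sketch below uses scikit-learn's FastICA and SVR on synthetic branch-sales data (the study's real data, ICA variants, and tuning are not reproduced here; transposing the sales matrix switches between temporal and spatial ICA):

        # hedged sketch of the ICA-plus-SVR idea on synthetic data
        import numpy as np
        from sklearn.decomposition import FastICA
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 8))            # 200 weeks x 8 branches (synthetic)
        y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=200)  # total sales

        ica = FastICA(n_components=4, random_state=0)
        features = ica.fit_transform(X)          # independent components as features

        model = SVR(kernel="rbf", C=10.0).fit(features[:-20], y[:-20])
        print("held-out MAE:",
              np.mean(np.abs(model.predict(features[-20:]) - y[-20:])))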

  11. Applying different independent component analysis algorithms and support vector regression for IT chain store sales forecasting.

    Science.gov (United States)

    Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating a feature extraction method and a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.

  12. Applying Different Independent Component Analysis Algorithms and Support Vector Regression for IT Chain Store Sales Forecasting

    Directory of Open Access Journals (Sweden)

    Wensheng Dai

    2014-01-01

    Full Text Available Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating a feature extraction method and a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.

  13. Barriers and facilitators for implementing a new screening tool in an emergency department: A qualitative study applying the Theoretical Domains Framework.

    Science.gov (United States)

    Kirk, Jeanette W; Sivertsen, Ditte M; Petersen, Janne; Nilsen, Per; Petersen, Helle V

    2016-10-01

    The aim was to identify the factors perceived as the most important facilitators of, or barriers to, the introduction and intended use of a new tool in the emergency department among nurses and a geriatric team. A high incidence of functional decline after hospitalisation for acute medical illness has been shown in the oldest patients and those who are physically frail. In Denmark, more than 35% of older medical patients acutely admitted to the emergency department are readmitted within 90 days after discharge. A new screening tool for use in the emergency department aiming to identify patients at particularly high risk of functional decline and readmission was developed. This was a qualitative study based on semistructured interviews with nurses and a geriatric team in the emergency department and semistructured single interviews with their managers. The Theoretical Domains Framework guided data collection and analysis. Content analysis was performed whereby new themes and themes already existing within each domain were described. Six predominant domains were identified: (1) professional role and identity; (2) beliefs about consequences; (3) goals; (4) knowledge; (5) optimism; and (6) environmental context and resources. The content analysis identified three themes, each containing two subthemes. The themes were professional role and identity, beliefs about consequences and preconditions for a successful implementation. Two different cultures were identified in the emergency department. These cultures involved different professional roles and identities and different actions and sense-making, and they shaped how barriers and facilitators linked to the new screening tool were perceived. The results show that different cultures exist in the same local context and influence the perception of barriers and facilitators differently. These cultures must be identified and addressed when implementation is planned. © 2016 The Authors. Journal of Clinical Nursing published by John Wiley & Sons Ltd.

  14. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk, in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press)
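
    For readers unfamiliar with non-overlapping DDM, the sketch below shows the classic substructuring (Schur complement) idea on a 1-D Poisson problem; it illustrates why the subdomain solves parallelize, and it is emphatically not the DVS algorithm itself:

        # hedged sketch of non-overlapping substructuring for -u'' = 1, u(0)=u(1)=0
        import numpy as np

        n = 21                                   # interior unknowns
        h = 1.0 / (n + 1)
        A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
        f = np.ones(n)

        g = n // 2                               # one interface node splits the domain
        i1, i2 = np.arange(0, g), np.arange(g + 1, n)
        A11, A22 = A[np.ix_(i1, i1)], A[np.ix_(i2, i2)]
        a1, a2 = A[i1, g], A[i2, g]              # couplings to the interface

        # the two subdomain solves are independent, hence parallelizable
        w1, v1 = np.linalg.solve(A11, a1), np.linalg.solve(A11, f[i1])
        w2, v2 = np.linalg.solve(A22, a2), np.linalg.solve(A22, f[i2])

        S = A[g, g] - a1 @ w1 - a2 @ w2          # interface Schur complement
        ug = (f[g] - a1 @ v1 - a2 @ v2) / S      # solve on the interface first
        u = np.empty(n)
        u[g], u[i1], u[i2] = ug, v1 - ug * w1, v2 - ug * w2

        x = np.linspace(h, 1.0 - h, n)
        print(np.max(np.abs(u - 0.5 * x * (1.0 - x))))   # exact: u = x(1-x)/2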

  15. Big Data tools as applied to ATLAS event data

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2017-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and to...

  16. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  17. Draper Station Analysis Tool

    Science.gov (United States)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.
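
    As a flavor of the frequency-response analysis type, the snippet below computes a Bode response and reads off a phase margin with SciPy; the plant is an arbitrary example, not a model taken from DSAT:

        # hedged sketch of a frequency-response / stability-margin check
        import numpy as np
        from scipy import signal

        plant = signal.TransferFunction([10.0], [1.0, 2.0, 0.0])  # G(s) = 10 / (s^2 + 2s)
        w = np.logspace(-1, 2, 2000)
        w, mag_db, phase_deg = signal.bode(plant, w=w)

        crossover = np.argmin(np.abs(mag_db))     # frequency where |G| is nearest 0 dB
        print(f"gain crossover: {w[crossover]:.2f} rad/s")
        print(f"phase margin:   {180 + phase_deg[crossover]:.1f} deg")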

  18. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of Artificial Intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while processing the tasks of many simultaneous users. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
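
    A minimal sketch of the simplest of those techniques, a linear trend fit on a local temperature series; the data here are synthetic, not LCAT output:

        # hedged sketch of a local trend analysis on synthetic annual temperatures
        import numpy as np
        from scipy import stats

        years = np.arange(1981, 2011)
        temps = (12.0 + 0.03 * (years - years[0])
                 + np.random.default_rng(1).normal(0, 0.4, years.size))

        trend = stats.linregress(years, temps)
        print(f"trend: {10 * trend.slope:.2f} degC/decade (p = {trend.pvalue:.3f})")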

  19. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States

    Directory of Open Access Journals (Sweden)

    Min-Uk Kim

    2018-05-01

    Full Text Available We studied sensitive weather variables for consequence analysis, in the case of chemical leaks on the user side of offsite consequence analysis (OCA) tools. We used the OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH3), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO3). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more sensitive in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.

  20. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    Science.gov (United States)

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

    We studied sensitive weather variables for consequence analysis, in the case of chemical leaks on the user side of offsite consequence analysis (OCA) tools. We used OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more sensitive in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.
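
    The dummy-regression step translates directly into a few lines of statsmodels; the sketch below regresses a synthetic impact distance on weather variables, with atmospheric stability entering as dummy-coded Pasquill classes (all numbers invented, not the study's data):

        # hedged sketch of dummy regression for a sensitivity analysis
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 200
        df = pd.DataFrame({
            "temp":      rng.uniform(5, 35, n),           # deg C
            "wind":      rng.uniform(1, 10, n),           # m/s
            "stability": rng.choice(list("ABCDEF"), n),   # Pasquill classes
        })
        # synthetic impact distance, strongly driven by stability class
        df["distance"] = (200 + 8 * df["temp"] - 15 * df["wind"]
                          + 120 * df["stability"].map("ABCDEF".index)
                          + rng.normal(0, 30, n))

        model = smf.ols("distance ~ temp + wind + C(stability)", data=df).fit()
        print(model.summary().tables[1])   # dummy coefficients per stability class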

  1. Database tools for enhanced analysis of TMX-U data. Revision 1

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  2. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, and 2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
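
    To illustrate the role of Brent's root finding in contact-window prediction, the sketch below brackets the rise and set times of a toy elevation profile with scipy.optimize.brentq; the geometry is invented, not SCENIC's:

        # hedged sketch: Brent's method pins down contact-window edges
        import numpy as np
        from scipy.optimize import brentq

        def elevation_deg(t):
            """Toy elevation profile of a pass peaking at t = 300 s."""
            return 40.0 * np.exp(-((t - 300.0) / 150.0) ** 2) - 5.0

        mask = 10.0   # elevation mask in degrees
        rise = brentq(lambda t: elevation_deg(t) - mask, 0.0, 300.0)
        set_ = brentq(lambda t: elevation_deg(t) - mask, 300.0, 600.0)
        print(f"contact window: {rise:.1f} s to {set_:.1f} s")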

  3. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    Science.gov (United States)

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
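
    The scoring idea behind such a reliability factor can be sketched with a toy library match: correlate the unknown spectrum against each reference and report the best correlation as a confidence score. The band positions below are illustrative, and the paper's actual PCA/ICA unmixing of multicomponent spectra is not reproduced:

        # hedged sketch of correlation-based spectral library matching
        import numpy as np

        rng = np.random.default_rng(2)
        wn = np.linspace(200, 1800, 800)               # wavenumber axis, cm^-1

        def band(center, width):                        # toy Raman band
            return np.exp(-0.5 * ((wn - center) / width) ** 2)

        library = {
            "azurite":   band(400, 15) + 0.6 * band(1095, 20),
            "malachite": band(430, 15) + 0.8 * band(1490, 25),
        }
        unknown = library["azurite"] + rng.normal(0, 0.05, wn.size)  # noisy azurite

        scores = {name: np.corrcoef(unknown, ref)[0, 1]
                  for name, ref in library.items()}
        best = max(scores, key=scores.get)
        print(best, f"reliability {scores[best]:.2f}")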

  4. C++ software quality in the ATLAS experiment: tools and experience

    Science.gov (United States)

    Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.

    2017-10-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  5. C++ software quality in the ATLAS experiment: tools and experience

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00236968; The ATLAS collaboration; Kluth, Stefan; Seuster, Rolf; Snyder, Scott; Obreshkov, Emil; Roe, Shaun; Sherwood, Peter; Stewart, Graeme

    2017-01-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  6. Developing a tool for assessing competency in root cause analysis.

    Science.gov (United States)

    Gupta, Priyanka; Varkey, Prathibha

    2009-01-01

    Root cause analysis (RCA) is a tool for identifying the key cause(s) contributing to a sentinel event or near miss. Although training in RCA is gaining popularity in medical education, there is no published literature on valid or reliable methods for assessing competency in the same. A tool for assessing competency in RCA was pilot tested as part of an eight-station Objective Structured Clinical Examination that was conducted at the completion of a three-week quality improvement (QI) curriculum for the Mayo Clinic Preventive Medicine and Endocrinology fellowship programs. As part of the curriculum, fellows completed a QI project to enhance physician communication of the diagnosis and treatment plan at the end of a patient visit. They had a didactic session on RCA, followed by process mapping of the information flow at the project clinic, after which fellows conducted an actual RCA using the Ishikawa fishbone diagram. For the RCA competency assessment, fellows performed an RCA regarding a scenario describing an adverse medication event and provided possible solutions to prevent such errors in the future. All faculty strongly agreed or agreed that they were able to accurately assess competency in RCA using the tool. Interrater reliability for the global competency rating and checklist scoring were 0.96 and 0.85, respectively. Internal consistency (Cronbach's alpha) was 0.76. Six of eight of the fellows found the difficulty level of the test to be optimal. Assessment methods must accompany education programs to ensure that graduates are competent in QI methodologies and are able to apply them effectively in the workplace. The RCA assessment tool was found to be a valid, reliable, feasible, and acceptable method for assessing competency in RCA. Further research is needed to examine its predictive validity and generalizability.
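
    The reliability statistics quoted above are easy to reproduce on a ratings matrix; the sketch below computes Cronbach's alpha from hypothetical checklist scores (four fellows by four items; the study's actual scores are not public here):

        # hedged sketch: Cronbach's alpha on a hypothetical ratings matrix
        import numpy as np

        # rows = fellows, columns = checklist items
        scores = np.array([[4, 5, 4, 3],
                           [3, 4, 4, 3],
                           [5, 5, 4, 4],
                           [2, 3, 3, 2]], dtype=float)

        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
        total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
        alpha = k / (k - 1) * (1 - item_var / total_var)
        print(f"Cronbach's alpha = {alpha:.2f}")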

  7. Network Analysis Tools: from biological networks to clusters and pathways.

    Science.gov (United States)

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
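
    The cluster-then-path workflow is easy to mimic locally with networkx on a toy graph; NeAT itself is the web suite described above, and this sketch only mirrors the kind of steps it chains:

        # hedged sketch of a cluster-then-path analysis on a toy interaction graph
        import networkx as nx
        from networkx.algorithms import community

        g = nx.Graph([("A", "B"), ("B", "C"), ("A", "C"),     # dense triangle
                      ("C", "D"), ("D", "E"), ("E", "F"), ("D", "F")])

        clusters = community.greedy_modularity_communities(g)
        print("clusters:", [sorted(c) for c in clusters])
        print("path A -> F:", nx.shortest_path(g, "A", "F"))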

  8. Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.

    Science.gov (United States)

    Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic

    2017-05-15

    Recent developments in applied mathematics are bringing new tools that are capable of synthesizing knowledge in various disciplines and of finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques, such as principal component analysis (PCA), and a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sports teams to cancer treatments. For the first time, this technique has been applied in soil science, to show the interaction between physical and chemical soil attributes and the main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database that consists of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters, including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes. Copyright © 2017 Elsevier B.V. All rights reserved.
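
    The core TDA construction used in such studies, the Mapper algorithm, can be sketched in a few lines: a filter function, an overlapping cover of its range, clustering within each preimage, and edges wherever clusters share samples. The sketch below is generic (synthetic two-dimensional data, PCA as filter) and is not the paper's pipeline:

        # hedged Mapper-style sketch: filter -> cover -> cluster -> graph
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0.0, 0.5, (100, 2)),   # two synthetic "soil groups"
                       rng.normal(3.0, 0.5, (100, 2))])

        lens = PCA(n_components=1).fit_transform(X).ravel()   # filter (lens) function

        nodes, edges = [], set()
        grid = np.linspace(lens.min(), lens.max(), 6)
        width = 1.5 * (grid[1] - grid[0])                     # 50% interval overlap

        for a in grid[:-1]:                                   # cluster each preimage
            idx = np.where((lens >= a) & (lens <= a + width))[0]
            if idx.size:
                labels = DBSCAN(eps=0.5).fit_predict(X[idx])
                nodes += [set(idx[labels == k]) for k in set(labels) - {-1}]

        for i in range(len(nodes)):                           # edge = shared samples
            for j in range(i + 1, len(nodes)):
                if nodes[i] & nodes[j]:
                    edges.add((i, j))

        print(f"Mapper graph: {len(nodes)} nodes, {len(edges)} edges")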

  9. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  10. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  11. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  12. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft. Therefore, it is very important to analyze and improve the reliability performance of SpaceWire networks. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the function division of a distributed network, a reliability analysis method based on tasks is proposed: the reliability analysis of every task yields a system reliability matrix, and the reliability of the network system is then deduced by integrating all the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, which also implements the computation schemes for the reliability matrix and for multi-path-task reliability. Using this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of SpaceWire network systems.
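
    The redundancy arithmetic behind that conclusion is simple: a route works only if every unit on it is up, and a task succeeds if at least one of its routes works. A sketch with hypothetical unit reliabilities:

        # hedged sketch of task-based reliability with redundant routes
        def route_reliability(units):
            """A route works only if every unit on it is up."""
            r = 1.0
            for unit in units:
                r *= unit
            return r

        def task_reliability(routes):
            """A task succeeds if at least one route works: 1 - P(all fail)."""
            fail = 1.0
            for route in routes:
                fail *= 1.0 - route_reliability(route)
            return 1.0 - fail

        primary = [0.999, 0.995, 0.999]   # node, router, node (hypothetical)
        backup = [0.999, 0.990, 0.999]
        print(f"single path:    {route_reliability(primary):.5f}")
        print(f"dual redundant: {task_reliability([primary, backup]):.5f}")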

  13. Network analysis as a tool for assessing environmental sustainability: applying the ecosystem perspective to a Danish water management system

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Scotti, Marco; Thomsen, Marianne

    2013-01-01

    New insights into the sustainable use of natural resources in human systems can be gained through comparison with ecosystems via common indices. In both kinds of system, resources are processed by a number of users within a network, but we consider ecosystems as the only ones displaying sustainable patterns of growth and development. We applied Network Analysis (NA) for assessing the sustainability of a Danish municipal Water Management System (WMS). We identified water users within the WMS and represented their interactions as a network of water flows. We computed intensive and extensive indices

  14. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs

  15. Stream analysis, a practical tool for innovators and change agents

    NARCIS (Netherlands)

    Kastelein, A.

    1993-01-01

    To survive, organizations have to innovate and change. Fundamental questions are: * Which strategies and tools could be applied successfully in the ever-changing environment? * Are the identified instruments effective in our business? * Do we need professional support? In more than a dozen projects

  16. Geo-environmental mapping tool applied to pipeline design

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Karina de S.; Calle, Jose A.; Gil, Euzebio J. [Geomecanica S/A Tecnologia de Solo Rochas e Materiais, Rio de Janeiro, RJ (Brazil); Sare, Alexandre R. [Geomechanics International Inc., Houston, TX (United States); Soares, Ana Cecilia [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Geo-Environmental Mapping is an improvement of the Geological-Geotechnical Mapping used for basic pipeline designs. The main purpose is to assemble environmental, geotechnical and geological concepts into a methodological tool capable of predicting constraints and reducing the pipeline's impact on the environment. The Geo-Environmental Mapping was built to stress the influence of soil/structure interaction, related to the physical effect that comes from the contact between structures and soil or rock. A Geological-Geotechnical-Environmental strip (chart) was presented to emphasize the pipeline operational constraints and their influence on the environment. The mapping was developed to clearly show the occurrence and properties of geological materials divided into geotechnical domain units (zones). The strips present natural construction properties, such as excavability, stability of the excavation and soil re-use capability. Also, the environmental constraints were added to the geological-geotechnical mapping. The Geo-Environmental Mapping model helps the planning of the geotechnical and environmental inquiries to be carried out during executive design, the discussion on the types of equipment to be employed during construction, and the analysis of the geological risks and environmental impacts to be faced during the construction of the pipeline. (author)

  17. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  18. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
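
    The likelihood-ratio comparison can be sketched on toy similarity scores: fit the two reference score distributions (same-tool and different-tool lab comparisons) and evaluate a field score under each. All numbers below are invented and the paper's actual test statistic is not reproduced:

        # hedged sketch of a likelihood ratio on toy tool-mark similarity scores
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        same_tool = rng.normal(0.80, 0.05, 200)   # lab-vs-lab scores, same tool
        diff_tool = rng.normal(0.45, 0.10, 200)   # scores from different tools

        score = 0.74                              # field-vs-lab comparison score
        lr = (stats.norm.pdf(score, same_tool.mean(), same_tool.std(ddof=1))
              / stats.norm.pdf(score, diff_tool.mean(), diff_tool.std(ddof=1)))
        print(f"likelihood ratio in favour of a match: {lr:.1f}")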

  19. The application of systems thinking concepts, methods, and tools to global health practices: An analysis of case studies.

    Science.gov (United States)

    Wilkinson, Jessica; Goff, Morgan; Rusoja, Evan; Hanson, Carl; Swanson, Robert Chad

    2018-06-01

    This review of systems thinking (ST) case studies seeks to compile and analyse cases from ST literature and provide practitioners with a reference for ST in health practice. Particular attention was given to (1) reviewing the frequency and use of key ST terms, methods, and tools in the context of health, and (2) extracting and analysing longitudinal themes across cases. A systematic search of databases was conducted, and a total of 36 case studies were identified. A combination of integrative and inductive qualitative approaches to analysis was used. Most cases identified took place in high-income countries and applied ST retrospectively. The most commonly used ST terms were agent/stakeholder/actor (n = 29), interdependent/interconnected (n = 28), emergence (n = 26), and adaptability/adaptation (n = 26). Common ST methods and tools were largely underutilized. Social network analysis was the most commonly used method (n = 4), and innovation or change management history was the most frequently used tool (n = 11). Four overarching themes were identified; the importance of the interdependent and interconnected nature of a health system, characteristics of leaders in a complex adaptive system, the benefits of using ST, and barriers to implementing ST. This review revealed that while much has been written about the potential benefits of applying ST to health, it has yet to completely transition from theory to practice. There is however evidence of the practical use of an ST lens as well as specific methods and tools. With clear examples of ST applications, the global health community will be better equipped to understand and address key health challenges. © 2017 John Wiley & Sons, Ltd.

  20. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  1. The use of case tools in OPG safety analysis code qualification

    International Nuclear Information System (INIS)

    Pascoe, J.; Cheung, A.; Westbye, C.

    2001-01-01

    Ontario Power Generation (OPG) is currently qualifying its critical safety analysis software. The software quality assurance (SQA) framework is described. Given the legacy nature of much of the safety analysis software, the reverse engineering methodology has been adopted. The safety analysis suite of codes was developed over a period of many years to differing standards of quality and had sparse or incomplete documentation. Key elements of the reverse engineering process require recovery of design information from existing coding. This recovery, if performed manually, could represent an enormous effort. Driven by a need to maximize productivity and enhance the repeatability and objectivity of software qualification activities, the decision was made to acquire or develop and implement Computer Aided Software Engineering (CASE) tools. This paper presents relevant background information on CASE tools and discusses how the OPG SQA requirements were used to assess the suitability of available CASE tools. Key findings from the application of CASE tools to the qualification of the OPG safety analysis software are discussed. (author)

  2. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    Science.gov (United States)

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
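
    As one example of the quantification strategies the surveyed tools implement, the widely used delta-delta-Ct calculation fits in a few lines; the Ct values below are hypothetical:

        # hedged sketch of the delta-delta-Ct relative quantification method
        target_treated, ref_treated = 24.1, 18.0    # Ct: gene of interest / reference
        target_control, ref_control = 26.5, 18.2

        ddct = (target_treated - ref_treated) - (target_control - ref_control)
        fold_change = 2 ** (-ddct)                  # assumes ~100% PCR efficiency
        print(f"ddCt = {ddct:.2f}, fold change = {fold_change:.2f}")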

  3. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis.

    Science.gov (United States)

    Kuleesha, Yadav; Puah, Wee Choo; Lin, Feng; Wasser, Martin

    2014-01-01

    During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis. We designed a new tool to

  4. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis

    Science.gov (United States)

    2014-01-01

    Background During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. Results We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis
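
    A minimal sketch of the segmentation-and-measurement step such a pipeline performs, here with scikit-image on a synthetic frame rather than FMAj's actual ImageJ implementation:

        # hedged sketch of segmentation and measurement on a synthetic image
        import numpy as np
        from skimage import filters, measure

        rng = np.random.default_rng(4)
        img = rng.normal(0.1, 0.05, (128, 128))
        img[30:60, 20:100] += 0.8                 # one bright "muscle fiber"

        mask = img > filters.threshold_otsu(img)  # global threshold
        labels = measure.label(mask)              # connected components
        for region in measure.regionprops(labels):
            if region.area > 50:                  # ignore speckle
                print(f"fiber area: {region.area} px")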

  5. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  6. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas, and cover both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.

  7. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: a.halog@uq.edu.au [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)

    2017-12-20

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and the informal economy are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of the previous studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, particularly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  8. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.

    2017-01-01

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and the informal economy are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of the previous studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, particularly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  9. BIOIMPEDANCE VECTOR ANALYSIS AS A TOOL FOR DETERMINATION AND ADJUSTMENT OF DRY WEIGHT IN HEMODIALYSIS PATIENTS

    OpenAIRE

    Ximena Atilano; José Luis.Miguel; Jorge Martínez; Rafael Sánchez; Rafael Selgas

    2012-01-01

    The hemodialysis (HD) patient is fluid overloaded, even when there is no apparent edema. Due to this, it is vital to know the dry weight. No clinical or laboratory parameters are reliable, simple and accessible for this purpose. Bioelectrical impedance has been applied to estimate body fluids and dry weight. The purpose was to use bioelectrical vector analysis (BIVA) as a tool to adjust the intensity of ultrafiltration and achievement of dry weight in HD patients. We performed monthly me...

  10. Automated patterning and probing with multiple nanoscale tools for single-cell analysis.

    Science.gov (United States)

    Li, Jiayao; Kim, Yeonuk; Liu, Boyin; Qin, Ruwen; Li, Jian; Fu, Jing

    2017-10-01

    The nano-manipulation approach that combines Focused Ion Beam (FIB) milling and various imaging and probing techniques enables researchers to investigate cellular structures in three dimensions. Such a fusion approach, however, requires extensive effort on locating and examining randomly-distributed targets, due to the limited Field of View (FOV) when high magnification is desired. In the present study, we present the development that automates 'pattern and probe' particularly for single-cell analysis, achieved by computer-aided tools including feature recognition and geometric planning algorithms. Scheduling of serial FOVs for imaging and probing of multiple cells was treated as a rectangle covering problem, and optimal or near-optimal solutions were obtained with the heuristics developed. FIB milling was then employed automatically, followed by downstream analysis using Atomic Force Microscopy (AFM) to probe the cellular interior. Our strategy was applied to examine bacterial cells (Klebsiella pneumoniae) and achieved high efficiency with limited human intervention. The developed algorithms can be easily adapted and integrated with different imaging platforms towards high-throughput imaging analysis of single cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
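
    The "rectangle covering" formulation above lends itself to simple greedy heuristics. The sketch below is a minimal, hypothetical illustration (not the paper's algorithm): it repeatedly anchors a fixed-size FOV at the left-most uncovered cell centroid and slides it vertically to capture as many remaining cells as possible.

```python
# Greedy FOV planning sketch: cover all cell centroids with as few
# fixed-size rectangles as possible. Illustrative only.

def plan_fovs(points, fov_w, fov_h):
    """Return a list of (x, y) lower-left FOV corners covering every point."""
    remaining = set(points)
    fovs = []
    while remaining:
        px, _ = min(remaining)            # anchor strip at left-most point
        best_fov, best_hit = None, set()
        for qx, qy in remaining:
            if not (px <= qx <= px + fov_w):
                continue                  # point outside the vertical strip
            hit = {(x, y) for (x, y) in remaining
                   if px <= x <= px + fov_w and qy <= y <= qy + fov_h}
            if len(hit) > len(best_hit):  # keep the most-covering placement
                best_fov, best_hit = (px, qy), hit
        fovs.append(best_fov)
        remaining -= best_hit
    return fovs

cells = [(1, 2), (2, 3), (10, 10), (11, 12), (30, 5)]
print(plan_fovs(cells, fov_w=5, fov_h=5))   # three FOVs cover all five cells
```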

  11. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  12. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

    Full Text Available Nexus analysis identifies and explores the synergies and trade-offs between energy, food and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The nexus is, thus, a valuable analytical and policy-design supporting tool to address the widely discussed links between bioenergy, food and water. In fact, the Nexus provides a more integrative and broad approach in relation to the single isolated system approach that characterizes many bioenergy analyses and policies of the last decades. In particular, for the South of Mozambique, charcoal production, food insecurity and water scarcity have been related in separate studies and, thus, it would be expected that Nexus analysis has the potential to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, nor is there an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analytical and design framework based on a design tool specifically developed for Bioenergy (the 2MBio). The results reveal that Nexus analysis is "blind" to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high-level systems analysis of Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  13. A Benchmarking Analysis of Open-Source Business Intelligence Tools in Healthcare Environments

    Directory of Open Access Journals (Sweden)

    Andreia Brandão

    2016-10-01

    Full Text Available In recent years, a wide range of Business Intelligence (BI) technologies have been applied to different areas in order to support the decision-making process. BI enables the extraction of knowledge from the data stored. The healthcare industry is no exception, and so BI applications have been under investigation across multiple units of different institutions. Thus, in this article, we intend to analyze some open-source/free BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. For this purpose, six BI tools were selected, analyzed, and tested in a practical environment. Then, a comparison metric and a ranking were defined for the tested applications in order to choose the one that best applies to the extraction of useful knowledge and clinical data in a healthcare environment. Finally, a pervasive BI platform was developed using a real case in order to prove the tool's viability.

  14. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    Science.gov (United States)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  15. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  16. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments, and analysis of engineering design and design problem solving methods. The developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  17. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    Science.gov (United States)

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  18. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  19. INTERFACING INTERACTIVE DATA ANALYSIS TOOLS WITH THE GRID: THE PPDG CS-11 ACTIVITY

    International Nuclear Information System (INIS)

    Perl, Joseph

    2003-01-01

    For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's "remote access" technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics Data Grid project (www.ppdg.net) has recently embarked on an effort to "Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services". The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments to determine if existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end-user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed.

  20. Laser-induced breakdown spectroscopy applied to the characterization of rock by support vector machine combined with principal component analysis

    International Nuclear Information System (INIS)

    Yang Hong-Xing; Fu Hong-Bo; Wang Hua-Dong; Jia Jun-Wei; Dong Feng-Zhong; Sigrist, Markus W

    2016-01-01

    Laser-induced breakdown spectroscopy (LIBS) is a versatile tool for both qualitative and quantitative analysis. In this paper, LIBS combined with principal component analysis (PCA) and support vector machine (SVM) is applied to rock analysis. Fourteen emission lines including Fe, Mg, Ca, Al, Si, and Ti are selected as analysis lines. Good accuracy (91.38% for the real rock) is achieved by using SVM to analyze the spectroscopic peak-area data processed by PCA. Combining PCA and SVM not only reduces noise and dimensionality, which improves the efficiency of the program, but also solves the problem of linear inseparability. By this method, the ability of LIBS to classify rock is validated. (paper)
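
    The PCA-plus-SVM step described above corresponds to a standard pattern-recognition pipeline. A minimal sketch with scikit-learn, using synthetic stand-ins for the fourteen peak-area features and the rock-class labels (the real data and tuning are not reproduced here):

```python
# Sketch of a PCA + SVM classification pipeline; data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 14))       # 14 peak areas (Fe, Mg, Ca, Al, Si, Ti, ...)
y = rng.integers(0, 3, size=60)     # 3 hypothetical rock classes

# Standardize, reduce dimensionality with PCA, then classify with an RBF SVM.
model = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
model.fit(X[:40], y[:40])
print("held-out accuracy:", model.score(X[40:], y[40:]))  # near chance on noise
```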

  1. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  2. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    Science.gov (United States)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    The use of entropy in hydrology and water resources has been applied to various applications. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by providing determination of the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. Also, as an evaluation tool, statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, the SEA is to be extended for application to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g., leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all possible nitrogen compounds which may occur during the water treatment process are taken into account and quantified in their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs; the result of this management tool would be a determination of the efficiency of WWTPs. By improving and optimizing the efficiency
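
    The core SEA quantity is how evenly a substance is distributed over a set of flows. A minimal sketch of that computation, with invented nitrogen loads and without the compound-specific weighting that the extended method adds:

```python
# Sketch of a normalized statistical-entropy computation for substance flows.
# Loads are illustrative; this is not the published eSEA implementation.
import math

def relative_entropy(loads):
    """Shannon entropy of a substance partitioning, normalized to [0, 1]."""
    total = sum(loads)
    shares = [x / total for x in loads if x > 0]
    h = -sum(p * math.log(p) for p in shares)
    return h / math.log(len(loads))   # H / H_max; 0 = fully concentrated

# Nitrogen leaving a WWTP as N2 off-gas, effluent nitrate, and sludge:
print(relative_entropy([70.0, 20.0, 10.0]))   # ~0.73; lower = more concentrated
```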

  3. Finite Element Analysis as a response to frequently asked questions of machine tool mechanical design-engineers

    Directory of Open Access Journals (Sweden)

    Kehl Gerhard

    2017-01-01

    Full Text Available The finite element analysis (FEA) nowadays is indispensable in the product development of machining centres and production machinery for metal cutting processes. It enables extensive static, dynamic and thermal simulation of digital prototypes of machine tools before production start-up. But until now little reflection has been given to which are the most pressing questions to be answered in this application field, with the intention of aligning the modelling and simulation methods with substantial requirements. Based on 3D CAD geometry data for a modern machining centre (Deckel-Maho-Gildemeister DMG 635 V eco), merely the basic steps of a static analysis are reconstructed by FEA. In particular, the two questions most frequently asked by the design departments of machine tool manufacturers are discussed and highlighted. For this, authentic simulation results are used, whose selection is a consequence of long-lasting experience in the industrial application of FEA in the design process chain. Noting that such machine tools are mechatronic systems applying a considerable number of actuators, sensors and controllers in addition to mechanical structures, the answers to those core questions are required for design enhancement, to save costs and to improve the productivity and the quality of machined workpieces.

  4. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which has a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, and the heterogeneous and homogeneous interaction of the organically associated elements, must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity through grey-scale binning of the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity

  5. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available at the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
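
    The co-occurrence idea described above can be illustrated with a toy merging rule: keep a taxon only if enough tools report it, then average and renormalize abundances. This is a hypothetical sketch, not MetaMeta's actual integration code:

```python
# Toy co-occurrence-based merging of taxonomic profiles from several tools.
from collections import defaultdict

def merge_profiles(profiles, min_support=2):
    """profiles: list of {taxon: relative_abundance} dicts, one per tool."""
    support = defaultdict(int)
    abundance = defaultdict(float)
    for profile in profiles:
        for taxon, ab in profile.items():
            support[taxon] += 1
            abundance[taxon] += ab
    # Keep taxa reported by at least `min_support` tools, average abundances.
    merged = {t: abundance[t] / support[t]
              for t in support if support[t] >= min_support}
    norm = sum(merged.values())
    return {t: ab / norm for t, ab in merged.items()}

tool_a = {"E. coli": 0.6, "K. pneumoniae": 0.4}
tool_b = {"E. coli": 0.5, "K. pneumoniae": 0.3, "spurious sp.": 0.2}
print(merge_profiles([tool_a, tool_b]))   # the unsupported taxon is dropped
```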

  6. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural mechanics computing codes serve as tools for assessing leak and break probabilities of nuclear piping components. Probabilistic fracture mechanics tools were compared in different benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanics codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue in LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated, in order to identify the reasons for differences in the results of different codes. The comparison of the tools is used for further improvement of the codes applied by the partners.

  7. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guide book for applied research on environmental monitoring using Instrumental Neutron Activation Analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter, analytical methodologies, data evaluation and interpretation, and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  8. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigor of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that

  9. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
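
    The vortex-shedding concern can be made concrete with the Strouhal relation f = St·U/D: when the shedding frequency approaches the frequency of a lightly damped structural mode, lock-in can amplify the response far beyond mean drag. A back-of-the-envelope sketch with illustrative numbers (not taken from the NASA tool):

```python
# Strouhal-relation check of shedding frequency vs a structural mode frequency.
# All numbers are illustrative placeholders.
def shedding_frequency(wind_speed, diameter, strouhal=0.2):
    """f = St * U / D; St ~ 0.2 for a circular cylinder over a wide
    Reynolds-number range."""
    return strouhal * wind_speed / diameter

f_shed = shedding_frequency(wind_speed=15.0, diameter=3.7)   # m/s, m -> Hz
f_mode = 0.8                                                 # Hz, first bending mode
print(f"shedding {f_shed:.2f} Hz vs mode {f_mode:.2f} Hz")
if abs(f_shed - f_mode) / f_mode < 0.15:
    print("lock-in risk: dynamic loads may greatly exceed mean drag loads")
```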

  10. ICT Tools of Professional Teacher Activity: A Comparative Analysis of Russian and European Experience

    Directory of Open Access Journals (Sweden)

    Tatiana N.

    2018-03-01

    Full Text Available Introduction: electronic, distance and blended educational technologies are actively used in the modern teaching and learning process. The relevance of the study is predetermined by the necessity to consolidate teachers' competencies in the field of ICT tools. The purpose of the article is to study and compare the competences of Russian and European teachers in using pedagogical ICT tools. Materials and Methods: comparison and analysis of domestic and foreign pedagogical practices are used. Data were obtained with the help of questionnaires elaborated for teachers with sufficient experience in the use of ICT. Results: the results of a comparative analysis of data characterising the experience of pedagogical ICT tools application by teachers of Russian and foreign universities are presented. Similar trends and problem areas were identified. They relate both to the use of information technology and electronic educational resources and to the variability of the educational opportunities. The obtained results show that the educational requests of students in the electronic environment are not always sufficiently recognised and taken into account by teachers. The revealed general directions of research in the area of ICT tools application in teaching activity indicate tendencies towards the integration of the Russian and European experience into the global information and educational space. Discussion and Conclusions: in summary, Russian and foreign teachers have similar competencies in the use of educational ICT tools. They apply the tools to the learning process with varying intensity depending on the experience of distance educational services implementation, the policy of the educational institution, and awareness of the specifics of blended learning. The practical significance of the results is the following: firstly, the directions that need to be strengthened in vocational training programs for future and practicing teachers are identified; secondly

  11. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  12. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  13. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    Science.gov (United States)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  14. Risk analysis for dengue suitability in Africa using the ArcGIS predictive analysis tools (PA tools).

    Science.gov (United States)

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2016-06-01

    Risk maps identifying suitable locations for infection transmission are important for public health planning. Data on dengue infection rates are not readily available in most places where the disease is known to occur. A newly available add-in to Esri's ArcGIS software package, the ArcGIS Predictive Analysis Toolset (PA Tools), was used to identify locations within Africa with environmental characteristics likely to be suitable for transmission of dengue virus. A more accurate, robust, and localized (1 km × 1 km) dengue risk map for Africa was created based on bioclimatic layers, elevation data, high-resolution population data, and other environmental factors that a search of the peer-reviewed literature showed to be associated with dengue risk. Variables related to temperature, precipitation, elevation, and population density were identified as good predictors of dengue suitability. Areas of high dengue suitability occur primarily within West Africa and parts of Central Africa and East Africa, but even in these regions the suitability is not homogenous. This risk mapping technique for an infection transmitted by Aedes mosquitoes draws on entomological, epidemiological, and geographic data. The method could be applied to other infectious diseases (such as Zika) in order to provide new insights for public health officials and others making decisions about where to increase disease surveillance activities and implement infection prevention and control efforts. The ability to map threats to human and animal health is important for tracking vectorborne and other emerging infectious diseases and modeling the likely impacts of climate change. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Participatory tools working with crops, varieties and seeds. A guide for professionals applying participatory approaches in agrobiodiversity management, crop improvement and seed sector development

    NARCIS (Netherlands)

    Boef, de W.S.; Thijssen, M.H.

    2007-01-01

    Outline of the guide: Within our training programmes on local management of agrobiodiversity, participatory crop improvement and the support of local seed supply, participatory tools get ample attention. Tools are dealt with theoretically, are practised in class situations, but are also applied in

  16. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  17. The Significance of Regional Analysis in Applied Geography.

    Science.gov (United States)

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  18. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    Science.gov (United States)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source nor sense of unified style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field, as well as a remaining obstacle for established groups hoping to contribute in a manner comparable to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundations of these software tools and libraries exist within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  19. Analysis of the functionality of free CASE-tools for database design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Full Text Available The introduction of database-design CASE-technologies into the educational process requires significant costs for the purchase of software. A possible solution could be the use of free software analogues. At the same time, this kind of substitution should be based on an even-handed comparison of the functional characteristics and features of operation of these programs. The purpose of the article is a review of free and non-profit CASE-tools for database design, as well as their classification on the basis of an analysis of their functionality. When writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE-tools for database design was made exclusively empirically, through direct work with the software products. The analysis of the tools' functionality allows two categories of CASE-tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL-code editor, and means of data export/import. CASE-systems in the first category can be used to design and develop simple databases and manage data, as well as to administer database servers. A distinctive feature of the second category of CASE-tools for database design (full-featured systems) is the presence of a visual designer, allowing construction of the database model and automatic creation of the database on the server based on this model. CASE-systems in this category can be used for the design and development of databases of any structural complexity, as well as for database server administration. The article concluded that the

  20. Applied Meteorology Unit (AMU) Quarterly Report - Fourth Quarter FY-09

    Science.gov (United States)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Wheeler, Mark

    2009-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the fourth quarter of Fiscal Year 2009 (July - September 2009). Task reports include: (1) Peak Wind Tool for User Launch Commit Criteria (LCC), (2) Objective Lightning Probability Tool, Phase III, (3) Peak Wind Tool for General Forecasting, Phase II, (4) Update and Maintain Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS), (5) Verify MesoNAM Performance, and (6) Develop a Graphical User Interface to update selected parameters for the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT).

  1. Impacts of Integrated Marketing Communication Strategies Applied for Geographical Indications on Purchasing Behavior

    OpenAIRE

    Kırgız, Ayça

    2017-01-01

    The purpose of this study is to raise the awareness for products with geographical indication (GI), which directly influence the development of local and nation-wide economies, the marketing of tourism activities and the branding of destinations, and to investigate the integrated marketing communication (IMC) tools applied for realization of selling and the impact of such tools on shopping behavior. In this study, simple linear regression analysis has been used. The data analysis showed that the perceived qu...

  2. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  3. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  4. Applied decision analysis and risk evaluation

    International Nuclear Information System (INIS)

    Ferse, W.; Kruber, S.

    1995-01-01

    During 1994 the workgroup 'Applied Decision Analysis and Risk Evaluation' continued the work on the knowledge-based decision support system XUMA-GEFA for the evaluation of the hazard potential of contaminated sites. Additionally, a new research direction was started which aims at the support of a later stage of the treatment of contaminated sites: the clean-up decision. For the support of decisions arising at this stage, the methods of decision analysis will be used. Computational aids for evaluation and decision support were implemented, and a case study was initiated at a waste disposal site in Saxony which turns out to be a danger for the surrounding groundwater resource. (orig.)

  5. A model of integration among prediction tools: applied study to road freight transportation

    Directory of Open Access Journals (Sweden)

    Henrique Dias Blois

    Full Text Available Abstract This study has developed a scenario analysis model which integrates decision-making tools on investments: prospective scenarios (Grumbach Method) and systems dynamics (hard modeling), with an innovative multivariate analysis of experts. It was designed through the analysis and simulation of scenarios, showed which are the most striking events in the study object, and highlighted the actions that could redirect the future of the analyzed system. Moreover, predictions are likely to be developed through the generated scenarios. The model has been validated empirically with road freight transport data from the state of Rio Grande do Sul, Brazil. The results showed that the model contributes to the analysis of investment because it identifies probabilities of events that impact decision making, and identifies priorities for action, reducing uncertainties about the future. Moreover, it allows an interdisciplinary discussion that correlates different areas of knowledge, fundamental when more consistency in creating scenarios is desired.

  6. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    Science.gov (United States)

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
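
    A typical example of the "basic genetic analysis" such an add-in performs is a Hardy-Weinberg equilibrium test. SNP_tools itself is written in VBA; the following is a plain-Python sketch of the same chi-square test with made-up genotype counts:

```python
# Hardy-Weinberg equilibrium chi-square test; genotype counts are invented.
def hwe_chi_square(n_aa, n_ab, n_bb):
    """Chi-square statistic (1 df) comparing observed genotype counts with
    Hardy-Weinberg expectations."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)           # frequency of allele A
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_aa, n_ab, n_bb]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# ~0.12 here; values above 3.84 would reject HWE at the 5% level (1 df).
print(hwe_chi_square(n_aa=800, n_ab=190, n_bb=10))
```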

  7. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were car...

  8. An introduction to Item Response Theory and Rasch Analysis of the Eating Assessment Tool (EAT-10).

    Science.gov (United States)

    Kean, Jacob; Brodke, Darrel S; Biber, Joshua; Gross, Paul

    2018-03-01

    Item response theory has its origins in educational measurement and is now commonly applied in health-related measurement of latent traits, such as function and symptoms. This application is due in large part to gains in the precision of measurement attributable to item response theory and corresponding decreases in response burden, study costs, and study duration. The purpose of this paper is twofold: introduce basic concepts of item response theory and demonstrate this analytic approach in a worked example, a Rasch model (1PL) analysis of the Eating Assessment Tool (EAT-10), a commonly used measure for oropharyngeal dysphagia. The results of the analysis were largely concordant with previous studies of the EAT-10 and illustrate for brain impairment clinicians and researchers how IRT analysis can yield greater precision of measurement.
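
    For readers unfamiliar with the Rasch (1PL) model named above, its standard form is the logistic relation between a person's latent trait level θ_j (here, dysphagia severity) and an item's difficulty b_i:

```latex
% Rasch (1PL) model: the probability that person j endorses item i depends
% only on the difference between trait level theta_j and item difficulty b_i.
\[
  P(X_{ij} = 1 \mid \theta_j, b_i)
    = \frac{\exp(\theta_j - b_i)}{1 + \exp(\theta_j - b_i)}
\]
```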

  9. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  10. An applied artificial intelligence approach towards assessing building performance simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

    With the development of modern computer technology, a large number of building energy simulation tools are available on the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability, given the building information at hand, which will serve as input for the tool. This paper presents an approach towards comparing building performance simulation results with actual measurements, using artificial neural networks (ANN) for predicting building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired over 1 week in the case-study building, called the Solar House. The predicted results show a good fit, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, the Green Building Studio web tool, eQuest and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, within 3% mean absolute error. (author)
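
    As a rough illustration of the kind of comparison being made (not the authors' implementation), the sketch below fits a small neural network to synthetic hourly consumption data and reports the mean absolute error in percent; the data, features and network size are all placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder data: hour-of-day and outdoor temperature -> consumption (kWh).
# A real study would use a week of measured data from the building.
X = rng.uniform([0, -5], [23, 30], size=(168, 2))      # one week of hours
y = 5 + 0.3 * X[:, 0] + 0.8 * np.maximum(X[:, 1] - 18, 0) + rng.normal(0, 0.2, 168)

train, test = slice(0, 120), slice(120, 168)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
ann.fit(X[train], y[train])

pred = ann.predict(X[test])
mae_pct = 100 * np.mean(np.abs(pred - y[test]) / y[test])  # mean absolute error, %
print(f"mean absolute error: {mae_pct:.1f}%")
```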

  11. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases the integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow the user to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project, we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.

  12. Automated cell analysis tool for a genome-wide RNAi screen with support vector machine based supervised learning

    Science.gov (United States)

    Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen

    2011-03-01

    RNAi-based high-throughput microscopy screens have become an important tool in the biological sciences for decrypting the mostly unknown biological functions of human genes. However, manual analysis is impossible for such screens, since the number of image data sets can often be in the hundreds of thousands. Reliable automated tools are thus required to analyse fluorescence microscopy image data sets, which usually contain two or more reaction channels. The image analysis tool presented here is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels, and the investigated cells can appear in different phenotypes. The main issues of the image processing task are an automatic cell segmentation, which has to be robust and accurate for all phenotypes, and a successive phenotype classification. The cell segmentation is done in two steps: the cell nuclei are segmented first, and a classifier-enhanced region growing then segments the cells on the basis of the nuclei. The classification of the cells is realized by a support vector machine, which has to be trained manually using supervised learning. Furthermore, the tool is brightness invariant, allowing for different staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied to an RNAi screen containing three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
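
    The final classification step can be conveyed in a short sketch; the per-cell features and training data below are hypothetical stand-ins for what such a pipeline would extract after segmentation, and scikit-learn's SVC takes the place of the screen-specific SVM.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical per-cell features extracted after segmentation:
# [mean intensity ch1, mean intensity ch2, cell area, nucleus/cell ratio]
X_train = np.array([
    [0.82, 0.10, 410, 0.35],   # phenotype 0: plasma-membrane localization
    [0.15, 0.75, 380, 0.40],   # phenotype 1: intracellular accumulation
    [0.80, 0.12, 430, 0.33],
    [0.20, 0.70, 395, 0.42],
])
y_train = np.array([0, 1, 0, 1])  # labels assigned manually (supervised learning)

# Feature scaling keeps intensity and area on comparable ranges for the SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

new_cell = np.array([[0.78, 0.14, 402, 0.36]])
print("predicted phenotype:", clf.predict(new_cell)[0])
```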

  13. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  14. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  15. Applying Pragmatics Principles for Interaction with Visual Analytics.

    Science.gov (United States)

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  16. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    International Nuclear Information System (INIS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and of the intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user
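
    These metrics are commonly tied together through what the food safety literature calls the ICMSF equation; the form below (initial hazard level H0, cumulative reductions ΣR and increases ΣI, all in log10 units) is standard in that literature and is shown here for orientation, not quoted from the paper.

```latex
% ICMSF relation linking process performance to the Food Safety Objective
H_0 - \Sigma R + \Sigma I \leq \mathrm{FSO}
```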

  17. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos, E-mail: pskan@aua.gr; Psomas, Antonios, E-mail: pskan@aua.gr; Paramythiotis, Spyridon, E-mail: pskan@aua.gr [Laboratory of Food Quality Control and Hygiene, Department of Food Science and Technology, Agricultural University of Athens, Iera Odos 75, 118 55, Athens (Greece)

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and of the intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user

  18. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    Science.gov (United States)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and of the intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user-friendly software

  19. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in the case of large object-oriented programs, due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis...

  20. Prediction Of Abrasive And Diffusive Tool Wear Mechanisms In Machining

    Science.gov (United States)

    Rizzuti, S.; Umbrello, D.

    2011-01-01

    Tool wear prediction is regarded as a very important task for maximizing tool performance, minimizing cutting costs and improving workpiece quality in cutting. In this research work, an experimental campaign was carried out under varying cutting conditions with the aim of measuring both crater and flank tool wear during machining of AISI 1045 steel with an uncoated carbide tool P40. In parallel, a FEM-based analysis was developed in order to study the tool wear mechanisms, also taking into account the influence of the cutting conditions and the temperature reached on the tool surfaces. The results show that, when the temperature of the tool rake surface is lower than the activation temperature of the diffusive phenomenon, the wear rate can be estimated by applying an abrasive model. In contrast, in the tool area where the temperature is higher than the diffusive activation temperature, the wear rate can be evaluated by applying a diffusive model. Finally, for temperatures between the above-cited values, a combined abrasive-diffusive wear model made it possible to correctly evaluate the tool wear phenomena.
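
    A temperature-switched wear law of the kind described can be sketched as follows; the functional forms (an Archard-style abrasive term, an Arrhenius-type diffusive term) and every constant are illustrative assumptions, not the calibrated models of the paper.

```python
import math

# Illustrative constants (not from the paper)
T_ACT = 800.0      # diffusive activation temperature, degC (assumed)
K_ABR = 1.2e-9     # abrasive wear coefficient (assumed)
K_DIF = 3.5e-2     # diffusive pre-exponential factor (assumed)
Q_OVER_R = 9000.0  # activation energy over gas constant, K (assumed)

def wear_rate(sigma_n, v_s, T_celsius):
    """Local tool wear rate as a function of normal stress (MPa),
    sliding velocity (m/s) and contact temperature (degC).
    Below T_ACT an abrasive (Archard-style) law applies;
    above it, an Arrhenius-type diffusive law dominates."""
    T_kelvin = T_celsius + 273.15
    if T_celsius < T_ACT:
        return K_ABR * sigma_n * v_s                 # abrasive regime
    return K_DIF * math.exp(-Q_OVER_R / T_kelvin)    # diffusive regime

print(wear_rate(sigma_n=600.0, v_s=2.5, T_celsius=650.0))
print(wear_rate(sigma_n=600.0, v_s=2.5, T_celsius=950.0))
```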

  1. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    Science.gov (United States)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  2. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    International Nuclear Information System (INIS)

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; Garcia-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; Da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document

  3. Modern problems in applied analysis

    CERN Document Server

    Rogosin, Sergei

    2018-01-01

    This book features a collection of recent findings in Applied Real and Complex Analysis that were presented at the 3rd International Conference “Boundary Value Problems, Functional Equations and Applications” (BAF-3), held in Rzeszow, Poland on 20-23 April 2016. The contributions presented here develop a technique related to the scope of the workshop and touching on the fields of differential and functional equations, complex and real analysis, with a special emphasis on topics related to boundary value problems. Further, the papers discuss various applications of the technique, mainly in solid mechanics (crack propagation, conductivity of composite materials), biomechanics (viscoelastic behavior of the periodontal ligament, modeling of swarms) and fluid dynamics (Stokes and Brinkman type flows, Hele-Shaw type flows). The book is addressed to all readers who are interested in the development and application of innovative research results that can help solve theoretical and real-world problems.

  4. Fourier convergence analysis applied to neutron diffusion Eigenvalue problem

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Noh, Jae Man; Joo, Hyung Kook

    2004-01-01

    Fourier error analysis has been a standard technique for the stability and convergence analysis of linear and nonlinear iterative methods. Though the technique can be applied to eigenvalue problems as well, all Fourier convergence analyses to date have been performed only for fixed-source problems; a Fourier convergence analysis for an eigenvalue problem has never been reported. Lee et al. proposed new 2-D/1-D coupling methods and showed that the new ones are unconditionally stable, while one of the two existing ones is unstable at small mesh sizes, and that the new ones are better than the existing ones in terms of convergence rate. In this paper, the convergence of method A of reference 4 for the diffusion eigenvalue problem is analyzed by Fourier analysis. To the best of our knowledge, the Fourier convergence analysis presented in this paper is the first applied to a neutronics eigenvalue problem.
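
    The machinery behind such an analysis is the Fourier ansatz for the iteration error; the generic textbook form below (mode frequency ω, amplification factor ξ) is included only to make the method concrete and is not taken from the paper.

```latex
% Fourier ansatz: expand the iteration error in discrete modes and
% require every mode's amplification factor to stay below one
\epsilon^{(n)}(x) = \xi^{\,n} e^{i\omega x},
\qquad \text{convergence} \iff \rho = \max_{\omega}\, |\xi(\omega)| < 1
```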

  5. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
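
    The n-factor combinatorial idea mentioned here — exercising joint variations of every small subset of parameters instead of the full cross product — can be sketched briefly. The parameter names, values and the simple two-factor construction below are illustrative assumptions, not the tool's actual generator.

```python
from itertools import combinations, product

# Hypothetical simulation parameters: (nominal, candidate off-nominal values)
params = {
    "mass_kg":         (1.5,  [1.0, 2.0]),
    "thrust_n":        (7.5,  [5.0, 10.0]),
    "sensor_noise":    (0.02, [0.01, 0.05]),
    "controller_gain": (1.0,  [0.8, 1.2]),
}

def n_factor_cases(params, n=2):
    """For every subset of n parameters, vary them jointly over their
    off-nominal values while the remaining parameters stay nominal.
    This explores n-way interactions without the full cross product."""
    nominal = {k: v[0] for k, v in params.items()}
    for subset in combinations(params, n):
        for values in product(*(params[p][1] for p in subset)):
            case = dict(nominal)
            case.update(zip(subset, values))
            yield case

cases = list(n_factor_cases(params, n=2))
full = 1
for _, offs in params.values():
    full *= len(offs) + 1          # nominal plus off-nominal values
print(f"{len(cases)} two-factor cases vs {full} in the full cross product")
```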

  6. Analysis of Minimum Quantity Lubrication (MQL) for Different Coating Tools during Turning of TC11 Titanium Alloy

    Directory of Open Access Journals (Sweden)

    Sheng Qin

    2016-09-01

    Full Text Available The tool coating and the cooling strategy are two key factors when machining difficult-to-cut materials such as titanium alloys. In this paper, a diamond coating was deposited on a commercial carbide insert in an attempt to increase the machinability of TC11 alloy during the turning process. An uncoated carbide insert and a commercial Al2O3/TiAlN-coated tool were also tested for comparison. Furthermore, MQL was applied to improve the cutting condition. Cutting performance was analyzed by cutting force, cutting temperature and surface roughness measurements. Tool wear and tool life were evaluated to find a good match between tool coating and cooling strategy. According to the results, using MQL can slightly reduce the cutting force. By applying MQL, cutting temperatures and tool wear were greatly reduced. Moreover, MQL can affect the tool wear mechanism and tool failure modes. The tool life of the Al2O3/TiAlN-coated tool was prolonged by 88.4% under the MQL condition. Diamond-coated tools can obtain a good surface finish when cutting parameters and lubrication strategies are properly chosen.

  7. Applied multidimensional scaling and unfolding

    CERN Document Server

    Borg, Ingwer; Mair, Patrick

    2018-01-01

    This book introduces multidimensional scaling (MDS) and unfolding as data analysis techniques for applied researchers. MDS is used for the analysis of proximity data on a set of objects, representing the data as distances between points in a geometric space (usually of two dimensions). Unfolding is a related method that maps preference data (typically evaluative ratings of different persons on a set of objects) as distances between two sets of points (representing the persons and the objects, resp.). This second edition has been completely revised to reflect new developments and the coverage of unfolding has also been substantially expanded. Intended for applied researchers whose main interests are in using these methods as tools for building substantive theories, it discusses numerous applications (classical and recent), highlights practical issues (such as evaluating model fit), presents ways to enforce theoretical expectations for the scaling solutions, and addresses the typical mistakes that MDS/unfoldin...

  8. Control of solid tobacco emissions in industrial factories applying CFD tools

    Directory of Open Access Journals (Sweden)

    G Polanco

    2016-09-01

    Full Text Available The emission of light solid aromatic particles from any tobacco factory affects the surrounding inhabitants, commonly causing allergies, eye irritation and, of course, unpleasant odours; these emissions to the air must therefore be regulated. An increase in production must be considered when sizing the mechanisms used to achieve precipitation and final filtration before discharge to the atmosphere. A numerical tool was applied to study the internal behaviour of the low-velocity precipitation tunnel and discharge chimney of the refuse treatment system. The characterization of the two-phase flow streamlines allows the velocity gradient profiles across the whole tunnel to be determined, which are intimately related to the particle concentration and the location of deposition zones. The application of CFD techniques provides the basis for finding new design parameters to improve the precipitation tunnel's capability to manage the increased mass flow of particles due to changes in cigarette mass production.

  9. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  10. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.
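
    ROSE itself is a C/C++/Fortran compiler infrastructure, but the flavor of an automated scan for suspicious code constructs is easy to convey in miniature. The sketch below uses Python's ast module as a stand-in, and its single rule (flagging calls to eval/exec) is an invented example, not one of the project's actual rules.

```python
import ast

SUSPICIOUS_CALLS = {"eval", "exec"}  # invented example rule

def flag_suspicious(source: str, filename: str = "<input>"):
    """Walk the AST and report call sites that match a rule set,
    mimicking (in miniature) what a compiler-based authentication
    tool does over a full language grammar."""
    findings = []
    tree = ast.parse(source, filename=filename)
    for node in ast.walk(tree):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in SUSPICIOUS_CALLS:
                findings.append((filename, node.lineno, node.func.id))
    return findings

code = "x = eval(input())\nprint(x)\n"
for fname, line, call in flag_suspicious(code):
    print(f"{fname}:{line}: suspicious call to {call}()")
```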

  11. Tools for Authentication

    International Nuclear Information System (INIS)

    White, G.

    2008-01-01

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work

  12. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives.

    Science.gov (United States)

    Nichio, Bruno T L; Marchaukoski, Jeroniza Nunes; Raittz, Roberto Tadeu

    2017-01-01

    Nowadays, defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, genome annotation and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones for the prediction of orthologous groups. The literature in this field describes the problems that the majority of available tools exhibit, such as limited accuracy, the time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques) and the need for automation of the process without manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We list the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST "all-against-all" methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, like OrthoVenn (which uses a Venn diagram to show the relationships of ortholog groups generated by its algorithm); or proteinOrtho (which improves the accuracy of ortholog groups); or ReMark (tackling the integration of the pipeline to make the entry process automatic); or OrthAgogue (using algorithms developed to minimize processing
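
    The BLAST "all-against-all" strategy the review refers to is typically distilled into reciprocal best hits (RBH): two sequences are called orthologs when each is the other's best-scoring match. A minimal sketch of that selection step follows, with a fabricated score table standing in for precomputed alignment scores.

```python
# Fabricated all-against-all bit scores: (query, subject) -> score
scores = {
    ("A1", "B1"): 950, ("A1", "B2"): 300,
    ("A2", "B2"): 870, ("A2", "B1"): 250,
    ("B1", "A1"): 940, ("B1", "A2"): 240,
    ("B2", "A2"): 860, ("B2", "A1"): 290,
}

def best_hit(query, target_prefix):
    """Best-scoring subject for `query` among sequences of the other genome."""
    hits = {s: sc for (q, s), sc in scores.items()
            if q == query and s.startswith(target_prefix)}
    return max(hits, key=hits.get) if hits else None

genome_a = ["A1", "A2"]
orthologs = [(a, best_hit(a, "B")) for a in genome_a
             if best_hit(a, "B") and best_hit(best_hit(a, "B"), "A") == a]
print("reciprocal best hits:", orthologs)  # [('A1', 'B1'), ('A2', 'B2')]
```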

  13. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives

    Directory of Open Access Journals (Sweden)

    Bruno T. L. Nichio

    2017-10-01

    Full Text Available Nowadays, defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, genome annotation and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones for the prediction of orthologous groups. The literature in this field describes the problems that the majority of available tools exhibit, such as limited accuracy, the time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques) and the need for automation of the process without manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We list the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST "all-against-all" methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, like OrthoVenn (which uses a Venn diagram to show the relationships of ortholog groups generated by its algorithm); or proteinOrtho (which improves the accuracy of ortholog groups); or ReMark (tackling the integration of the pipeline to make the entry process automatic); or OrthAgogue (using algorithms developed to

  14. Applying decision trial and evaluation laboratory as a decision tool for effective safety management system in aviation transport

    Directory of Open Access Journals (Sweden)

    Ifeanyichukwu Ebubechukwu Onyegiri

    2016-10-01

    Full Text Available In recent years, in the aviation industry, weak engineering controls and lapses associated with safety management systems (SMSs) have been responsible for seemingly unprecedented disasters. A previous study confirmed the difficulties experienced by safety managers with SMSs and the need to direct research to this area of investigation for more insights and progress in the evaluation and maintenance of SMSs in the aviation industry. The purpose of this work is to examine the application of the Decision Trial and Evaluation Laboratory (DEMATEL) to the aviation industry in developing countries, illustrated using Nigerian aviation survey data for validation of the method. The advantage of the procedure over other decision-making methods is its ability to incorporate feedback into the decision making. It also affords the opportunity to break down the complex, multivariate aviation SMS components and elements by analysing the contributions of the diverse system criteria from the perspective of cause and effect, which in turn yields easier and yet more effective pre-corrective actions against aviation transportation accidents. In this work, six revised components of an SMS were identified, and DEMATEL was applied to obtain their direct and indirect impacts and influences on overall SMS performance. Data were collected by survey questionnaire, which served as the initial direct-relation matrix, coded in Matlab software for establishing the impact relation map (IRM). The IRM was then plotted in MS Excel spreadsheet software. From our results, safety structure and regulation has the highest impact level on an SMS, with a corresponding positive relation level value. In conclusion, the results agree with those of previous researchers who used grey relational analysis. Thus, DEMATEL serves as a great tool and resource for the safety manager.
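
    The core DEMATEL computation is compact enough to sketch. The 3x3 direct-relation matrix below is fabricated (the study itself used six SMS components); the normalization and the total-relation formula are the standard ones from the DEMATEL literature.

```python
import numpy as np

# Fabricated 3x3 direct-relation matrix (expert ratings of how strongly
# factor i influences factor j); the study itself used six components.
X = np.array([[0, 3, 2],
              [1, 0, 3],
              [2, 1, 0]], dtype=float)

# Standard DEMATEL steps:
N = X / max(X.sum(axis=1).max(), X.sum(axis=0).max())  # normalize
T = N @ np.linalg.inv(np.eye(3) - N)                   # total-relation matrix

D = T.sum(axis=1)   # total influence dispatched by each factor
R = T.sum(axis=0)   # total influence received by each factor

for i, (p, r) in enumerate(zip(D + R, D - R)):
    role = "cause" if r > 0 else "effect"
    print(f"factor {i}: prominence D+R = {p:.2f}, relation D-R = {r:+.2f} ({role})")
```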

  15. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.

    Science.gov (United States)

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages) and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation, and it provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. Abbreviations: RDNAnalyzer, Random DNA Analyser; GUI, graphical user interface; XAML, Extensible Application Markup Language.
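
    The Nussinov algorithm that the tool builds on maximizes the number of complementary base pairs by dynamic programming. Below is a minimal, textbook implementation of the table fill (no traceback, and no minimum hairpin-loop length enforced, for brevity); it reflects the classical algorithm, not RDNAnalyzer's extended version.

```python
def nussinov_max_pairs(seq):
    """Fill the Nussinov DP table: N[i][j] = max base pairs in seq[i..j]."""
    pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
    n = len(seq)
    N = [[0] * n for _ in range(n)]
    for span in range(1, n):                  # subsequence length - 1
        for i in range(n - span):
            j = i + span
            best = N[i + 1][j]                # case: base i stays unpaired
            for k in range(i + 1, j + 1):     # case: base i pairs with base k
                if (seq[i], seq[k]) in pairs:
                    left = N[i + 1][k - 1] if k > i + 1 else 0
                    right = N[k + 1][j] if k < j else 0
                    best = max(best, 1 + left + right)
            N[i][j] = best
    return N[0][n - 1]

print(nussinov_max_pairs("GGGAAATCC"))  # maximum number of base pairs
```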

  16. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

    As the determination of ultrahigh reliability figures for safety-critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. Analyzing whether all these requirements are fulfilled is time-consuming and error-prone if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many "general-purpose" software analysis tools, both static and dynamic, which help in analyzing source code. However, they are not designed to assess adherence to the specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field, which are based on digital techniques and implemented in a high-level language, it is essential that the assessor or licenser has a tool with which as many aspects as possible of the high-level-language software can be qualified automatically and uniformly. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  17. An IMU-to-Body Alignment Method Applied to Human Gait Analysis

    Directory of Open Access Journals (Sweden)

    Laura Susana Vargas-Valencia

    2016-12-01

    Full Text Available This paper presents a novel calibration procedure as a simple, yet powerful, method to place and align inertial sensors with body segments. The calibration can be easily replicated without the need for any additional tools. The proposed method is validated in three different applications: a computer mathematical simulation; a simplified joint composed of two semi-spheres interconnected by a universal goniometer; and a real gait test with five able-bodied subjects. Simulation results demonstrate that, after the calibration method is applied, the joint angles are measured correctly, independently of the previous sensor placement on the joint, thus validating the proposed procedure. In the cases of the simplified joint and the real gait test with human volunteers, the method also performs correctly, although errors in the secondary planes appear when compared with the simulation results. We believe that such errors are caused by limitations of current inertial measurement unit (IMU) technology and fusion algorithms. In conclusion, the presented calibration procedure is an interesting option to solve the alignment problem when using IMUs for gait analysis.

  18. Parameter estimation and determinability analysis applied to Drosophila gap gene circuits

    Directory of Open Access Journals (Sweden)

    Jaeger Johannes

    2008-09-01

    Full Text Available Abstract Background: Mathematical modeling of real-life processes often requires the estimation of unknown parameters. Once the parameters are found by means of optimization, it is important to assess the quality of the parameter estimates, especially if parameter values are used to draw biological conclusions from the model. Results: In this paper we describe how the quality of parameter estimates can be analyzed. We apply our methodology to assess parameter determinability for gene circuit models of the gap gene network in early Drosophila embryos. Conclusion: Our analysis shows that none of the parameters of the considered model can be determined individually with reasonable accuracy, due to correlations between parameters. Therefore, the model cannot be used as a tool to infer quantitative regulatory weights. On the other hand, our results show that it is still possible to draw reliable qualitative conclusions on the regulatory topology of the gene network. Moreover, our analysis improves on previous analyses of the same model by identifying those interactions for which qualitative conclusions are reliable, and those for which they are ambiguous.
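
    One common way to quantify such determinability (not necessarily the authors' exact procedure) is to form a parameter covariance from the sensitivity Jacobian of the model residuals and inspect the implied correlations; a minimal numpy sketch with a fabricated Jacobian follows.

```python
import numpy as np

# Fabricated sensitivity Jacobian: residuals (rows) w.r.t. parameters (cols).
# Two strongly collinear columns model the correlated-parameter situation.
rng = np.random.default_rng(1)
base = rng.normal(size=(50, 1))
J = np.hstack([base,
               base * 0.98 + rng.normal(scale=0.02, size=(50, 1)),
               rng.normal(size=(50, 1))])

# Gauss-Newton approximation of the parameter covariance: (J^T J)^{-1}
cov = np.linalg.inv(J.T @ J)
std = np.sqrt(np.diag(cov))
corr = cov / np.outer(std, std)   # parameter correlation matrix

print(np.round(corr, 2))  # |corr| near 1 => pair not separately determinable
```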

  19. Applying New Diabetes Teaching Tools in Health-Related Extension Programming

    Science.gov (United States)

    Grenci, Alexandra

    2010-01-01

    In response to the emerging global diabetes epidemic, health educators are searching for new and better education tools to help people make positive behavior changes to successfully prevent or manage diabetes. Conversation Maps[R] are new learner-driven education tools that have been developed to empower individuals to improve their health…

  20. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results from the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis, considered individually and, when possible, in statistical combination.

  1. Applied linear algebra

    CERN Document Server

    Olver, Peter J

    2018-01-01

    This textbook develops the essential tools of linear algebra, with the goal of imparting technique alongside contextual understanding. Applications go hand-in-hand with theory, each reinforcing and explaining the other. This approach encourages students to develop not only the technical proficiency needed to go on to further study, but an appreciation for when, why, and how the tools of linear algebra can be used across modern applied mathematics. Providing an extensive treatment of essential topics such as Gaussian elimination, inner products and norms, and eigenvalues and singular values, this text can be used for an in-depth first course, or an application-driven second course in linear algebra. In this second edition, applications have been updated and expanded to include numerical methods, dynamical systems, data analysis, and signal processing, while the pedagogical flow of the core material has been improved. Throughout, the text emphasizes the conceptual connections between each application and the un...

  2. Animal research in the Journal of Applied Behavior Analysis.

    Science.gov (United States)

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.

  3. Finite elements for analysis and design

    CERN Document Server

    Akin, J E; Davenport, J H

    1994-01-01

    The finite element method (FEM) is an analysis tool for problem-solving used throughout applied mathematics, engineering, and scientific computing. Finite Elements for Analysis and Design provides a thoroughly revised and up-to-date account of this important tool and its numerous applications, with added emphasis on basic theory. Numerous worked examples are included to illustrate the material. Key features: Akin clearly explains the FEM, a numerical analysis tool for problem-solving throughout applied mathematics, engineering and scientific computing; basic theory has bee

  4. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    Science.gov (United States)

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and the weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists with a new tool for the analysis of their SBML models of biochemical and cellular processes.
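
    Of the global methods listed, the partial rank correlation coefficient (PRCC) is the simplest to sketch: rank-transform parameters and output, then correlate the residuals left after regressing out all other parameters. The implementation below is a generic textbook version run on a toy model, not SBML-SAT's code.

```python
import numpy as np
from scipy.stats import rankdata

def prcc(params, output):
    """Partial rank correlation of each parameter column with the output."""
    R = np.column_stack([rankdata(c) for c in params.T])
    y = rankdata(output)
    result = []
    for j in range(R.shape[1]):
        others = np.column_stack([np.ones(len(y)), np.delete(R, j, axis=1)])
        # Residuals after regressing out the remaining (ranked) parameters
        res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        res_y = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
        result.append(np.corrcoef(res_x, res_y)[0, 1])
    return result

rng = np.random.default_rng(0)
X = rng.uniform(0.1, 10.0, size=(500, 3))                  # 3 sampled parameters
y = X[:, 0] / (1 + X[:, 1]) + 0.01 * rng.normal(size=500)  # toy model output
print([f"{c:+.2f}" for c in prcc(X, y)])                   # param 2 near 0
```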

  5. Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.

    Science.gov (United States)

    Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M

    2012-04-01

    We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis.

  6. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have contributed significantly to improvements in cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (high speed steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones arising during the cutting process. The finite element method is used to solve the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  7. The cumulative verification image analysis tool for offline evaluation of portal images

    International Nuclear Information System (INIS)

    Wong, John; Yan Di; Michalski, Jeff; Graham, Mary; Halverson, Karen; Harms, William; Purdy, James

    1995-01-01

    Purpose: Daily portal images acquired using electronic portal imaging devices contain important information about the setup variation of the individual patient. The data can be used to evaluate the treatment and to derive corrections for the individual patient. The large volume of images also requires software tools for efficient analysis. This article describes the cumulative verification image analysis (CVIA) approach, specifically designed as an offline tool to extract quantitative information from daily portal images. Methods and Materials: The user interface, image and graphics display, and algorithms of the CVIA tool have been implemented in ANSI C using the X Window graphics standards. The tool consists of three major components: (a) definition of treatment geometry and anatomical information; (b) registration of portal images with a reference image to determine setup variation; and (c) quantitative analysis of all setup variation measurements. The CVIA tool is not automated. User interaction is required and preferred. Successful alignment of anatomies on portal images at present remains mostly dependent on clinical judgment. Predefined templates of block shapes and anatomies are used for image registration to enhance efficiency, taking advantage of the fact that much of the tool's operation is repeated in the analysis of daily portal images. Results: The CVIA tool is portable and has been implemented on workstations with different operating systems. Analysis of 20 sequential daily portal images can be completed in less than 1 h. The temporal information is used to characterize setup variation in terms of its systematic, random and time-dependent components. The cumulative information is used to derive block overlap isofrequency distributions (BOIDs), which quantify the effective coverage of the prescribed treatment area throughout the course of treatment. Finally, a set of software utilities is available to facilitate feedback of the information for
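
    The decomposition of setup variation into systematic and random components mentioned in the results is simple to state: across a patient's daily measurements, the mean shift is conventionally taken as the systematic error and the standard deviation as the random error. A small sketch with fabricated displacements follows.

```python
import numpy as np

# Fabricated daily setup displacements (mm) for one patient, one axis,
# measured by registering each portal image to the reference image.
shifts = np.array([2.1, 1.8, 2.5, 1.2, 2.9, 2.2, 1.7, 2.4, 2.0, 1.5])

systematic = shifts.mean()       # persistent offset for this patient
random_sd = shifts.std(ddof=1)   # day-to-day variability

print(f"systematic component: {systematic:.1f} mm")
print(f"random component (SD): {random_sd:.1f} mm")
```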

  8. Applied data mining for business and industry

    CERN Document Server

    Giudici, Paolo

    2009-01-01

    The increasing availability of data in our current, information overloaded society has led to the need for valid tools for its modelling and analysis. Data mining and applied statistical methods are the appropriate tools to extract knowledge from such data. This book provides an accessible introduction to data mining methods in a consistent and application oriented statistical framework, using case studies drawn from real industry projects and highlighting the use of data mining methods in a variety of business applications. Introduces data mining methods and applications. Covers classical and Bayesian multivariate statistical methodology as well as machine learning and computational data mining methods. Includes many recent developments such as association and sequence rules, graphical Markov models, lifetime value modelling, credit risk, operational risk and web mining. Features detailed case studies based on applied projects within industry. Incorporates discussion of data mining software, with case studies a

  9. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using a micro-end mill at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating the tool life, and optimizing the process. A numerical analysis and an experimental method are presented to investigate the chatter stability of the micro-end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formulas to predict cutting force and displacements are derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills using time domain and frequency domain methods, and the results are compared. Furthermore, a detailed time domain simulation of micro-end milling with straight-tooth and helical-tooth end mills is conducted, based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted and the simulation results for the different cutter geometries are compared in detail. The simulation results have important significance for the actual milling process.
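
    Time-domain chatter simulation of this kind rests on the regenerative effect: the instantaneous chip thickness depends on the present tool deflection and on the deflection one tooth period earlier. A minimal single-degree-of-freedom sketch with hypothetical modal and cutting parameters (a simplification, not the paper's variable-geometry model):

```python
import numpy as np

# Hypothetical 1-DOF modal parameters of the tool tip (e.g. from a modal test).
m, zeta, fn = 0.05, 0.02, 4000.0          # mass (kg), damping ratio, natural freq (Hz)
k = m * (2 * np.pi * fn) ** 2             # stiffness (N/m)
c = 2 * zeta * np.sqrt(k * m)             # damping (N s/m)
Kt, b, h0 = 8e8, 0.2e-3, 20e-6            # cutting coeff (N/m^2), depth (m), feed/tooth (m)

n_rpm, teeth = 30000, 2
T = 60.0 / (n_rpm * teeth)                # tooth passing period (s)
dt = T / 200.0
steps = 20000
x = np.zeros(steps)
v = 0.0

for i in range(1, steps):
    j = i - int(round(T / dt))            # sample index one tooth period ago
    x_prev = x[j] if j >= 0 else 0.0
    h = h0 + x_prev - x[i - 1]            # regenerative chip thickness
    F = Kt * b * max(h, 0.0)              # cutting force; zero if tool leaves the cut
    a = (F - c * v - k * x[i - 1]) / m    # Newton's second law
    v += a * dt                           # semi-implicit Euler step
    x[i] = x[i - 1] + v * dt

# A growing oscillation in the second half of the run suggests chatter.
print("late/early amplitude ratio:",
      np.abs(x[steps // 2:]).max() / np.abs(x[:steps // 2]).max())
```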

  10. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  11. Nanopore sequencing technology and tools for genome assembly: computational analysis of the current state, bottlenecks and future directions.

    Science.gov (United States)

    Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur

    2018-04-02

    Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, high error rates of the technology pose a challenge while generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages and performance bottlenecks. It is important to understand where the current tools do not perform well to develop better tools. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) the choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy. However, Minimap has a lower memory usage, and it is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious
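
    The overlap-assemble-polish pipeline analyzed above can be chained as follows; a sketch assuming the minimap2, miniasm, and racon binaries are on the PATH and that long reads sit in reads.fastq (a hypothetical file). The presets follow the tools' documented usage, but flags should be checked against the installed versions:

```python
import subprocess

reads = "reads.fastq"  # hypothetical input file of nanopore reads

# 1. All-vs-all read overlaps with minimap2 in its ONT preset.
with open("overlaps.paf", "w") as paf:
    subprocess.run(["minimap2", "-x", "ava-ont", reads, reads], stdout=paf, check=True)

# 2. Fast (but less accurate) assembly of the overlap graph with miniasm.
with open("assembly.gfa", "w") as gfa:
    subprocess.run(["miniasm", "-f", reads, "overlaps.paf"], stdout=gfa, check=True)

# Extract contig sequences from the GFA 'S' lines into FASTA.
with open("assembly.gfa") as gfa, open("assembly.fasta", "w") as fa:
    for line in gfa:
        if line.startswith("S"):
            _, name, seq = line.split("\t")[:3]
            fa.write(f">{name}\n{seq}\n")

# 3. Map the reads back to the draft, then polish with racon for accuracy.
with open("mapped.paf", "w") as paf:
    subprocess.run(["minimap2", "-x", "map-ont", "assembly.fasta", reads],
                   stdout=paf, check=True)
with open("polished.fasta", "w") as out:
    subprocess.run(["racon", reads, "mapped.paf", "assembly.fasta"],
                   stdout=out, check=True)
```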

  12. Introduction, comparison, and validation of Meta-Essentials : A free and simple tool for meta-analysis

    NARCIS (Netherlands)

    R. Suurmond (Robert); H.J. van Rhee (Henk); A. Hak (Tony)

    2017-01-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of ...
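
    The core computation behind any such meta-analysis tool can be illustrated with inverse-variance pooling; a minimal fixed-effect sketch on hypothetical effect sizes (Meta-Essentials itself offers richer models, e.g. random-effects):

```python
import numpy as np

# Hypothetical study effect sizes and their standard errors.
effects = np.array([0.30, 0.45, 0.12, 0.50])
ses = np.array([0.10, 0.15, 0.08, 0.20])

w = 1.0 / ses ** 2                      # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# Cochran's Q and I^2 heterogeneity statistics.
q = np.sum(w * (effects - pooled) ** 2)
df = len(effects) - 1
i2 = max(0.0, (q - df) / q) * 100

print(f"pooled effect = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"Q = {q:.2f}, I^2 = {i2:.1f}%")
```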

  13. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    Science.gov (United States)

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  14. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  15. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment ...

  16. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet ...

  17. Water-food-energy nexus index: analysis of water-energy-food nexus of crop's production system applying the indicators approach

    Science.gov (United States)

    El-Gafy, Inas

    2017-10-01

    Analysis of the water-food-energy nexus is the first step to assist decision makers in developing and evaluating national strategies that take the nexus into account. The main objective of the current research is to provide a method for decision makers to analyze the water-food-energy nexus of the crop production system at the national level and to carry out a quantitative assessment of it. Through the proposed method, indicators considering water and energy consumption, mass productivity, and economic productivity are suggested. Based on these indicators a water-food-energy nexus index (WFENI) was constructed. The study showed that the calculated WFENI scores of the Egyptian summer crops range from 0.21 to 0.79. Compared to onion (the highest WFENI score, i.e., the best), rice has the lowest WFENI among the summer food crops. An analysis of the water-food-energy nexus of forty-two Egyptian crops in year 2010 was carried out (energy consumed for irrigation represents 7.4% of the total energy footprint). WFENI can be applied to develop strategies for an optimal cropping pattern that minimizes water and energy consumption and maximizes their productivity. It can be applied as a holistic tool to evaluate progress in the national water and agricultural strategies. Moreover, WFENI could be applied yearly to evaluate the performance of water-food-energy nexus management.
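
    The abstract does not give the exact aggregation formula, so the following sketch only illustrates the general construction of such a composite index, assuming min-max normalization, inversion of consumption-type indicators, and equal weights; crop names and all numbers are invented:

```python
import numpy as np

# Hypothetical per-crop indicators: rows = crops, columns = indicators.
# Consumption indicators are better when low; productivity when high.
indicators = np.array([
    # water use, energy use, mass productivity, economic productivity
    [9.5, 3.2, 0.8, 0.6],   # rice  (illustrative numbers only)
    [3.1, 1.0, 2.4, 1.9],   # onion
    [5.0, 2.1, 1.5, 1.2],   # maize
])
is_benefit = np.array([False, False, True, True])

lo, hi = indicators.min(axis=0), indicators.max(axis=0)
norm = (indicators - lo) / (hi - lo)                # min-max scale to [0, 1]
norm[:, ~is_benefit] = 1.0 - norm[:, ~is_benefit]   # invert consumption indicators

wfeni = norm.mean(axis=1)                           # equal-weight composite index
for crop, score in zip(["rice", "onion", "maize"], wfeni):
    print(f"{crop}: index = {score:.2f}")
```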

  18. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  19. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  20. EZ and GOSSIP, two new VO compliant tools for spectral analysis

    Science.gov (United States)

    Franzetti, P.; Garilli, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.

    2008-10-01

    We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user-friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and the Integral Science Data Center (Geneva). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.

  1. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT was written to meet the need for an interactive graphical tool to explore the longitudinal phase space. It is designed for quickly testing new ideas or techniques, and it is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for data analysis. The heart of the package is a set of C routines to do the number crunching. The graphics layer is built with the Tcl/Tk scripting language and the BLT toolkit. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT handles a single RF system. For a double RF system, one can use Dr. BBAT, which stands for Double RF Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation.

  2. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

  3. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  4. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  5. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  6. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  7. MEL-IRIS: An Online Tool for Audio Analysis and Music Indexing

    Directory of Open Access Journals (Sweden)

    Dimitrios Margounakis

    2009-01-01

    Full Text Available Chroma is an important attribute of music and sound, although it has not yet been adequately defined in the literature. As such, it can be used for further analysis of sound, resulting in interesting colorful representations that can be used in many tasks: indexing, classification, and retrieval. Especially in Music Information Retrieval (MIR), the visualization of the chromatic analysis can be used for comparison, pattern recognition, melodic sequence prediction, and color-based searching. MEL-IRIS is the tool which has been developed in order to analyze audio files and characterize music based on chroma. The tool implements specially designed algorithms and a unique way of visualizing the results. The tool is network-oriented and can be installed on audio servers, in order to handle large music collections. Several samples from world music have been tested and processed, in order to demonstrate the possible uses of such an analysis.
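
    MEL-IRIS defines its own chromatic analysis, but the flavor of chroma-based characterization can be approximated with a standard pitch-class chromagram; a sketch assuming the librosa library, run on a synthesized tone so it is self-contained:

```python
import numpy as np
import librosa

# Synthesize one second of an A4 tone (440 Hz) as stand-in audio.
sr = 22050
t = np.linspace(0, 1.0, sr, endpoint=False)
y = 0.5 * np.sin(2 * np.pi * 440.0 * t)

# 12-bin chromagram: energy per pitch class over time.
chroma = librosa.feature.chroma_stft(y=y, sr=sr)

# A crude per-frame "chromatic color": the dominant pitch class, which
# could drive a colorful visualization in the spirit of MEL-IRIS.
names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
dominant = chroma.argmax(axis=0)
values, counts = np.unique(dominant, return_counts=True)
print("most frequent pitch class:", names[values[counts.argmax()]])  # expect A
```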

  8. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  9. Calibration apparatus for a machine-tool

    International Nuclear Information System (INIS)

    Crespin, G.

    1985-01-01

    The invention proposes a calibration apparatus for a machine-tool comprising a torque measuring device, in which the tool is driven by a motor whose supply current is proportional to the torque applied to the tool and can be controlled and measured, and a housing having an aperture through which the rotatable tool can pass. This device allows a torque to be applied to the tool and measured from the supply current of the motor. The invention applies more particularly to the screwing machines used for mounting the core containment plates [fr

  10. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab CHATOU, Simulation and Information Technologies for Power Generation Systems Department, EDF R and D, Cedex (France)

    2015-03-15

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown system software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted to the new analysis tools.
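
    The Abstract Interpretation approach behind such tools can be illustrated at toy scale with the interval domain, which soundly over-approximates the set of values a variable may take; industrial analyzers use far richer domains, so this sketch is purely illustrative:

```python
# Minimal sketch of the interval abstract domain underlying
# abstract-interpretation analyzers (illustrative only).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Interval addition: sound for every concrete pair of values.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other):
        # Least upper bound: merge information from two program paths.
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Abstractly execute:  x = input in [0, 10];  y = x + 1;  z = y - 1
x = Interval(0, 10)
y = x + Interval(1, 1)
z = y + Interval(-1, -1)
print("y =", y)  # [1, 11], a sound over-approximation

# A division by z would be flagged as potentially unsafe iff 0 may be in z.
print("possible divide-by-zero:", z.lo <= 0 <= z.hi)
```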

  11. CHESS (CgHExpreSS): a comprehensive analysis tool for the analysis of genomic alterations and their effects on the expression profile of the genome.

    Science.gov (United States)

    Lee, Mikyung; Kim, Yangseok

    2009-12-16

    Genomic alterations frequently occur in many cancer patients and play important mechanistic roles in the pathogenesis of cancer. Furthermore, they can modify the expression level of genes due to altered copy number in the corresponding region of the chromosome. An accumulating body of evidence supports the possibility that strong genome-wide correlation exists between DNA content and gene expression. Therefore, more comprehensive analysis is needed to quantify the relationship between genomic alteration and gene expression. A well-designed bioinformatics tool is essential to perform this kind of integrative analysis. A few programs have already been introduced for integrative analysis. However, there are many limitations in their performance of comprehensive integrated analysis using published software because of limitations in implemented algorithms and visualization modules. To address this issue, we have implemented the Java-based program CHESS to allow integrative analysis of two experimental data sets: genomic alteration and genome-wide expression profile. CHESS is composed of a genomic alteration analysis module and an integrative analysis module. The genomic alteration analysis module detects genomic alteration by applying a threshold based method or SW-ARRAY algorithm and investigates whether the detected alteration is phenotype specific or not. On the other hand, the integrative analysis module measures the genomic alteration's influence on gene expression. It is divided into two separate parts. The first part calculates overall correlation between comparative genomic hybridization ratio and gene expression level by applying following three statistical methods: simple linear regression, Spearman rank correlation and Pearson's correlation. In the second part, CHESS detects the genes that are differentially expressed according to the genomic alteration pattern with three alternative statistical approaches: Student's t-test, Fisher's exact test and Chi square
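
    The overall-correlation step of such an integrative analysis maps directly onto standard statistics routines; a sketch on hypothetical copy-number and expression vectors, using the three correlation methods and one of the differential-expression tests named above:

```python
import numpy as np
from scipy import stats

# Hypothetical matched measurements for one genomic region across samples.
cgh_ratio = np.array([0.8, 1.1, 1.6, 2.0, 0.9, 1.4, 2.2, 1.0])    # copy-number ratio
expression = np.array([5.1, 5.9, 7.2, 8.1, 5.4, 6.8, 8.6, 5.7])   # log2 expression

# The three overall-correlation methods named in the abstract:
slope, intercept, r, p_lin, stderr = stats.linregress(cgh_ratio, expression)
rho, p_spear = stats.spearmanr(cgh_ratio, expression)
pear, p_pear = stats.pearsonr(cgh_ratio, expression)

print(f"linear regression: slope={slope:.2f}, r={r:.2f}, p={p_lin:.3g}")
print(f"Spearman rho={rho:.2f} (p={p_spear:.3g}); Pearson r={pear:.2f} (p={p_pear:.3g})")

# Second part: test for differential expression between altered and
# normal samples (here with a t-test, one of the listed options).
altered = cgh_ratio > 1.3
t, p_t = stats.ttest_ind(expression[altered], expression[~altered])
print(f"t-test altered vs normal: t={t:.2f}, p={p_t:.3g}")
```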

  12. How to apply clinical cases and medical literature in the framework of a modified "failure mode and effects analysis" as a clinical reasoning tool--an illustration using the human biliary system.

    Science.gov (United States)

    Wong, Kam Cheong

    2016-04-06

    Clinicians use various clinical reasoning tools such as Ishikawa diagram to enhance their clinical experience and reasoning skills. Failure mode and effects analysis, which is an engineering methodology in origin, can be modified and applied to provide inputs into an Ishikawa diagram. The human biliary system is used to illustrate a modified failure mode and effects analysis. The anatomical and physiological processes of the biliary system are reviewed. Failure is defined as an abnormality caused by infective, inflammatory, obstructive, malignancy, autoimmune and other pathological processes. The potential failures, their effect(s), main clinical features, and investigation that can help a clinician to diagnose at each anatomical part and physiological process are reviewed and documented in a modified failure mode and effects analysis table. Relevant medical and surgical cases are retrieved from the medical literature and weaved into the table. A total of 80 clinical cases which are relevant to the modified failure mode and effects analysis for the human biliary system have been reviewed and weaved into a designated table. The table is the backbone and framework for further expansion. Reviewing and updating the table is an iterative and continual process. The relevant clinical features in the modified failure mode and effects analysis are then extracted and included in the relevant Ishikawa diagram. This article illustrates an application of engineering methodology in medicine, and it sows the seeds of potential cross-pollination between engineering and medicine. Establishing a modified failure mode and effects analysis can be a teamwork project or self-directed learning process, or a mix of both. Modified failure mode and effects analysis can be deployed to obtain inputs for an Ishikawa diagram which in turn can be used to enhance clinical experiences and clinical reasoning skills for clinicians, medical educators, and students.

  13. Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).

    Science.gov (United States)

    Kane, Kay

    2014-03-01

    The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.

  14. Developing new chemical tools for solvent extraction

    International Nuclear Information System (INIS)

    Moyer, B.A.; Baes, C.F.; Burns, J.H.; Case, G.N.; Sachleben, R.A.; Bryan, S.A.; Lumetta, G.J.; McDowell, W.J.; Sachleben, R.A.

    1993-01-01

    Prospects for innovation and for greater technological impact in the field of solvent extraction (SX) seem as bright as ever, despite the maturation of SX as an economically significant separation method and as an important technique in the laboratory. New industrial, environmental, and analytical problems provide compelling motivation for diversifying the application of SX, developing new solvent systems, and seeking improved properties. Toward this end, basic research must be dedicated to enhancing the tools of SX: physical tools for probing the basis of extraction and molecular tools for developing new SX chemistries. In this paper, the authors describe their progress in developing and applying the general tools of equilibrium analysis and of ion recognition in SX. Nearly half a century after the field of SX began in earnest, coordination chemistry continues to provide the impetus for important advancements in understanding SX systems and in controlling SX chemistry. In particular, the physical tools of equilibrium analysis, X-ray crystallography, and spectroscopy are elucidating the molecular basis of SX in unprecedented detail. Moreover, the principles of ion recognition are providing the molecular tools with which to achieve new selectivities and new applications

  15. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  16. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  17. Cost-benefit analysis model: A tool for area-wide fruit fly management. Procedures manual

    International Nuclear Information System (INIS)

    Enkerlin, W.; Mumford, J.; Leach, A.

    2007-03-01

    The Generic Fruit Fly Cost-Benefit Analysis Model assists in economic decision making associated with area-wide fruit fly control options. The FRUIT FLY COST-BENEFIT ANALYSIS PROGRAM (available on 1 CD-ROM from the Joint FAO/IAEA Programme of Nuclear Techniques in Food and Agriculture) is an Excel 2000 Windows-based program, for which all standard Windows and Excel conventions apply. The Model is user-friendly and thus largely self-explanatory. Nevertheless, it includes a procedures manual that has been prepared to guide the user, and thus should be used together with the software. Please note that the table presenting the pest management options in the Introductory Page of the model is controlled by spin buttons and click boxes. These controls are linked to macros that hide non-relevant tables and boxes. N.B. It is important that the medium level of security is selected from the Tools menu of Excel; to do this, go to Tools|Macros|Security| and select Medium. When the file is opened a form will appear containing three buttons; click on the middle button, 'Enable Macros', so that the macros may be used. Ideally the model should be used as a support tool by working groups aiming to assess the economic returns of different fruit fly control options (suppression, eradication, containment and prevention). The working group should include professionals in agriculture with experience in area-wide implementation of integrated pest management programmes, an economist or at least someone with basic knowledge of economics, and, if relevant, an entomologist with some background in the application of the sterile insect technique (SIT)

  18. A practical guide to propensity score analysis for applied clinical research.

    Science.gov (United States)

    Lee, Jaehoon; Little, Todd D

    2017-11-01

    Observational studies are often the only viable options in many clinical settings, especially when it is unethical or infeasible to randomly assign participants to different treatment regimes. In such cases, propensity score (PS) analysis can be applied to account for possible selection bias and thereby address questions of causal inference. Many PS methods exist, yet few guidelines are available to aid applied researchers in their conduct and evaluation of a PS analysis. In this article we give an overview of available techniques for PS estimation and application, balance diagnostics, treatment effect estimation, and sensitivity assessment, as well as recent advances. We also offer a tutorial that can be used to emulate the steps of PS analysis. Our goal is to provide information that will bring PS analysis within the reach of applied clinical researchers and practitioners. Copyright © 2017 Elsevier Ltd. All rights reserved.
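
    The basic PS workflow, estimation, matching, and effect estimation, can be sketched on simulated data; the greedy 1:1 matching shown is only one of the many PS methods the article surveys:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical observational data: two covariates, non-random treatment.
n = 500
x = rng.normal(size=(n, 2))
treated = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1]))))
outcome = 1.0 * treated + x[:, 0] + rng.normal(size=n)   # true effect = 1.0

# 1. Estimate propensity scores with logistic regression.
ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

# 2. Greedy 1:1 nearest-neighbor matching on the propensity score.
controls = np.flatnonzero(treated == 0)
used = set()
pairs = []
for t in np.flatnonzero(treated == 1):
    free = [c for c in controls if c not in used]
    if not free:
        break
    c = min(free, key=lambda j: abs(ps[j] - ps[t]))
    used.add(c)
    pairs.append((t, c))

# 3. Treatment effect estimate = mean matched-pair outcome difference.
att = np.mean([outcome[t] - outcome[c] for t, c in pairs])
print(f"matched pairs: {len(pairs)}, estimated effect: {att:.2f}")
```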

  19. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the use of total imported fuel into communities to secure all energy services by at least 50% in Alaska's remote microgrids without increasing system life cycle costs while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is from the respective tool websites, tool developers, and author experience.

  20. Tool Wear Analysis due to Machining In Super Austenitic Stainless Steel

    Directory of Open Access Journals (Sweden)

    Polishetty Ashwin

    2017-01-01

    Full Text Available This paper presents a tool wear study in which a machinability test was applied using milling on the super austenitic stainless steel alloy AL6XN. Eight milling trials were performed under two cutting speeds, 100 m/min and 150 m/min, combined with two feed rates, 0.1 mm/tooth and 0.15 mm/tooth, and two depths of cut, 2 mm and 3 mm. An Alicona 3D optical surface profilometer was used to scan the flank and rake face areas of the cutting inserts for wear. Readings such as maximum and minimum deviations were extracted and used to analyse the outcomes. Results showed that various types of wear were generated on the tool rake and flank faces. The most common form of wear was crater wear. The formation of a built-up edge was observed on the rake face of the cutting tool.

  1. A compilation of Web-based research tools for miRNA analysis.

    Science.gov (United States)

    Shukla, Vaibhav; Varghese, Vinay Koshy; Kabekkodu, Shama Prasada; Mallya, Sandeep; Satyamoorthy, Kapaettu

    2017-09-01

    Since the discovery of microRNAs (miRNAs), a class of noncoding RNAs that regulate gene expression posttranscriptionally in a sequence-specific manner, a number of tools useful for both basic and advanced applications have been released. This is because of the significance of miRNAs in many pathophysiological conditions, including cancer. The numerous bioinformatics tools that have been developed for miRNA analysis have utility for detection, expression, function, target prediction and many other related features. This review provides a comprehensive assessment of web-based tools for miRNA analysis that do not require prior knowledge of any computing language. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  2. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.
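
    The energy-output estimate at the heart of such a site analysis tool typically combines a wind speed distribution with a turbine power curve; a sketch assuming a Weibull resource and an invented generic power curve (not DSAT's actual models):

```python
import numpy as np

# Hypothetical Weibull wind resource at hub height and a generic
# 10 kW turbine power curve; all numbers are illustrative assumptions.
k, c = 2.0, 6.5                          # Weibull shape and scale (m/s)
speeds = np.arange(0, 31)                # wind speed bins (m/s)
power = np.interp(speeds, [3, 6, 9, 12, 25], [0.0, 2.5, 7.5, 10.0, 10.0])  # kW
power[speeds > 25] = 0.0                 # cut-out above 25 m/s

# Probability of each 1 m/s bin from the Weibull CDF.
cdf = 1.0 - np.exp(-(((speeds + 0.5) / c) ** k))
prob = np.diff(np.concatenate([[0.0], cdf]))

aep = (power * prob).sum() * 8760        # annual energy production (kWh/yr)
print(f"estimated AEP: {aep:,.0f} kWh/yr")
```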

  3. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    ... overview of the process and the structure of the Logical Framework Matrix, or Logframe, derivable from it ... a systems approach to managing the project ...

  4. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    OpenAIRE

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid mod...

  5. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    Energy Technology Data Exchange (ETDEWEB)

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.

  6. Virtual tool mark generation for efficient striation analysis in forensic science

    Energy Technology Data Exchange (ETDEWEB)

    Ekstrand, Laura [Iowa State Univ., Ames, IA (United States)

    2012-01-01

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles
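
    The virtual-marking step, rotating the scanned tip and projecting it along the direction of travel, can be sketched in plain numpy on a stand-in point cloud; the thesis itself used a 3D graphics package and real microscope scans:

```python
import numpy as np

def virtual_mark(tip_points, angle_deg, n_bins=200):
    """Rotate a tip point cloud about the mark-width (x) axis, then keep
    the deepest z in each x bin: a simplified 'virtual tool mark' profile
    seen along the direction of tool travel (y)."""
    a = np.radians(angle_deg)
    rot = np.array([[1, 0, 0],
                    [0, np.cos(a), -np.sin(a)],
                    [0, np.sin(a),  np.cos(a)]])
    p = tip_points @ rot.T
    bins = np.linspace(p[:, 0].min(), p[:, 0].max(), n_bins + 1)
    idx = np.clip(np.digitize(p[:, 0], bins) - 1, 0, n_bins - 1)
    profile = np.full(n_bins, np.nan)
    for i in range(n_bins):
        sel = idx == i
        if sel.any():
            profile[i] = p[sel, 2].min()   # deepest point = the cutting edge
    return profile

# Hypothetical scanned tip (a random cloud stands in for microscope data).
tip = np.random.default_rng(1).normal(size=(5000, 3))
marks = {angle: virtual_mark(tip, angle) for angle in range(30, 91, 5)}  # 5° steps
print(f"generated {len(marks)} virtual marks for comparison")
```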

  7. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    Science.gov (United States)

    2013-03-01

  8. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirements. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis.

  9. A new design of automatic vertical drilling tool

    Directory of Open Access Journals (Sweden)

    Yanfeng Ma

    2015-09-01

    Full Text Available In order to effectively improve penetration rates and enhance wellbore quality for vertical wells, a new Automatic Vertical Drilling Tool (AVDT) based on an Eccentric Braced Structure (EBS) is designed. Applying the operating principle of rotary steerable drilling, the AVDT adds an automatic inclination-sensing mechanism based on an offset gravity block. When the hole deviates from vertical, the tool uses the eccentric moment produced by the gravity of the offset gravity block to control the bearing of the guide force, so that well straightening is achieved. The nominal size of the AVDT is designed as 215.9 mm; the sizes of the other major components, including the offset angle of the EBS, are worked out from the results of theoretical analysis. This paper aims to introduce the structure, operating principle, and theoretical analysis of the AVDT, and to describe the parameter settings of its key components.

  10. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning

    2014-01-01

    This paper introduces an interactive tool developed to integrate structural analysis into the architectural design environment from the early conceptual design stage. The tool improves the exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot Structural Analysis. Further, the tool provides intuitive setup and visual aids in order to facilitate the process, enabling students and professionals to quickly analyze and evaluate multiple design variations. The tool has been developed inside the Performance Aided Design course at the Master of Architecture and Design at Aalborg University.

  11. Applied spectrophotometry: analysis of a biochemical mixture.

    Science.gov (United States)

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) as opposed to determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software. Copyright © 2013 Wiley Periodicals, Inc.
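
    For a two-component mixture measured at two wavelengths, the Beer-Lambert-Bouguer Law yields a small linear system in the unknown concentrations; a sketch with hypothetical molar absorptivities:

```python
import numpy as np

# Beer-Lambert-Bouguer law: A(lambda) = sum_i eps_i(lambda) * l * c_i.
# For a two-component mixture measured at two wavelengths this is a
# 2x2 linear system. The absorptivities below are invented for illustration.
eps = np.array([[15000.0,  4000.0],   # wavelength 1: component 1, component 2
                [ 3000.0, 11000.0]])  # wavelength 2 (M^-1 cm^-1)
l = 1.0                               # path length (cm)
A = np.array([0.52, 0.34])            # measured absorbances

c = np.linalg.solve(eps * l, A)       # concentrations (M)
print(f"c1 = {c[0] * 1e6:.1f} uM, c2 = {c[1] * 1e6:.1f} uM")
```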

  12. Computer assisted audit tools and techniques in real world: CAATT's applications and approaches in context

    OpenAIRE

    Pedrosa, I.; Costa, C. J.

    2012-01-01

    Nowadays, Computer Assisted Audit Tools and Techniques (CAATTs) support almost all audit processes concerning data extraction and analysis. These tools were initially aimed at supporting financial auditing processes. However, their scope goes beyond this; therefore, we present case studies and good practices in an academic context. Although audit tools for data extraction and analysis are very common in large auditing companies and applied in several contexts, we realized that it is not easy to find practical...

  13. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    Science.gov (United States)

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  14. Image edge detection based tool condition monitoring with morphological component analysis.

    Science.gov (United States)

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in the monitored image. Image edge detection has been a fundamental technique for obtaining image features. The approach extracts the tool edge with morphological component analysis: through the decomposition of the original tool wear image, it reduces the influence of texture and noise on edge measurement. Based on sparse representation of the target image and edge detection, the approach can accurately extract the tool wear edge with a continuous and complete contour, and is convenient for characterizing tool conditions. Compared to the celebrated algorithms developed in the literature, this approach improves the integrity and connectivity of edges, and the results have shown that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
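
    A greatly simplified stand-in for this idea: separate a smooth "cartoon" component from texture and noise before extracting edges. Total-variation denoising is used here in place of the paper's sparse-dictionary morphological component analysis, and a bundled test photo stands in for a tool-wear image:

```python
from skimage import data, feature, restoration

# Stand-in image (a bundled test photo) for a monitored tool-wear image.
img = data.camera() / 255.0

# Simplified stand-in for morphological component analysis: total-variation
# denoising keeps the piecewise-smooth "cartoon" part and suppresses the
# texture/noise part before edge extraction. (The paper's actual MCA uses
# a sparse dictionary decomposition, not TV denoising.)
cartoon = restoration.denoise_tv_chambolle(img, weight=0.1)
texture = img - cartoon  # the discarded oscillatory component

# Extract the edge contour from the cartoon component only.
edges = feature.canny(cartoon, sigma=2.0)
print(f"edge pixels: {int(edges.sum())} of {edges.size}")
```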

  15. Analysis of mechanism of carbide tool wear and control by wear process

    Directory of Open Access Journals (Sweden)

    Pham Hoang Trung

    2017-01-01

    Full Text Available An analysis of the physico-mechanical and thermophysical properties of hard alloys, depending on their chemical composition, is conducted. The correlation of cutting properties and of the regularities of carbide tool wear with the cutting conditions and the thermophysical properties of the tool material is disclosed. Research by Russian scientists has established that tool wear is significantly influenced not only by the mechanical but, in the first place, by the thermophysical properties of the tool and structural materials, because in the range of industrially used cutting speeds the cause of tool wear is diffusion processes. Directions for decreasing the intensity of tool wear are defined: determining rational processing conditions, choosing tool materials, and applying wear-resistant coatings to the tool surface.
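
    The classical summary of how tool life correlates with cutting speed is Taylor's equation V·T^n = C; a sketch fitting its constants to invented wear-test data (illustrative of the correlations discussed, not taken from the paper):

```python
import numpy as np

# Taylor's tool life equation: V * T**n = C, i.e. a straight line in
# log-log coordinates: log V = log C - n * log T.
# Hypothetical wear-test data for a carbide tool:
V = np.array([150.0, 200.0, 250.0, 300.0])  # cutting speed (m/min)
T = np.array([60.0, 22.0, 10.0, 5.5])       # measured tool life (min)

slope, logC = np.polyfit(np.log(T), np.log(V), 1)
n = -slope                 # the slope of log V vs. log T is -n
C = np.exp(logC)
print(f"fitted Taylor exponent n = {n:.2f}, constant C = {C:.0f}")

# Predicted life at a new cutting speed:
V_new = 220.0
T_pred = (C / V_new) ** (1.0 / n)
print(f"predicted tool life at {V_new} m/min: {T_pred:.1f} min")
```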

  16. The prevention of mother-to-child transmission of HIV cascade analysis tool: supporting health managers to improve facility-level service delivery.

    Science.gov (United States)

    Gimbel, Sarah; Voss, Joachim; Mercer, Mary Anne; Zierler, Brenda; Gloyd, Stephen; Coutinho, Maria de Joana; Floriano, Florencia; Cuembelo, Maria de Fatima; Einberg, Jennifer; Sherr, Kenneth

    2014-10-21

    The objective of the prevention of Mother-to-Child Transmission (pMTCT) cascade analysis tool is to provide frontline health managers at the facility level with the means to rapidly, independently and quantitatively track patient flows through the pMTCT cascade, and readily identify priority areas for clinic-level improvement interventions. Over a period of six months, five experienced maternal-child health managers and researchers iteratively adapted and tested this systems analysis tool for pMTCT services. They prioritized components of the pMTCT cascade for inclusion, disseminated multiple versions to 27 health managers and piloted it in five facilities. Process mapping techniques were used to chart PMTCT cascade steps in these five facilities, to document antenatal care attendance, HIV testing and counseling, provision of prophylactic anti-retrovirals, safe delivery, safe infant feeding, infant follow-up including HIV testing, and family planning, in order to obtain site-specific knowledge of service delivery. Seven pMTCT cascade steps were included in the Excel-based final tool. Prevalence calculations were incorporated as sub-headings under relevant steps. Cells not requiring data inputs were locked, wording was simplified and stepwise drop-offs and maximization functions were included at key steps along the cascade. While the drop off function allows health workers to rapidly assess how many patients were lost at each step, the maximization function details the additional people served if only one step improves to 100% capacity while others stay constant. Our experience suggests that adaptation of a cascade analysis tool for facility-level pMTCT services is feasible and appropriate as a starting point for discussions of where to implement improvement strategies. The resulting tool facilitates the engagement of frontline health workers and managers who fill out, interpret, apply the tool, and then follow up with quality improvement activities. Research on
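
    The drop-off and maximization functions described above reduce to simple arithmetic on the step counts; a minimal sketch on hypothetical cascade data:

```python
# Hypothetical monthly counts at successive pMTCT cascade steps.
steps = [
    ("ANC attendance", 1000),
    ("HIV test and counseling", 900),
    ("ARV prophylaxis", 600),
    ("facility delivery", 480),
    ("safe infant feeding", 430),
    ("infant HIV testing", 300),
    ("family planning", 250),
]

# Drop-off: patients lost between each consecutive pair of steps.
for (name_a, n_a), (name_b, n_b) in zip(steps, steps[1:]):
    print(f"{name_a} -> {name_b}: lost {n_a - n_b} ({100 * (n_a - n_b) / n_a:.0f}%)")

# Maximization: cascade completers if a single step reached 100%
# retention while every other step-to-step rate stayed constant.
rates = [b / a for (_, a), (_, b) in zip(steps, steps[1:])]
entry = steps[0][1]
for i, (name, _) in enumerate(steps[1:]):
    trial = rates.copy()
    trial[i] = 1.0
    out = entry
    for r in trial:
        out *= r
    print(f"perfecting '{name}' would yield {out:.0f} cascade completers")
```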

  17. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    Science.gov (United States)

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  18. Applied ecosystem analysis - a primer; the ecosystem diagnosis and treatment method

    International Nuclear Information System (INIS)

    Lestelle, L.C.; Mobrand, L.E.; Lichatowich, J.A.; Vogel, T.S.

    1996-05-01

    The aim of this document is to inform and instruct the reader about an approach to ecosystem management that is based upon salmon as an indicator species. It is intended to provide natural resource management professionals with the background information needed to answer questions about why and how to apply the approach. The methods and tools the authors describe are continually updated and refined, so this primer should be treated as a first iteration of a sequentially revised manual

  19. Simulation Tools for Forest Health Analysis: An Application in the Red River Watershed, Idaho

    Science.gov (United States)

    Andrew J. McMahan; Eric L. Smith

    2006-01-01

    Software tools for landscape analyses--including FVS model extensions, and a number of FVS-related pre- and post-processing “tools”--are presented, using an analysis in the Red River Watershed, Nez Perce National Forest as an example. We present (1) a discussion of pre-simulation data analysis; (2) the Physiographic Information Extraction System (PIES), a tool that can...

  20. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This demands a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  1. Toxic release consequence analysis tool (TORCAT) for inherently safer design plant

    International Nuclear Information System (INIS)

    Shariff, Azmi Mohd; Zaini, Dzulkarnain

    2010-01-01

    Many major accidents due to toxic releases in the past have caused numerous fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is to use the inherently safer design technique, which utilizes the inherent safety principle to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequence of a toxic release can be evaluated and necessary design improvements can be implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, currently there is no commercial tool available that has such capability. This paper reports on the preliminary findings on the development of a prototype tool for consequence analysis and design improvement via the inherent safety principle, utilizing an integrated process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary finding shows that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage.

  2. A population MRI brain template and analysis tools for the macaque.

    Science.gov (United States)

    Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam

    2018-04-15

    The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, where PET is an estimate. The FAT-PET MRA is a meta regression analysis...

  4. Status of CONRAD, a nuclear reaction analysis tool

    International Nuclear Information System (INIS)

    Saint Jean, C. de; Habert, B.; Litaize, O.; Noguere, G.; Suteau, C.

    2008-01-01

    The development of a software tool (CONRAD) was initiated at CEA/Cadarache to address various problems arising in the data analysis of nuclear reactions. The tool is characterized by the handling of uncertainties from experimental values to covariance matrices for multi-group cross sections. An object-oriented design was chosen, allowing an easy interface with graphical tools for input/output data and providing a natural framework for innovative nuclear models (fission). The major achieved developments are a data model for describing channels, nuclear reactions, nuclear models and processes with interfaces to classical data formats; theoretical calculations for the resolved resonance range (Reich-Moore) and unresolved resonance range (Hauser-Feshbach, Gilbert-Cameron, ...) with nuclear model parameters adjusted on experimental data sets; and a Monte Carlo method based on conditional probabilities, developed to properly calculate covariance matrices. The on-going developments deal with the experimental data description (covariance matrices) and the graphical user interface. (authors)

  5. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    nutritional status and metabolic phenotype. We want to understand how metabolomic spectra can be analysed using functional data analysis to detect the influence of different factors on specific metabolites. These factors can include, for example, gender, diet culture or dietary intervention. In Paper I we apply...... representation of each spectrum. Subset selection of wavelet coefficients generates the input to mixed models. Mixed-model methodology enables us to take the study design into account while modelling covariates. Bootstrap-based inference preserves the correlation structure between curves and enables the estimation...

  6. Applied mechanics of solids

    CERN Document Server

    Bower, Allan F

    2009-01-01

    Modern computer simulations make stress analysis easy. As they continue to replace classical mathematical methods of analysis, these software programs require users to have a solid understanding of the fundamental principles on which they are based, and to develop the intuitive ability to identify and avoid physically meaningless predictions. Applied Mechanics of Solids is a powerful tool for understanding how to take advantage of these revolutionary computer advances in the field of solid mechanics. Beginning with a description of the physical and mathematical laws that govern deformation in solids, the text presents modern constitutive equations, as well as analytical and computational methods of stress analysis and fracture mechanics. It also addresses the nonlinear theory of deformable rods, membranes, plates, and shells, and solutions to important boundary and initial value problems in solid mechanics. The author uses the step-by-step manner of a blackboard lecture to explain problem solving methods, often providing...

  7. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    Science.gov (United States)

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 (www.comparativego.com). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.

  8. Clinical usefulness of the clock drawing test applying rasch analysis in predicting of cognitive impairment.

    Science.gov (United States)

    Yoo, Doo Han; Lee, Jae Shin

    2016-07-01

    [Purpose] This study examined the clinical usefulness of the clock drawing test, applying Rasch analysis, for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated with the clock drawing test developed through Rasch analysis, along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate the sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total CDT score of 10.5, which was selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%, respectively. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders.
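
    The cutoff selection reported above is a standard receiver operating characteristic exercise: sweep candidate cutoffs, compute sensitivity and specificity at each, and pick the cutoff maximizing the Youden index. A minimal sketch on hypothetical scores and labels (not the study's data; as in the study, a low CDT score indicates impairment):

        import numpy as np

        # Hypothetical CDT scores and impairment labels (1 = impaired).
        scores   = np.array([4, 6, 8, 9, 10, 11, 12, 13, 14, 15], dtype=float)
        impaired = np.array([1, 1, 1, 1, 1, 0, 1, 0, 0, 0])

        best = None
        for cut in np.unique(scores) + 0.5:          # midpoints as candidate cutoffs
            pred = (scores <= cut).astype(int)       # a low score predicts impairment
            tp = np.sum((pred == 1) & (impaired == 1))
            tn = np.sum((pred == 0) & (impaired == 0))
            fp = np.sum((pred == 1) & (impaired == 0))
            fn = np.sum((pred == 0) & (impaired == 1))
            sens, spec = tp / (tp + fn), tn / (tn + fp)
            youden = sens + spec - 1                 # Youden index J
            if best is None or youden > best[0]:
                best = (youden, cut, sens, spec)

        j, cut, sens, spec = best
        print(f"best cutoff {cut:.1f}: J = {j:.2f}, sensitivity = {sens:.2f}, specificity = {spec:.2f}")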

  9. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    Science.gov (United States)

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions of concern for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.

  10. Decoding the genome with an integrative analysis tool: combinatorial CRM Decoder.

    Science.gov (United States)

    Kang, Keunsoo; Kim, Joomyeong; Chung, Jae Hoon; Lee, Daeyoup

    2011-09-01

    The identification of genome-wide cis-regulatory modules (CRMs) and characterization of their associated epigenetic features are fundamental steps toward the understanding of gene regulatory networks. Although integrative analysis of available genome-wide information can provide new biological insights, the lack of novel methodologies has become a major bottleneck. Here, we present a comprehensive analysis tool called combinatorial CRM decoder (CCD), which utilizes the publicly available information to identify and characterize genome-wide CRMs in a species of interest. CCD first defines a set of the epigenetic features which is significantly associated with a set of known CRMs as a code called 'trace code', and subsequently uses the trace code to pinpoint putative CRMs throughout the genome. Using 61 genome-wide data sets obtained from 17 independent mouse studies, CCD successfully catalogued ∼12 600 CRMs (five distinct classes) including polycomb repressive complex 2 target sites as well as imprinting control regions. Interestingly, we discovered that ∼4% of the identified CRMs belong to at least two different classes named 'multi-functional CRM', suggesting their functional importance for regulating spatiotemporal gene expression. From these examples, we show that CCD can be applied to any potential genome-wide datasets and therefore will shed light on unveiling genome-wide CRMs in various species.

  11. SIMONE: Tool for Data Analysis and Simulation

    International Nuclear Information System (INIS)

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework and developed in a collaboration between FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible system that is user friendly, efficient and well documented. It is intended for simulation of a wide range of Nuclear Physics experiments. The most significant conditions and physical processes can be taken into account during simulation of the experiment. The user can create his own experimental setup through access to predefined detector geometries. Simulated data is made available in the same format as for the real experiment, allowing identical analysis of both experimental and simulated data. A significant time reduction is expected during experiment planning and data analysis. (authors)

  12. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
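
    The essence of the methodology, replacing a complex CGE model with a single estimated equation, can be sketched in a few lines; the variable names, functional form, and data below are invented stand-ins for illustration and do not reproduce E-CAT's actual specification:

        import numpy as np

        rng = np.random.default_rng(0)

        # Stand-ins for CGE simulation runs: threat characteristics and
        # background conditions (inputs) versus simulated economic loss (output).
        n = 500
        duration   = rng.uniform(1, 30, n)     # days of disruption
        severity   = rng.uniform(0, 1, n)      # fraction of capacity lost
        resilience = rng.uniform(0, 1, n)      # resilience/behavioral factor
        loss = 2.0 * duration * severity * (1 - 0.6 * resilience) + rng.normal(0, 1, n)

        # "Reduced-form" model: one regression standing in for the full CGE model.
        X = np.column_stack([duration * severity,
                             duration * severity * resilience,
                             np.ones(n)])
        coef, *_ = np.linalg.lstsq(X, loss, rcond=None)
        print("reduced-form coefficients:", coef.round(3))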

  13. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
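
    One of the performance bounds such graph analysis yields is the critical path: the longest weighted path through the dataflow graph, which lower-bounds the iteration latency no matter how many processors are used. A small sketch on a hypothetical signal-processing graph:

        from functools import lru_cache

        # Hypothetical dataflow graph: node -> (compute time, successors).
        graph = {
            "read":  (2, ["fft"]),
            "fft":   (5, ["filt"]),
            "filt":  (3, ["ifft", "stats"]),
            "ifft":  (5, ["out"]),
            "stats": (1, ["out"]),
            "out":   (1, []),
        }

        @lru_cache(maxsize=None)
        def longest(node):
            """Critical-path time from `node` to any sink, inclusive."""
            t, successors = graph[node]
            return t + max((longest(s) for s in successors), default=0)

        # Lower bound on one iteration's latency regardless of processor count.
        print("critical path:", max(longest(n) for n in graph), "time units")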

  14. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...... paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient...

  15. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    Eldridge, C.; Gagne, D.; Wilson, B.; Murray, J.; Gazze, C.; Feldman, Y.; Rorif, F.

    2015-01-01

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  16. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high-resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings to correctly segment 67% of the total possible diatom valves and fragments from broad fields of view. (183 light microscope images containing 255 diatom particles were examined. Of the 255 diatom particles present, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software.) Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, thus highlighting that the software has an approximately five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.

  17. Design and Analysis of a Collision Detector for Hybrid Robotic Machine Tools

    Directory of Open Access Journals (Sweden)

    Dan ZHANG

    2015-10-01

    Capacitive sensing depends on a physical parameter changing either the spacing between the two plates or the dielectric constant. Based on this idea, a capacitive collision detection sensor is proposed and designed in this paper for the purpose of detecting any collision between the end effector and peripheral equipment (e.g., fixtures) of the three-degrees-of-freedom hybrid robotic machine tool when it is in operation. One side of the finger-like capacitor is attached to the moving platform of the hybrid robotic manipulator and the other side is attached to the tool. When the tool accidentally hits the peripheral equipment, the vibration changes the capacitor spacing and therefore triggers the machine to stop. The new design is illustrated and modelled. The capacitance, sensitivity and frequency response of the detector are analyzed in detail, and finally, the fabrication process is presented. The proposed collision detector can also be applied to other machine tools.

  18. Applied Behavior Analysis: Current Myths in Public Education

    Science.gov (United States)

    Fielding, Cheryl; Lowdermilk, John; Lanier, Lauren L.; Fannin, Abigail G.; Schkade, Jennifer L.; Rose, Chad A.; Simpson, Cynthia G.

    2013-01-01

    The effective use of behavior management strategies and related policies continues to be a debated issue in public education. Despite overwhelming evidence espousing the benefits of the implementation of procedures derived from principles based on the science of applied behavior analysis (ABA), educators often indicate many common misconceptions…

  19. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  20. RdTools: An Open Source Python Library for PV Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Deceglie, Michael G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nag, Ambarish [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Shinn, Adam [kWh Analytics

    2018-05-04

    RdTools is a set of Python tools for analysis of photovoltaic data. In particular, PV production data is evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high-frequency (hourly or better) and low-frequency (daily, weekly, etc.) datasets. Best results are obtained with higher-frequency data.
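
    The year-on-year idea behind this kind of degradation analysis is easy to illustrate. The sketch below applies the method to synthetic daily data in plain pandas; it shows the general approach, not RdTools' actual API:

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)

        # Synthetic daily normalized PV energy degrading at -0.5%/yr plus noise.
        idx = pd.date_range("2015-01-01", periods=6 * 365, freq="D")
        days = np.arange(len(idx))
        energy = pd.Series(1 - 0.005 * days / 365 + rng.normal(0, 0.02, len(idx)),
                           index=idx)

        # Year-on-year method: compare each point with the point one year later,
        # then take the median of the implied annual rates of change.
        yoy = (energy.shift(-365) - energy) / energy
        rd = 100 * yoy.dropna().median()
        print(f"estimated degradation rate: {rd:.2f} %/yr")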

  1. Microscopy image segmentation tool: Robust image data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  2. Microscopy image segmentation tool: Robust image data analysis

    Science.gov (United States)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  3. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy
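
    As a generic illustration of the threshold-and-label ROI segmentation that such tools automate (this is not MIST's algorithm, and the micrograph below is synthetic):

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)

        # Synthetic "micrograph": noisy background with bright circular ROIs.
        img = rng.normal(0.1, 0.05, (128, 128))
        yy, xx = np.mgrid[:128, :128]
        for cy, cx in [(30, 40), (80, 90), (100, 30)]:
            img[(yy - cy) ** 2 + (xx - cx) ** 2 < 36] += 0.8

        # Threshold, label connected regions, then measure each ROI.
        mask = img > 0.5
        labels, n = ndimage.label(mask)
        areas = ndimage.sum(mask, labels, index=range(1, n + 1))
        centers = ndimage.center_of_mass(mask, labels, range(1, n + 1))
        print(f"{n} ROIs; areas (px): {areas}; centers: {centers}")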

  4. A developmental screening tool for toddlers with multiple domains based on Rasch analysis

    Directory of Open Access Journals (Sweden)

    Ai-Wen Hwang

    2015-01-01

    Conclusion: MuSiC can be applied simultaneously to well-child care visits as a universal screening tool for children aged 1–3 years on multiple domains. Items with sound validity for infants need to be further developed.

  5. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (c2 = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396) Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, venn diagram, seasonal calendar, SWOT analysis, semistructured interview, daily activity schedule, resource analysis, ...

  6. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    Science.gov (United States)

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
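
    The estimators named here are standard and compact enough to state directly. A minimal sketch of a random-effects pooled estimate with the DerSimonian-Laird between-study variance and a Knapp-Hartung adjusted confidence interval, on hypothetical study data:

        import numpy as np
        from scipy import stats

        # Hypothetical per-study effect sizes and within-study variances.
        y = np.array([0.30, 0.10, 0.45, 0.25, 0.05])
        v = np.array([0.010, 0.020, 0.015, 0.040, 0.030])
        k = len(y)

        # DerSimonian-Laird between-study variance tau^2.
        w = 1 / v
        mu_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - mu_fixed) ** 2)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

        # Random-effects pooled estimate.
        w_re = 1 / (v + tau2)
        mu = np.sum(w_re * y) / np.sum(w_re)

        # Knapp-Hartung adjusted standard error with a t-based interval.
        se_kh = np.sqrt(np.sum(w_re * (y - mu) ** 2) / ((k - 1) * np.sum(w_re)))
        t_crit = stats.t.ppf(0.975, k - 1)
        print(f"mu = {mu:.3f}, 95% CI [{mu - t_crit * se_kh:.3f}, {mu + t_crit * se_kh:.3f}]")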

  7. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software that offers the best compromise between "scalability" and "ease-of-use." The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT's inclusive framework supports Message Passing Interface (MPI) parallelism as well as task farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others, in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
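
    The parallel Python paradigm described here can be illustrated with a generic mpi4py pattern; this sketch shows the general MPI-in-Python approach, not UV-CDAT's own interfaces:

        # Run with, e.g.: mpiexec -n 4 python script.py
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Each rank processes its own chunk of a (synthetic) climate field.
        data = np.random.default_rng(rank).normal(15.0, 5.0, 100_000)
        local_mean = data.mean()

        # Gather the per-rank results on rank 0 and combine them.
        means = comm.gather(local_mean, root=0)
        if rank == 0:
            print("global mean over", size, "ranks:", sum(means) / size)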

  8. SACA: Software Assisted Call Analysis--an interactive tool supporting content exploration, online guidance and quality improvement of counseling dialogues.

    Science.gov (United States)

    Trinkaus, Hans L; Gaisser, Andrea E

    2010-09-01

    Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  9. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA mean-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
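
    The tests named above are all available in scipy.stats; a minimal sketch on hypothetical Likert-type survey responses (the LSD post-hoc comparisons are not in scipy and would need, e.g., statsmodels):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Hypothetical 1-5 Likert responses from three groups of staff.
        ops   = rng.integers(1, 6, 40)
        maint = rng.integers(1, 6, 35)
        eng   = rng.integers(2, 6, 30)

        # Kolmogorov-Smirnov: compare one group against a normal distribution.
        print(stats.kstest(ops, "norm", args=(ops.mean(), ops.std())))

        # Student's t-test: mean difference between two groups.
        print(stats.ttest_ind(ops, maint))

        # One-way ANOVA across all groups (post-hoc tests would follow if significant).
        print(stats.f_oneway(ops, maint, eng))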

  10. Addressing the alarm analysis barrier - a tool for improving alarm systems

    Energy Technology Data Exchange (ETDEWEB)

    Davey, E C; Basso, R A; Feher, M P [Atomic Energy of Canada Ltd., Chalk River, ON (Canada)

    1996-12-31

    This paper describes a software application tool for the initial specification and maintenance of the thousands of alarms in nuclear and other process control plants. The software program is used by system designers and maintainers to analyze, characterize, record and maintain the alarm information and configuration decisions for an alarm system. The tool provides a comprehensive design and information handling environment for: the existing alarm functions in current CANDU plants; the new alarm processing and presentation concepts developed under CANDU Owners Group (COG) sponsorship that are available to be applied to existing CANDU plants on a retrofit basis; and, the alarm functions to be implemented in new CANDU plants. (author). 3 refs., 1 fig.

  11. Addressing the alarm analysis barrier - a tool for improving alarm systems

    International Nuclear Information System (INIS)

    Davey, E.C.; Basso, R.A.; Feher, M.P.

    1995-01-01

    This paper describes a software application tool for the initial specification and maintenance of the thousands of alarms in nuclear and other process control plants. The software program is used by system designers and maintainers to analyze, characterize, record and maintain the alarm information and configuration decisions for an alarm system. The tool provides a comprehensive design and information handling environment for: the existing alarm functions in current CANDU plants; the new alarm processing and presentation concepts developed under CANDU Owners Group (COG) sponsorship that are available to be applied to existing CANDU plants on a retrofit basis; and, the alarm functions to be implemented in new CANDU plants. (author). 3 refs., 1 fig

  12. Technology of geographical information systems applied to the licensing of nuclear sector installations

    International Nuclear Information System (INIS)

    Oliveira, Aline F.G. De; Barreto, Alberto A.; Carvalho Filho, Carlos A. de; Rodrigues, Paulo Cezar Horta; Moura, Igor Felipe Silva

    2017-01-01

    The nuclear licensing process involves the preparation of documents such as the Local's Report (LR), Preliminary Safety Analysis Report (PSAR), Final Safety Analysis Report (FSAR), Physical Protection Plans, Radiation Protection Plans and Emergency Plans, which must be submitted to the National Nuclear Energy Commission (DRS/CNEN) for approval. This work presents an analysis and a guide for the use of geoprocessing tools in updating the environmental studies necessary to update the Local's Report (LR) of the Center for the Development of Nuclear Technology (CDTN). The main purpose is to help streamline the steps involved in the nuclear licensing process, such as structuring and executing environmental studies, planning environmental monitoring activities, etc. To achieve this objective, we searched for and obtained highly reliable data available from various agencies, using a methodological flowchart for data acquisition and treatment. The study was developed using the ArcMap 10.2 application from ArcGIS, especially the Model Builder analytic tool. This tool allowed the (macro) schematization of the methodology from the applied GIS tools, which offers advantages in efficiency and in optimizing the execution time of procedures where the same routine of tasks must be applied; it is also editable, which offers possibilities for adaptations and improvements. (author)

  13. Technology of geographical information systems applied to the licensing of nuclear sector installations

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Aline F.G. De; Barreto, Alberto A.; Carvalho Filho, Carlos A. de; Rodrigues, Paulo Cezar Horta [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Moura, Igor Felipe Silva, E-mail: afgo@cdtn.br, E-mail: aab@cdtn.br, E-mail: cacf@cdtn.br, E-mail: igorfelipedx@ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil)

    2017-07-01

    The nuclear licensing process involves the preparation of documents such as the Local's Report (LR), Preliminary Safety Analysis Report (PSAR), Final Safety Analysis Report (FSAR), Physical Protection Plans, Radiation Protection Plans and Emergency Plans, which must be submitted to the National Nuclear Energy Commission (DRS/CNEN) for approval. This work presents an analysis and a guide for the use of geoprocessing tools in updating the environmental studies necessary to update the Local's Report (LR) of the Center for the Development of Nuclear Technology (CDTN). The main purpose is to help streamline the steps involved in the nuclear licensing process, such as structuring and executing environmental studies, planning environmental monitoring activities, etc. To achieve this objective, we searched for and obtained highly reliable data available from various agencies, using a methodological flowchart for data acquisition and treatment. The study was developed using the ArcMap 10.2 application from ArcGIS, especially the Model Builder analytic tool. This tool allowed the (macro) schematization of the methodology from the applied GIS tools, which offers advantages in efficiency and in optimizing the execution time of procedures where the same routine of tasks must be applied; it is also editable, which offers possibilities for adaptations and improvements. (author)

  14. Causal Relation Analysis Tool of the Case Study in the Engineer Ethics Education

    Science.gov (United States)

    Suzuki, Yoshio; Morita, Keisuke; Yasui, Mitsukuni; Tanada, Ichirou; Fujiki, Hiroyuki; Aoyagi, Manabu

    In engineering ethics education, the virtual experiencing of dilemmas is essential. Learning through the case study method is a particularly effective means. Many case studies are, however, difficult to deal with because they often include many complex causal relationships and social factors. It would thus be convenient if there were a tool that could analyze the factors of a case example and organize them into a hierarchical structure to get a better understanding of the whole picture. The tool that was developed applies a cause-and-effect matrix and simple graph theory. It analyzes the causal relationship between facts in a hierarchical structure and organizes complex phenomena. The effectiveness of this tool is shown by presenting an actual example.
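
    The combination of a cause-and-effect matrix with simple graph theory can be illustrated by layering the facts of a case into hierarchy levels by longest causal chain; the facts and edges below are hypothetical, and the matrix is assumed acyclic:

        import numpy as np

        facts = ["cost pressure", "design flaw", "skipped test",
                 "component failure", "accident"]
        # A[i][j] = 1 means fact i causes fact j (hypothetical case example).
        A = np.array([[0, 0, 1, 0, 0],
                      [0, 0, 0, 1, 0],
                      [0, 0, 0, 1, 0],
                      [0, 0, 0, 0, 1],
                      [0, 0, 0, 0, 0]])

        # Level = longest causal chain leading into each fact (root causes = 0).
        n = len(facts)
        level = [0] * n
        for _ in range(n):                  # relax edges n times (enough for a DAG)
            for i in range(n):
                for j in range(n):
                    if A[i][j]:
                        level[j] = max(level[j], level[i] + 1)

        for fact, lvl in sorted(zip(facts, level), key=lambda p: p[1]):
            print(f"level {lvl}: {fact}")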

  15. Statistical tools applied for the reduction of the defect rate of coffee degassing valves

    Directory of Open Access Journals (Sweden)

    Giorgio Olmi

    2015-04-01

    Coffee is a very common beverage exported all over the world: just after roasting, coffee beans are packed in plastic or paper bags, which then experience long transfers with long storage times. Fresh roasted coffee emits large amounts of CO2 for several weeks. This gas must be gradually released to prevent package over-inflation and to preserve aroma; moreover, beans must be protected from oxygen coming from outside. Therefore, one-way degassing valves are applied to each package: their correct functionality is strictly related to the interference coupling between their bodies and covers and to the correct assembly of the other involved parts. This work takes inspiration from an industrial problem: a company that assembles valve components, supplied by different manufacturers, observed a high defect rate affecting its valve production. An integrated approach, consisting of the adoption of quality charts, an experimental campaign for the dimensional analysis of the mating parts, and statistical processing of the data, was necessary to tackle the question. In particular, a simple statistical tool was made available to predict the defect rate and to identify the best strategy for its reduction. The outcome was that requiring a strict protocol regarding the combinations of parts from different manufacturers for assembly would have been almost ineffective. Conversely, this study led to the identification of the weak point in the manufacturing process of the mating components and to the suggestion of a slight improvement to be performed, with the final result of a significant (one order of magnitude) decrease in the defect rate.
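
    Predicting a defect rate from the dimensional distributions of mating parts is a tolerance-stack calculation. A minimal sketch, with all dimensions and limits hypothetical rather than the study's measured values:

        from statistics import NormalDist

        # Hypothetical mating dimensions (mm): valve body bore and cover plug.
        body_mu, body_sigma   = 10.00, 0.015   # bore diameter
        cover_mu, cover_sigma = 10.06, 0.020   # plug diameter
        min_interference = 0.02                # mm of interference needed to seal

        # Interference = cover - body; for independent normal dimensions:
        mu = cover_mu - body_mu
        sigma = (body_sigma ** 2 + cover_sigma ** 2) ** 0.5

        # A part pair is defective when interference falls below the minimum.
        defect_rate = NormalDist(mu, sigma).cdf(min_interference)
        print(f"predicted defect rate: {100 * defect_rate:.2f}%")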

  16. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increases the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high-value parts, for manufacturers to obtain a thermal profile of their machine to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various testing methods, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.

  17. Applying differential dynamic logic to reconfigurable biological networks.

    Science.gov (United States)

    Figueiredo, Daniel; Martins, Manuel A; Chaves, Madalena

    2017-09-01

    Qualitative and quantitative modeling frameworks are widely used for analysis of biological regulatory networks, the former giving a preliminary overview of the system's global dynamics and the latter providing more detailed solutions. Another approach is to model biological regulatory networks as hybrid systems, i.e., systems which can display both continuous and discrete dynamic behaviors. Actually, the development of synthetic biology has shown that this is a suitable way to think about biological systems, which can often be constructed as networks with discrete controllers, and present hybrid behaviors. In this paper we discuss this approach as a special case of the reconfigurability paradigm, well studied in Computer Science (CS). In CS there are well developed computational tools to reason about hybrid systems. We argue that it is worth applying such tools in a biological context. One interesting tool is differential dynamic logic (dL), which has recently been developed by Platzer and applied to many case-studies. In this paper we discuss some simple examples of biological regulatory networks to illustrate how dL can be used as an alternative, or also as a complement to methods already used. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Can the FAST and ROSIER adult stroke recognition tools be applied to confirmed childhood arterial ischemic stroke?

    Directory of Open Access Journals (Sweden)

    Babl Franz E

    2011-10-01

    Background: Stroke recognition tools have been shown to improve diagnostic accuracy in adults. Development of a similar tool in children is needed to reduce lag time to diagnosis. A critical first step is to determine whether adult stroke scales can be applied in childhood stroke. Our objective was to assess the applicability of adult stroke scales in childhood arterial ischemic stroke (AIS). Methods: Children aged 1 month to Results: 47 children with AIS were identified. 34 had anterior, 12 had posterior, and 1 child had both anterior and posterior circulation infarcts. Median age was 9 years and 51% were male. Median time from symptom onset to ED presentation was 21 hours, but one third of children presented within 6 hours. The most common presenting stroke symptoms were arm weakness (63%), face weakness (62%), leg weakness (57%), speech disturbance (46%) and headache (46%). The most common signs were arm (61%), face (70%) or leg weakness (57%) and dysarthria (34%). 36 (78%) of children had at least one positive variable on FAST and 38 (81%) had a positive score of ≥1 on the ROSIER scale. Positive scores were less likely in children with posterior circulation stroke. Conclusion: The presenting features of pediatric stroke appear similar to adult strokes. Two adult stroke recognition tools have fair to good sensitivity in radiologically confirmed childhood AIS but require further development and modification. Specificity of the tools also needs to be determined in a prospective cohort of children with stroke and non-stroke brain attacks.
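
    For reference, the adult ROSIER items and weights (as published by Nor et al., 2005; taken here as an assumption, since the study argues the tool needs modification for children) can be encoded directly:

        # Published adult ROSIER weights; treat these values as an assumption,
        # not a pediatric validation.
        ROSIER_ITEMS = {
            "loss_of_consciousness": -1,
            "seizure_activity":      -1,
            "facial_weakness":       +1,
            "arm_weakness":          +1,
            "leg_weakness":          +1,
            "speech_disturbance":    +1,
            "visual_field_defect":   +1,
        }

        def rosier_score(findings):
            """Sum the weights of the findings recorded as present."""
            return sum(w for item, w in ROSIER_ITEMS.items() if findings.get(item))

        findings = {"facial_weakness": True, "arm_weakness": True,
                    "speech_disturbance": True}
        score = rosier_score(findings)
        print(score, "-> positive (>= 1)" if score >= 1 else "-> negative")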

  19. AN ANALYSIS OF THE CAUSES OF PRODUCT DEFECTS USING QUALITY MANAGEMENT TOOLS

    Directory of Open Access Journals (Sweden)

    Katarzyna MIDOR

    2014-10-01

    To maintain or strengthen its position in the market, a modern business needs to follow the principles of quality control in its actions. Especially important is the Zero Defects concept developed by Philip Crosby, which means flawless production. The concept consists of preventing the occurrence of defects and flaws in all production stages. To achieve that, we must, among other things, make use of quality management tools. This article presents an analysis of the reasons for the return of damaged or faulty goods in the automotive industry by means of quality management tools such as the Ishikawa diagram and Pareto analysis, which allow us to identify the causes of product defectiveness. Based on the results, preventive measures have been proposed. The actions presented in this article and the results of the analysis prove the effectiveness of the aforementioned quality management tools.

  20. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no "one-size-fits-all" solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  1. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  2. Analysis of Phoenix Anomalies and IV & V Findings Applied to the GRAIL Mission

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    NASA IV&V was established in 1993 to improve safety and cost-effectiveness of mission-critical software. Since its inception the tools and strategies employed by IV&V have evolved. This paper examines how lessons learned from the Phoenix project were developed and applied to the GRAIL project. Shortly after selection, the GRAIL project initiated a review of the issues documented by IV&V for Phoenix. The motivation was twofold: to learn as much as possible about the types of issues that arose from the flight software product line slated for use on GRAIL, and to identify opportunities for improving the effectiveness of IV&V on GRAIL. The IV&V Facility provided a database dump containing 893 issues. These were categorized into 16 bins, and then analyzed according to whether the project responded by changing the affected artifacts or using them as-is. The results of this analysis were compared to a similar assessment of post-launch anomalies documented by the project. Results of the analysis were discussed with the IV&V team assigned to GRAIL. These discussions led to changes in the way both the project and IV&V approached the IV&V task, and improved the efficiency of the activity.

  3. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    Science.gov (United States)

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  4. Multispectral analysis tools can increase utility of RGB color images in histology

    Science.gov (United States)

    Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard

    2018-04-01

    Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools is demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
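
    As a minimal illustration of one of the analysis approaches this record names, the sketch below applies supervised linear unmixing to a single 3-channel (RGB) pixel via non-negative least squares; the endmember responses and the pixel value are invented, and this is not code from the paper.

```python
# Minimal sketch of supervised linear unmixing on 3-channel (RGB) data.
# The endmember "spectra" (RGB responses of two hypothetical stains) and the
# observed pixel are invented for illustration.
import numpy as np
from scipy.optimize import nnls

# Columns = endmembers, rows = R, G, B responses (hypothetical, unit-normalized)
E = np.array([[0.80, 0.10],
              [0.15, 0.60],
              [0.05, 0.30]])

pixel = np.array([0.47, 0.30, 0.13])  # observed RGB value of one pixel

# Non-negative least squares: find abundances a >= 0 minimizing ||E @ a - pixel||
abundances, residual = nnls(E, pixel)
print("abundances:", abundances, "residual:", residual)
```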

  5. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    Science.gov (United States)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the
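
    TaskDL, mpiDL and GPULib are IDL tools; as a language-neutral illustration of the task-farm pattern that TaskDL manages (many independent tasks, minimal data communication, results collected centrally), here is a minimal Python sketch.

```python
# Illustration of the task-farm pattern (independent tasks, little inter-task
# communication, results gathered by a master process), sketched in plain Python.
# This is not TaskDL itself; the per-task computation is a stand-in.
from multiprocessing import Pool

def analyze_frame(frame_id):
    # stand-in for an expensive, independent data-analysis task
    return frame_id, sum(i * i for i in range(10_000))

if __name__ == "__main__":
    with Pool(processes=4) as pool:          # four workers form the "farm"
        results = pool.map(analyze_frame, range(100))
    print(f"processed {len(results)} independent tasks")
```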

  6. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nano-materials (ENMs) requires tools for rapid and reliable processing and analysis of large HTS datasets. In order to meet this need, a web-based platform of HTS data analysis tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat maps and SOMs. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure–activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of and future advances in HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets. (paper)
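
    As an illustration of the kind of plate normalization HDAT offers, the sketch below applies per-plate Z-scores to simulated raw signals; the choice of method and the data are assumptions for illustration, not taken from the paper.

```python
# One common plate-normalization method of the kind HDAT provides: per-plate
# Z-scores, which make well signals comparable across plates. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
# 2 plates x 384 wells, with different per-plate means and scales
plates = rng.normal(loc=[[100.0], [250.0]], scale=[[10.0], [40.0]],
                    size=(2, 384))

mean = plates.mean(axis=1, keepdims=True)       # per-plate mean
std = plates.std(axis=1, ddof=1, keepdims=True)  # per-plate standard deviation
z = (plates - mean) / std                        # per-plate Z-score
print(z.mean(axis=1), z.std(axis=1, ddof=1))     # ~0 and ~1 for each plate
```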

  7. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    Science.gov (United States)

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite T4M also includes a script that generates Metatool input, CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex-basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
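
    For orientation, the objects these tools parse all derive from the steady-state condition S·v = 0 on a stoichiometric matrix S. The toy sketch below only computes the rational null space of such a matrix; enumerating convex bases or elementary flux modes additionally imposes non-negativity and support-minimality constraints, which require dedicated algorithms such as Metatool's.

```python
# Steady-state flux condition S @ v = 0 for a toy network, solved as a null-space
# computation with sympy. This is only the linear-algebra starting point of what
# Metatool computes, not an elementary-flux-mode enumeration.
from sympy import Matrix

# Toy network: A -> B -> C, with boundary reactions supplying A and removing C
S = Matrix([
    [ 1, -1,  0,  0],   # A: produced by R1, consumed by R2
    [ 0,  1, -1,  0],   # B: produced by R2, consumed by R3
    [ 0,  0,  1, -1],   # C: produced by R3, consumed by R4
])

for basis_vector in S.nullspace():
    print(basis_vector.T)   # here: a single mode carrying equal flux through R1..R4
```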

  8. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the way products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools from the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout the four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but in an unbalanced and non-holistic way. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  9. Situational Awareness Analysis Tools for Aiding Discovery of Security Events and Patterns

    National Research Council Canada - National Science Library

    Kumar, Vipin; Kim, Yongdae; Srivastava, Jaideep; Zhang, Zhi-Li; Shaneck, Mark; Chandola, Varun; Liu, Haiyang; Choi, Changho; Simon, Gyorgy; Eilertson, Eric

    2005-01-01

    .... The University of Minnesota team has developed a comprehensive, multi-stage analysis framework which provides tools and analysis methodologies to aid cyber security analysts in improving the quality...

  10. Risk D and D Rapid Prototype: Scenario Documentation and Analysis Tool

    International Nuclear Information System (INIS)

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-01-01

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health and safety risk analysis for decontamination and decommissioning projects. The objective of the Decontamination and Decommissioning (D and D) Risk Management Evaluation and Work Sequencing Standardization Project under DOE EM-23 is to recommend or develop practical risk-management tools for decommissioning of nuclear facilities. PNNL has responsibility under this project for recommending or developing computer-based tools that facilitate the evaluation of risks in order to optimize the sequencing of D and D work. PNNL's approach is to adapt, augment, and integrate existing resources rather than to develop a new suite of tools. Methods for the evaluation of H and S risks associated with work in potentially hazardous environments are well established. Several approaches exist which, collectively, are referred to as process hazard analysis (PHA). A PHA generally involves the systematic identification of accidents, exposures, and other adverse events associated with a given process or work flow. This identification process is usually achieved in a brainstorming environment or by other means of eliciting informed opinion. The likelihoods of adverse events (scenarios) and their associated consequence severities are estimated against pre-defined scales, based on which risk indices are then calculated. A similar process is encoded in various project risk software products that facilitate the quantification of schedule and cost risks associated with adverse scenarios. However, risk models do not generally capture both project risk and H and S risk. The intent of the project reported here is to produce a tool that facilitates the elicitation, characterization, and documentation of both project risk and H and S risk based on defined sequences of D and D activities. By considering alternative D and D sequences, comparison of the predicted risks can
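
    As an illustration of the risk-index calculation described above, the sketch below scores invented scenarios on ordinal likelihood and severity scales and ranks them by a simple product index; the scales, the scenarios and the index definition are assumptions, not the report's.

```python
# Minimal sketch of a PHA-style risk index: scenario likelihood and consequence
# severity scored on predefined ordinal scales, combined into an index. The
# scenarios and scores are invented, and the product L*S is only one possible
# index definition.
scenarios = [
    # (description,                      likelihood 1-5, severity 1-5)
    ("airborne contamination release",   2,              5),
    ("worker fall during dismantling",   3,              3),
    ("schedule slip from waste backlog", 4,              2),
]

ranked = sorted(((desc, l * s) for desc, l, s in scenarios),
                key=lambda item: item[1], reverse=True)
for desc, risk_index in ranked:
    print(f"{risk_index:2d}  {desc}")
```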

  11. Forest and fibre genomics: biotechnology tools for applied tree ...

    African Journals Online (AJOL)

    A milestone for eucalypt research, the project will facilitate the development of new biotechnology tools that will accelerate the domestication, improvement and ... The application of DNA fingerprinting in eucalypt breeding programmes represented an early technology delivery to industry with practical, short-term benefits, ...

  12. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  13. Applied behavior analysis: understanding and changing behavior in the community-a representative review.

    Science.gov (United States)

    Luyben, Paul D

    2009-01-01

    Applied behavior analysis, a psychological discipline, has been characterized as the science of behavior change (Chance, 2006). Research in applied behavior analysis has been published for approximately 40 years since the initial publication of the Journal of Applied Behavior Analysis in 1968. The field now encompasses a wide range of human behavior. Although much of the published research centers on problem behaviors that occur in schools and among people with disabilities, a substantial body of knowledge has emerged in community settings. This article provides a review of the behavioral community research published in the Journal of Applied Behavior Analysis as representative of this work, including research in the areas of home and family, health, safety, community involvement and the environment, recreation and sports, crime and delinquency, and organizations. In the interest of space, research in schools and with people with disabilities has been excluded from this review.

  14. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  15. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessment tends to cover potential candidate software and subsequently c...

  16. CAVER 3.0: a tool for the analysis of transport pathways in dynamic protein structures.

    Science.gov (United States)

    Chovancova, Eva; Pavelka, Antonin; Benes, Petr; Strnad, Ondrej; Brezovsky, Jan; Kozlikova, Barbora; Gora, Artur; Sustr, Vilem; Klvana, Martin; Medek, Petr; Biedermannova, Lada; Sochor, Jiri; Damborsky, Jiri

    2012-01-01

    Tunnels and channels facilitate the transport of small molecules, ions and water solvent in a large variety of proteins. Characteristics of individual transport pathways, including their geometry, physico-chemical properties and dynamics are instrumental for understanding of structure-function relationships of these proteins, for the design of new inhibitors and construction of improved biocatalysts. CAVER is a software tool widely used for the identification and characterization of transport pathways in static macromolecular structures. Herein we present a new version of CAVER enabling automatic analysis of tunnels and channels in large ensembles of protein conformations. CAVER 3.0 implements new algorithms for the calculation and clustering of pathways. A trajectory from a molecular dynamics simulation serves as the typical input, while detailed characteristics and summary statistics of the time evolution of individual pathways are provided in the outputs. To illustrate the capabilities of CAVER 3.0, the tool was applied for the analysis of molecular dynamics simulation of the microbial enzyme haloalkane dehalogenase DhaA. CAVER 3.0 safely identified and reliably estimated the importance of all previously published DhaA tunnels, including the tunnels closed in DhaA crystal structures. Obtained results clearly demonstrate that analysis of molecular dynamics simulation is essential for the estimation of pathway characteristics and elucidation of the structural basis of the tunnel gating. CAVER 3.0 paves the way for the study of important biochemical phenomena in the area of molecular transport, molecular recognition and enzymatic catalysis. The software is freely available as a multiplatform command-line application at http://www.caver.cz.

  17. CAVER 3.0: a tool for the analysis of transport pathways in dynamic protein structures.

    Directory of Open Access Journals (Sweden)

    Eva Chovancova

    Full Text Available Tunnels and channels facilitate the transport of small molecules, ions and water solvent in a large variety of proteins. Characteristics of individual transport pathways, including their geometry, physico-chemical properties and dynamics are instrumental for understanding of structure-function relationships of these proteins, for the design of new inhibitors and construction of improved biocatalysts. CAVER is a software tool widely used for the identification and characterization of transport pathways in static macromolecular structures. Herein we present a new version of CAVER enabling automatic analysis of tunnels and channels in large ensembles of protein conformations. CAVER 3.0 implements new algorithms for the calculation and clustering of pathways. A trajectory from a molecular dynamics simulation serves as the typical input, while detailed characteristics and summary statistics of the time evolution of individual pathways are provided in the outputs. To illustrate the capabilities of CAVER 3.0, the tool was applied for the analysis of molecular dynamics simulation of the microbial enzyme haloalkane dehalogenase DhaA. CAVER 3.0 safely identified and reliably estimated the importance of all previously published DhaA tunnels, including the tunnels closed in DhaA crystal structures. Obtained results clearly demonstrate that analysis of molecular dynamics simulation is essential for the estimation of pathway characteristics and elucidation of the structural basis of the tunnel gating. CAVER 3.0 paves the way for the study of important biochemical phenomena in the area of molecular transport, molecular recognition and enzymatic catalysis. The software is freely available as a multiplatform command-line application at http://www.caver.cz.

  18. CAVER 3.0: A Tool for the Analysis of Transport Pathways in Dynamic Protein Structures

    Science.gov (United States)

    Strnad, Ondrej; Brezovsky, Jan; Kozlikova, Barbora; Gora, Artur; Sustr, Vilem; Klvana, Martin; Medek, Petr; Biedermannova, Lada; Sochor, Jiri; Damborsky, Jiri

    2012-01-01

    Tunnels and channels facilitate the transport of small molecules, ions and water solvent in a large variety of proteins. Characteristics of individual transport pathways, including their geometry, physico-chemical properties and dynamics are instrumental for understanding of structure-function relationships of these proteins, for the design of new inhibitors and construction of improved biocatalysts. CAVER is a software tool widely used for the identification and characterization of transport pathways in static macromolecular structures. Herein we present a new version of CAVER enabling automatic analysis of tunnels and channels in large ensembles of protein conformations. CAVER 3.0 implements new algorithms for the calculation and clustering of pathways. A trajectory from a molecular dynamics simulation serves as the typical input, while detailed characteristics and summary statistics of the time evolution of individual pathways are provided in the outputs. To illustrate the capabilities of CAVER 3.0, the tool was applied for the analysis of molecular dynamics simulation of the microbial enzyme haloalkane dehalogenase DhaA. CAVER 3.0 safely identified and reliably estimated the importance of all previously published DhaA tunnels, including the tunnels closed in DhaA crystal structures. Obtained results clearly demonstrate that analysis of molecular dynamics simulation is essential for the estimation of pathway characteristics and elucidation of the structural basis of the tunnel gating. CAVER 3.0 paves the way for the study of important biochemical phenomena in the area of molecular transport, molecular recognition and enzymatic catalysis. The software is freely available as a multiplatform command-line application at http://www.caver.cz. PMID:23093919

  19. B. F. Skinner's Contributions to Applied Behavior Analysis

    Science.gov (United States)

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  20. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.

    1995-01-01

    The evaluation and certification of packages for the transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages that were certified have used a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data

  1. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Full Text Available The virtual screening is an effective tool for lead identification in drug discovery. However, there are limited numbers of crystal structures available compared to the number of biological sequences, which makes Structure-Based Drug Discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening followed by pharmacology-based prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding site identification, automated docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in the characterization of ligand selectivity and of the action of ligands on a complex biological molecular network as well as on individual receptors. The judicious combination of ligands binding different receptors can be used to inhibit selective biological pathways in a disease. This tool also allows the user to systemically investigate network-dependent effects of a drug or drug candidate.

  2. Design of the ITER tokamak assembly tools

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hyunki [National Fusion Research Institute, 52 Eoeun-Dong, Yuseong-Gu, Daejon 305-333 (Korea, Republic of)], E-mail: hkpark@nfri.re.kr; Lee, Jaehyuk; Kim, Taehyung [SFA Engineering Corp., 42-7 Palyong-dong, Changwon-si, Gyeongsangnam-do 641-847 (Korea, Republic of); Song, Yunju [National Fusion Research Institute, 52 Eoeun-Dong, Yuseong-Gu, Daejon 305-333 (Korea, Republic of); Im, Kihak [ITER Organization, CEA Cadarasche, 13108 Saint Paul-lez-Durance (France); Kim, Byungchul; Lee, Hyeongon; Jung, Ki-Jung [National Fusion Research Institute, 52 Eoeun-Dong, Yuseong-Gu, Daejon 305-333 (Korea, Republic of)

    2008-12-15

    ITER tokamak assembly is mainly composed of lower cryostat activities, sector sub-assembly, sector assembly, in-vessel activities and ex-vessel activities. The main tools for the sector sub-assembly procedures consist of the upending tool, the sector lifting tool, the vacuum vessel support and bracing tool, and the sector sub-assembly tool. The conceptual design of the assembly tools for the sector sub-assembly procedures is described herein. The basic structure of the upending tool has been developed under the assumption that upending is performed with the crane that will be installed in the Tokamak building. The sector lifting tool is designed to adjust the position of a sector so as to minimize the difference between the center of the tokamak building crane and the center of gravity of the sector. The sector sub-assembly tool is composed of a special frame for fine position control with 6 degrees of freedom. The design of the VV support and bracing tool for the four kinds of VV 40 deg. sectors has been developed. Also, structural analyses of the upending tool and the sector sub-assembly tool have been performed using ANSYS for an applied load equal to the dead weight multiplied by 3/4. The results of the structural analyses for these tools were below the allowable values.

  3. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  4. Information contained within the large scale gas injection test (Lasgit) dataset exposed using a bespoke data analysis tool-kit

    International Nuclear Information System (INIS)

    Bennett, D.P.; Thomas, H.R.; Cuss, R.J.; Harrington, J.F.; Vardon, P.J.

    2012-01-01

    with time, for use as a small scale event indicator; a non-parametric time-series component analysis technique (Singular Spectrum Analysis - SSA) for trend identification; and a unique Non-uniform Discrete Fourier Transformation (NDFT) technique that is suited to a non-uniformly sampled time-series input. Specific details of the implementations of these techniques are outlined. As a result of the application of the developed tool-kit, a number of easily observable and quantifiable phenomena are revealed, for example: - The locations of a number of small-scale anomalous behaviours of potential interest are highlighted; - Frequency, amplitude and phase of highly cyclic sensors are deterministically established; - Long term trends in each sensor series are identified, revealing the residual forms of the sensor records without the long term behaviour superimposed. Re-application of the tool-kit to the residual time series determined by the initial application further reveals information of potential interest from the dataset. For example, small scale events as indicated by the noise parameterization process and frequency information determined by the NDFT process are less likely to be masked by the long-term variation in the original sensor record. Results of these improvements are also presented within the manuscript. Initial interpretation of the information exposed by the EDA performed through application of the developed tool-kit is presented. The qualitative results, i.e. the event indicators, are tentatively associated with experimental procedure or response, e.g. changes in noise floor correlated with hydraulic over-pressurisation down-hole. The quantitative results, i.e. the frequency information, are used to estimate the effect of environmental conditions on the experimental set-up. While manipulation of a dataset to this extent can expose valuable information useful in further analysis, care must be given to ensure the phenomena revealed are not a
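
    As an illustration of the non-uniform transform this record describes, the sketch below directly evaluates X(f) = Σ_n x_n exp(−2πi f t_n) on a frequency grid for a synthetic, non-uniformly sampled series; it is a generic NDFT, not the tool-kit's implementation.

```python
# Direct evaluation of a non-uniform discrete Fourier transform: the series
# x(t_n) is sampled at arbitrary times t_n and the transform is evaluated on a
# chosen frequency grid. The test signal is synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 100.0, size=400))      # non-uniform sample times
x = np.sin(2 * np.pi * 0.05 * t) + 0.3 * rng.normal(size=t.size)

freqs = np.linspace(0.0, 0.5, 1000)                 # frequency grid [cycles/unit t]
# X(f) = sum_n x_n * exp(-2*pi*i*f*t_n), evaluated for every f on the grid
X = np.exp(-2j * np.pi * np.outer(freqs, t)) @ x

peak = freqs[np.argmax(np.abs(X[1:])) + 1]          # skip the f = 0 bin
print(f"dominant frequency ~ {peak:.3f} (true value 0.05)")
```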

  5. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    Science.gov (United States)

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
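
    As an illustration of one step MetaGenyo automates, the sketch below runs a chi-square test of Hardy-Weinberg equilibrium on invented genotype counts from a control group.

```python
# Chi-square test of Hardy-Weinberg equilibrium for one biallelic variant.
# The genotype counts are invented for illustration.
import numpy as np
from scipy.stats import chi2

n_AA, n_Aa, n_aa = 180, 95, 25                  # observed genotype counts
n = n_AA + n_Aa + n_aa
p = (2 * n_AA + n_Aa) / (2 * n)                 # allele frequency of A

expected = np.array([p**2, 2 * p * (1 - p), (1 - p)**2]) * n
observed = np.array([n_AA, n_Aa, n_aa])

chi2_stat = np.sum((observed - expected) ** 2 / expected)
# df = 3 genotype classes - 1 - 1 estimated allele frequency = 1
p_value = chi2.sf(chi2_stat, df=1)
print(f"chi2 = {chi2_stat:.3f}, p = {p_value:.3f}")
```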

  6. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference

  7. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy to use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification properties.The current status of SBGAT is as follows:The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK, Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and addition of new features. Note that SBGAT Core can be utilized independently from SBGAT Gui.SBGAT is presently being hosted on a GitHub repository owned by SBGAT’s main developer. This repository is public and can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html. This code documentation is constantly updated in order to reflect new functionalities.SBGAT’s user’s manual is available at https://github.com/bbercovici/SBGAT/wiki. This document contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch.Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented. Future work will therefore consist of broadening SBGAT’s capabilities with the Spherical Harmonics Expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  8. X-ray fluorescence spectrometry applied to soil analysis

    International Nuclear Information System (INIS)

    Salvador, Vera Lucia Ribeiro; Sato, Ivone Mulako; Scapin Junior, Wilson Santo; Scapin, Marcos Antonio; Imakima, Kengo

    1997-01-01

    This paper studies X-ray fluorescence spectrometry applied to soil analysis. A comparative study of the WD-XRFS and ED-XRFS techniques was carried out using the following soil samples: SL-1, SOIL-7 and marine sediment SD-M-2/TM from the IAEA, and clay JG-1a from the Geological Survey of Japan (GSJ)

  9. Neutron activation analysis applied to energy and environment

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1975-01-01

    Neutron activation analysis was applied to a number of problems concerned with energy production and the environment. Burning of fossil fuel, the search for new sources of uranium, possible presence of toxic elements in food and water, and the relationship of trace elements to cardiovascular disease are some of the problems in which neutron activation was used. (auth)

  10. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  11. IMPORTANCE OF APPLYING DATA ENVELOPMENT ANALYSIS IN CASE OF HIGHER EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Labas Istvan

    2015-07-01

    Full Text Available Today, higher educational institutions must increasingly be characterized by strong goal rationalism, owing to scarce resources and the limitlessness of user demands. In Hungary, the funding of the higher educational system is currently undergoing a complete transformation, so leadership has to reckon continuously with changes in the environment and modify existing goals accordingly. It is becoming more and more important to measure the effectiveness of organizations, and of organizational units pursuing the same or similar activities, relative to each other. Benchmarking helps this procedure: it is a tool of analysis and planning that allows organizations to be compared with the best of their competitors. Applying the method to higher educational institutions is a procedure that focuses on comparing the processes and results of the institutions' different functional areas in order to bring to light opportunities for rationalization as well as for quality and performance improvement. Practices already developed and applied by other organizations can be managed and used as breakthrough opportunities on the way to more effective management. The main goal of my monograph is to show an application of the Data Envelopment Analysis (DEA) method in higher education. DEA is a performance measurement methodology, part of the benchmarking toolbox, that uses linear programming as its method. By means of its application, the effectiveness of different decision-making units can be compared numerically. In our forcefully varying environment, managerial decision making can be largely supported by information that numerically identifies which organizational units and activities are effective or less effective. Its advantage is that
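
    As an illustration of the DEA calculation this record advocates, the sketch below solves the input-oriented CCR model (envelopment form) for each decision-making unit with scipy's linear-programming routine; the three institutions and their input/output figures are invented.

```python
# Input-oriented CCR DEA model in envelopment form, solved DMU by DMU:
#   min theta  s.t.  sum_j lambda_j x_ij <= theta * x_io (each input i),
#                    sum_j lambda_j y_rj >= y_ro        (each output r),
#                    lambda_j >= 0.
# The institutions and their figures are invented example data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[40.0, 35.0, 60.0],      # inputs: e.g. staff (one row per input)
              [5.0,  4.0,  9.0]])      #         budget
Y = np.array([[300.0, 260.0, 480.0]])  # outputs: e.g. graduates (one row per output)
n = X.shape[1]                          # number of decision-making units (DMUs)

for o in range(n):
    # variables z = [theta, lambda_1 .. lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(X.shape[0])
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    print(f"DMU {o}: efficiency = {res.x[0]:.3f}")
```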

  12. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides us with the possibility of relating histopathological data to neuropsychological and clinical variables. The aid of this interactive visualization tool gives us the possibility of finding unexpected conclusions beyond the insight provided by simple statistical analysis, as well as of improving neuroscientists’ productivity.

  13. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  14. Spaceborne Differential SAR Interferometry: Data Analysis Tools for Deformation Measurement

    Directory of Open Access Journals (Sweden)

    Michele Crosetto

    2011-02-01

    Full Text Available This paper is focused on spaceborne Differential Interferometric SAR (DInSAR for land deformation measurement and monitoring. In the last two decades several DInSAR data analysis procedures have been proposed. The objective of this paper is to describe the DInSAR data processing and analysis tools developed at the Institute of Geomatics in almost ten years of research activities. Four main DInSAR analysis procedures are described, which range from the standard DInSAR analysis based on a single interferogram to more advanced Persistent Scatterer Interferometry (PSI approaches. These different procedures guarantee a sufficient flexibility in DInSAR data processing. In order to provide a technical insight into these analysis procedures, a whole section discusses their main data processing and analysis steps, especially those needed in PSI analyses. A specific section is devoted to the core of our PSI analysis tools: the so-called 2+1D phase unwrapping procedure, which couples a 2D phase unwrapping, performed interferogram-wise, with a kind of 1D phase unwrapping along time, performed pixel-wise. In the last part of the paper, some examples of DInSAR results are discussed, which were derived by standard DInSAR or PSI analyses. Most of these results were derived from X-band SAR data coming from the TerraSAR-X and CosmoSkyMed sensors.
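
    The pixel-wise "1D" half of the 2+1D phase unwrapping described above can be illustrated on a synthetic time series: the interferometric phase is known only modulo 2π and is unwrapped along time. This is a generic sketch, not the Institute of Geomatics code.

```python
# 1D temporal phase unwrapping for a single pixel: the phase observed in the
# interferograms is wrapped into (-pi, pi] and is unwrapped along time.
# The deformation signal here is synthetic.
import numpy as np

t = np.linspace(0.0, 3.0, 40)                   # acquisition times (years)
true_phase = 2 * np.pi * 1.5 * t                # steady deformation signal (radians)
wrapped = np.angle(np.exp(1j * true_phase))     # what the interferograms provide

unwrapped = np.unwrap(wrapped)                  # 1D unwrapping along time
print(np.allclose(unwrapped, true_phase))       # True while phase steps < pi
```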

  15. Customer Data Analysis Model using Business Intelligence Tools in Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Monica LIA

    2015-10-01

    Full Text Available This article presents a customer data analysis model for a telecommunication company, together with business intelligence tools for data modelling, transformation, data visualization and dynamic report building. In a mature market, extracting the information contained in the data and making forecasts for strategic decisions become more important in the Romanian market. Business intelligence tools are used in business organizations as support for decision making.

  16. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools......, satisfying the process and user constraints. A knowledge base consisting of the process knowledge as well as knowledge on measurement methods and tools has been developed. An ontology has been designed for knowledge representation and management. The developed knowledge base has a dual feature. On the one...... procedures has been developed to retrieve the data/information stored in the knowledge base....

  17. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  18. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    Science.gov (United States)

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  19. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  20. Applied Ecosystem Analysis - - a Primer : EDT the Ecosystem Diagnosis and Treatment Method.

    Energy Technology Data Exchange (ETDEWEB)

    Lestelle, Lawrence C.; Mobrand, Lars E.

    1996-05-01

    The aim of this document is to inform and instruct the reader about an approach to ecosystem management that is based upon salmon as an indicator species. It is intended to provide natural resource management professionals with the background information needed to answer questions about why and how to apply the approach. The methods and tools the authors describe are continually updated and refined, so this primer should be treated as a first iteration of a sequentially revised manual.

  1. Analysis of concrete beams using applied element method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements, as in FEM. But in AEM, elements are connected by springs instead of nodes, as is the case in FEM. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate the application of AEM, it has been used to analyse a plain concrete beam with fixed supports. The analysis is limited to 2-dimensional structures. It was found that the number of springs has little influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.
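
    For a concrete sense of the spring connection described above, the sketch below evaluates the spring stiffnesses commonly cited in the AEM literature, K_n = E·d·t/a and K_s = G·d·t/a; these formulas and all numerical values are stated as assumptions to be checked against the paper's own derivation.

```python
# Sketch of the spring stiffnesses used to connect element pairs in AEM, under
# the commonly cited formulas K_n = E*d*t/a (normal) and K_s = G*d*t/a (shear),
# with d the tributary spacing of the springs on the interface, t the element
# thickness and a the distance between the connected element centroids.
# All numbers below are invented for illustration.
E = 30e9                  # Young's modulus of concrete [Pa]
nu = 0.2                  # Poisson's ratio
G = E / (2 * (1 + nu))    # shear modulus [Pa]

a = 0.05                  # distance between element centroids [m]
t = 0.20                  # beam thickness [m]
n_springs = 10            # springs per element interface
d = a / n_springs         # tributary spacing per spring [m]

K_n = E * d * t / a       # normal spring stiffness [N/m]
K_s = G * d * t / a       # shear spring stiffness [N/m]
print(f"K_n = {K_n:.3e} N/m, K_s = {K_s:.3e} N/m")
```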

  2. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of the parametric and non-parametric tests used for data analysis.
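
    As an illustration of the sample-size estimation the article outlines, the sketch below applies the standard normal-approximation formula for comparing two means, n per group = 2(z_{1-α/2} + z_{1-β})²σ²/δ², with example inputs.

```python
# Normal-approximation sample size for a two-sample comparison of means:
#   n per group = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2.
# The inputs are example values.
import math
from scipy.stats import norm

alpha, power = 0.05, 0.80
sigma = 10.0              # assumed common standard deviation
delta = 5.0               # minimum difference in means worth detecting

z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)
n_per_group = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
print(f"n per group ~ {math.ceil(n_per_group)}")   # ~63 before any t-correction
```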

  3. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of the industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline, where this method was successfully employed, is presented. Although this pipeline traverses densely populated areas in Switzerland, the risk problems could be solved without delaying the planning process by using this established communication method. (authors)

  4. Decision-support tools for climate change mitigation planning

    DEFF Research Database (Denmark)

    Puig, Daniel; Aparcana Robles, Sandra Roxana

    This document describes three decision-support tools that can aid the process of planning climate change mitigation actions. The phrase ‘decision-support tools’ refers to science-based analytical procedures that facilitate the evaluation of planning options (individually or compared to alternative options) against a particular evaluation criterion or set of criteria. Most often decision-support tools are applied with the help of purpose-designed software packages and drawing on specialised databases. The evaluation criteria alluded to above define and characterise each decision-support tool. For example, in the case of life-cycle analysis, the evaluation criterion entails that the impacts of interest are examined across the entire life-cycle of the product under study, from extraction of raw materials to product disposal. Effectively, then, the choice of decision-support tool directs...

  5. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  6. Integrated analysis tools for trade studies of spacecraft controller and sensor locations

    Science.gov (United States)

    Rowell, L. F.

    1986-01-01

    The present investigation was conducted with the aim of evaluating the practicality and difficulties of modern control design methods for large space structure control. The evaluation is used as a basis for the identification of useful computer-based analysis tools that would provide insight into the control characteristics of a spacecraft concept. A description is presented of the wrap-rib antenna and its packaging concept. Attention is given to active control requirements, a mathematical model of the structural dynamics, aspects of sensor and actuator location, the analysis approach, controllability, observability, the concept of balanced realization, transmission zeros, singular value plots, analysis results, model reduction, and an interactive computer program. It is pointed out that the application of selected control analysis tools to the wrap-rib antenna demonstrates several capabilities which can be useful during conceptual design.

  7. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    Science.gov (United States)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
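
    The wrap-execute-extract pattern described above can be sketched generically (this is not the RCOTOOLS API): fill a templated input file, run the external code, and pull a response value back out of its text output. The command name, file names, template tags and regular expression are all hypothetical.

```python
# Generic file-wrapper pattern for a file-based analysis code: update a templated
# input file with design-variable values, execute the code, extract a response
# from its text output. "analysis_code", the file names, the {rotor_radius}-style
# tags and the regex are placeholders, not RCOTOOLS/NDARC/CAMRAD II specifics.
import re
import subprocess
from pathlib import Path

def run_case(template: str, design_vars: dict, workdir: Path) -> float:
    text = template
    for name, value in design_vars.items():
        text = text.replace(f"{{{name}}}", f"{value:.6g}")  # fill {rotor_radius} etc.
    (workdir / "case.inp").write_text(text)

    subprocess.run(["analysis_code", "case.inp"], cwd=workdir, check=True)

    out = (workdir / "case.out").read_text()
    match = re.search(r"GROSS WEIGHT\s*=\s*([0-9.Ee+-]+)", out)
    assert match, "response variable not found in output file"
    return float(match.group(1))
```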

  8. International publication trends in the Journal of Applied Behavior Analysis: 2000-2014.

    Science.gov (United States)

    Martin, Neil T; Nosik, Melissa R; Carr, James E

    2016-06-01

    Dymond, Clarke, Dunlap, and Steiner's (2000) analysis of international publication trends in the Journal of Applied Behavior Analysis (JABA) from 1970 to 1999 revealed low numbers of publications from outside North America, leading the authors to express concern about the lack of international involvement in applied behavior analysis. They suggested that a future review would be necessary to evaluate any changes in international authorship in the journal. As a follow-up, we analyzed non-U.S. publication trends in the most recent 15 years of JABA and found similar results. We discuss potential reasons for the relative paucity of international authors and suggest potential strategies for increasing non-U.S. contributions to the advancement of behavior analysis. © 2015 Society for the Experimental Analysis of Behavior.

  9. Bottom-Up modeling, a tool for decision support for long-term policy on energy and environment - The TIMES model applied to the energy intensive industries

    International Nuclear Information System (INIS)

    Djemaa, A.

    2009-01-01

    Among the energy users in France and Europe, some industrial sectors are very important and should play a key role when assessing future final energy demand patterns. The aim of our work is to apply a prospective model for the long-range analysis of energy/technology choices in the industrial sector, focusing on the energy-intensive sectors. The modelling tool applied in this study is the TIMES model (of the well-known MARKAL model family). It is an economic linear-programming model generator for local, national or multi-regional energy systems, which provides a technology-rich basis for estimating energy dynamics over a long-term, multi-period horizon. We illustrate our work with nine energy-intensive industrial sectors: paper, steel, glass, cement, lime, tiles, brick, ceramics and plaster. The model includes a detailed description of the processes involved in the production of industrial products, providing typical energy uses in each process step. In our analysis, we identified for each industry several commercially available state-of-the-art technologies, characterized and chosen by the model on the basis of cost effectiveness. Furthermore, we calculated potential energy savings and reductions in carbon dioxide emissions, and we estimated the energy impact of a radical technological change. This work indicates that a significant potential for energy savings and carbon dioxide emission reductions still exists in all industries. (author)
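
    As a toy illustration of the cost-effectiveness-driven technology choice such a linear-programming model generator makes, the sketch below selects a production mix that meets demand at minimum cost under an energy budget. All numbers and the two-constraint structure are invented for illustration; the TIMES model itself is vastly richer.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data (not TIMES inputs): three candidate cement-kiln
# technologies with unit costs [EUR/t] and unit energy use [GJ/t].
cost = np.array([38.0, 42.0, 55.0])
energy = np.array([3.6, 3.1, 2.4])
demand = 100.0        # required output, t
energy_cap = 330.0    # available energy budget, GJ

# linprog minimizes cost @ x subject to A_ub @ x <= b_ub.
res = linprog(
    c=cost,
    A_ub=np.vstack([-np.ones(3), energy]),  # row 1: meet demand; row 2: energy cap
    b_ub=np.array([-demand, energy_cap]),
    bounds=[(0, None)] * 3,
)
print(res.x, res.fun)  # cost-optimal technology mix and total cost
```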

  10. Molecular polymorphism as a tool for differentiating ground beetles (Carabus species): application of ubiquitin PCR/SSCP analysis.

    Science.gov (United States)

    Boge, A; Gerstmeier, R; Einspanier, R

    1994-11-01

    Differentiation between Carabus species (ground beetles) and subspecies is difficult, despite extensive studies. To address this problem we applied PCR in combination with SSCP analysis, focusing on the evolutionarily conserved ubiquitin gene, to elaborate a new approach to molecular differentiation between species. We report that Carabidae possess a ubiquitin gene and that this gene has a multimeric structure. Differential SSCP analysis was performed with the monomeric form of the gene to generate a clear SSCP pattern. This PCR/SSCP approach yielded reproducible patterns throughout our experiments. Comparing different Carabus species (Carabus granulatus, C. irregularis, C. violaceus and C. auronitens), we observed clear interspecies differences but no differences between sexes. Some species showed remarkable differences between individuals. We suggest that the ubiquitin PCR-SSCP technique might be an additional tool for the differentiation of ground beetles.

  11. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
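
    A minimal sketch of the object under discussion: the eigenpairs of the unnormalized graph Laplacian L = D - W, which for an unweighted path graph really are cosine-like Fourier modes (the special case where, as the article notes, the frequency interpretation is exact).

```python
import numpy as np

def laplacian_eigenbasis(W):
    """Eigenpairs of the unnormalized graph Laplacian L = D - W.
    Eigenvalues play the role of frequencies, eigenvectors of Fourier
    basis vectors; on general graphs the analogy is only loose."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    evals, evecs = np.linalg.eigh(L)  # L is symmetric positive semidefinite
    return evals, evecs

# Path graph on 6 vertices: the adjacency matrix of a 1-D lattice.
W = np.zeros((6, 6))
for i in range(5):
    W[i, i + 1] = W[i + 1, i] = 1.0
freqs, modes = laplacian_eigenbasis(W)  # modes are DCT-like cosine vectors
```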

  12. Model Proposition for the Fiscal Policies Analysis Applied in Economic Field

    Directory of Open Access Journals (Sweden)

    Larisa Preda

    2007-05-01

    This paper presents a study of fiscal policy as applied to economic development. Correlations between macroeconomic and fiscal indicators constitute the first step in our analysis. The next step is the proposal of a new model for fiscal and budgetary choices. This model is applied to data from the Romanian case.

  13. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    International Nuclear Information System (INIS)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook

    2007-08-01

    This report is the final report of the I-NERI Project 'Screening of Gas-cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL), 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL), 3) initial scaling and scoping analysis (ANL, KAERI, INL), 4) filtering of TH/safety tools (KAERI, INL), 5) evaluation of TH/safety database information (KAERI, INL, ANL) and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R&D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600 MWth block-type GT-MHR and the 400 MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high-pressure conduction cooling: loss of primary flow), LPCC/air ingress (low-pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed-convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRT generation. For the PIRT phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V&V matrix has been developed. Through the key scoping analysis using the available database, the modeling...

  14. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook (and others)

    2007-08-15

    This report is the final report of the I-NERI Project 'Screening of Gas-cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL), 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL), 3) initial scaling and scoping analysis (ANL, KAERI, INL), 4) filtering of TH/safety tools (KAERI, INL), 5) evaluation of TH/safety database information (KAERI, INL, ANL) and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R&D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600 MWth block-type GT-MHR and the 400 MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high-pressure conduction cooling: loss of primary flow), LPCC/air ingress (low-pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed-convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRT generation. For the PIRT phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V&V matrix has been developed. Through the key scoping analysis using the available database, the modeling...

  15. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  16. BIOIMPEDANCE VECTOR ANALYSIS AS A TOOL FOR DETERMINATION AND ADJUSTMENT OF DRY WEIGHT IN HEMODIALYSIS PATIENTS

    Directory of Open Access Journals (Sweden)

    Ximena Atilano

    2012-06-01

    Hemodialysis (HD) patients are fluid overloaded, even when there is no apparent edema, so knowing the dry weight is vital. No clinical or laboratory parameters are reliable, simple and accessible for this purpose. Bioelectrical impedance has been applied to estimate body fluids and dry weight. The purpose was to use bioimpedance vector analysis (BIVA) as a tool to adjust the intensity of ultrafiltration and achieve dry weight in HD patients. We performed monthly bioimpedance measurements in 24 HD patients, pre- and post-dialysis, for four months. We plotted the patients' vectors on the RXc graph in order to assess hydration status individually and adjust the dry weight. Nutritional status was evaluated by the Bilbrey Index. The dry weight was adjusted in 18 patients, 13 of whom (72%) were able to reach it. The post-dialysis vectors migrated to the upper quadrants, indicating adequate hydration. Post-dialysis vectors at the end of the study were significantly different from baseline (Figure 1). Five patients did not reach dry weight despite the adjustments; four were men with overhydration, three of them severely malnourished. One woman remained dehydrated. In conclusion, impedance vector analysis is a useful tool for adjusting dry weight in hemodialysis patients.

  17. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    Science.gov (United States)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  18. Multivariate data analysis as a fast tool in evaluation of solid state phenomena

    DEFF Research Database (Denmark)

    Jørgensen, Anna Cecilia; Miroshnyk, Inna; Karjalainen, Milja

    2006-01-01

    The amount of information generated can be overwhelming, and the need for more effective data analysis tools is well recognized. The aim of this study was to investigate the use of multivariate data analysis, in particular principal component analysis (PCA), for fast analysis of solid state information. The data sets ... allowed the molecular-level interpretation of the structural changes related to the loss of water, as well as interpretation of the phenomena related to crystallization. The critical temperatures or critical time points were identified easily using principal component analysis. The variables (diffraction angles or wavenumbers) that changed could be identified by careful interpretation of the loadings plots. The PCA approach provides an effective tool for fast screening of solid state information.
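
    A minimal sketch, on synthetic data, of the PCA screening workflow the abstract describes: rows are diffraction patterns or spectra collected during heating, score trajectories flag critical temperatures or time points, and loadings point to the diffraction angles or wavenumbers that change.

```python
import numpy as np
from sklearn.decomposition import PCA

# X: one diffraction pattern (or spectrum) per row, measured while
# heating; columns are diffraction angles / wavenumbers. Synthetic here.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 400))

pca = PCA(n_components=3)
scores = pca.fit_transform(X)   # sample trajectories vs. temperature / time
loadings = pca.components_      # which angles / wavenumbers drive each PC

# A sharp kink in a score trajectory flags a critical temperature or time
# point; large-magnitude loadings identify the peaks involved in the change.
```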

  19. Conception of a PWR simulator as a tool for safety analysis

    International Nuclear Information System (INIS)

    Lanore, J.M.; Bernard, P.; Romeyer Dherbey, J.; Bonnet, C.; Quilchini, P.

    1982-09-01

    A simulator can be a very useful tool for safety analysis, to study accident sequences involving malfunctions of the systems and operator interventions. The main characteristics of the simulator SALAMANDRE (description of the systems, physical models, programming organization, control desk) have therefore been selected according to the objectives of safety analysis.

  20. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  1. Mobility analysis tool based on the fundamental principle of conservation of energy.

    Energy Technology Data Exchange (ETDEWEB)

    Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert

    2007-08-01

    In the past decade, a great deal of effort has been focused on the research and development of versatile robotic ground vehicles without understanding their performance in particular operating environments. As the use of robotic ground vehicles for intelligence applications increases, understanding their mobility becomes critical to increasing the probability of successful operations. This paper describes a framework based on conservation of energy for predicting the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility, defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. The mobility analysis tool described in this document, a graphical user interface application developed at Sandia National Laboratories, Albuquerque, NM, is at an initial stage of development. In the future, the tool will be expanded to include all vehicle and terrain types.
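
    The traction-versus-loss criterion can be made concrete with a toy per-metre energy balance; the functional forms and coefficients below are assumptions for illustration, not the Sandia tool's terrain model.

```python
import math

def mobility_margin(weight_n, mu, resistance_coeff, slope_rad):
    """Toy energy balance per metre travelled (assumed form): traction
    work available minus losses at the vehicle-terrain interface and
    against gravity on a slope."""
    traction = mu * weight_n * math.cos(slope_rad)              # J per metre
    losses = (resistance_coeff * weight_n * math.cos(slope_rad)  # rolling
              + weight_n * math.sin(slope_rad))                  # grade
    return traction - losses  # > 0 suggests the terrain is traversable

print(mobility_margin(weight_n=2000.0, mu=0.6,
                      resistance_coeff=0.15, slope_rad=0.2))
```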

  2. Assessing the Possibility of Implementing Tools of Technical Analysis for Real Estate Market Analysis

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2016-06-01

    Technical analysis (TA) and its different aspects are widely used to study the capital market. In the traditional approach, this analysis is used to determine the probability of changes in current rates on the basis of their past changes, accounting for factors which had, have or may have an influence on shaping the supply and demand of a given asset. In the practical sense, TA is a set of techniques used for assessing the value of an asset based on the analysis of the asset's trajectories as well as statistical tools.

  3. Application of computer tools to the diagnosis of the combustion in motors

    International Nuclear Information System (INIS)

    Agudelo S, John R; Delgado M, Alvaro; Gutierrez V, Elkin

    2001-01-01

    This paper describes the fundamental topics concerning the analysis of the combustion process in internal combustion engines when latest-generation computational tools are employed. To this end, DIATERM has been developed using graphical programming languages. The thermodynamic model on which DIATERM is based is also described, as is the potential of this computational tool when applied to the analysis of pressure data from the combustion chamber of a turbocharged diesel engine, varying the load while keeping rotational speed constant.
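
    The abstract does not give DIATERM's equations, but combustion-diagnosis tools of this kind commonly build on the single-zone first-law apparent heat release computed from measured cylinder pressure; a generic sketch of that standard relation, not DIATERM itself:

```python
import numpy as np

def apparent_heat_release(theta, p, V, gamma=1.35):
    """Single-zone first-law apparent heat release rate from measured
    cylinder pressure p(theta) and cylinder volume V(theta):
        dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta
                  + 1/(gamma-1) * V * dp/dtheta
    theta is crank angle; gamma is an assumed ratio of specific heats."""
    dV = np.gradient(V, theta)
    dp = np.gradient(p, theta)
    return gamma / (gamma - 1.0) * p * dV + 1.0 / (gamma - 1.0) * V * dp
```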

  4. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  5. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  6. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  7. Application of cleaner production tools and failure modes and effects analysis in pig slaughterhouses

    Directory of Open Access Journals (Sweden)

    J. M. Fonseca

    2017-07-01

    Cleaner production (CP) programs and Failure Modes and Effects Analysis (FMEA) are tools used to improve the sustainability of industries, ensuring greater profitability, quality, reliability and safety of their products and services. The meat industry is among the most polluting industries because of the large amounts of organic waste produced during meat processing. The objective of this study was to combine the CP and FMEA tools and apply them in a pig slaughterhouse in order to detect critical points along the production chain that have a major environmental impact, and to establish corrective and preventive actions to minimize these problems. The results showed that water is the resource most consumed by the industry and also the main source of waste, due to microbiological contamination with animal feces, blood and meat residues. All impacts were found to be real due to their daily occurrence in the industry. Their severity, occurrence, detection and coverage were classified as moderate to high, high, low to moderate, and moderate to high, respectively. The application of the CP and FMEA tools was efficient in identifying and evaluating the environmental impacts caused by the slaughter and processing of pork carcasses. Liquid slaughter effluents and solid wastes (blood and bones) are the factors that pose the greatest risks to the environment. The substitution of treatment plant chemicals with decomposing microorganisms, composting, and the production of animal meal and feed from solid waste are appropriate measures the industry could adopt to minimize the contamination of water resources and soil.

  8. Introduction to applied statistical signal analysis: guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for experienced individuals with a basic background in mathematics, science, and computing. With this background, the reader will move quickly through the practical introduction and on to the signal analysis techniques commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical

  9. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  10. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  11. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  12. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Marshall, N.H.; Marwil, E.S.; Matthews, S.D.; Stacey, B.J.

    1990-01-01

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes to determine software quality metrics and improve code quality. Specifically discussed are INEL experiences in applying pretty printers, cross-reference analyzers, and computer-aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential of individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce complexity, achieve higher functional independence, and improve code vectorization. (orig.)

  13. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  14. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  15. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
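
    For orientation, a sketch of the ring-sampling step underlying the radial intensity plot; Ganalyzer's actual implementation may differ, and the peak tracking that converts this into a spirality slope is only indicated in the comments.

```python
import numpy as np

def ring_intensity(img, cx, cy, radius, n_angles=360):
    """Pixel intensity sampled around a circle of given radius about the
    galaxy centre (cx, cy) -- one line of a radial intensity plot.
    Tracking how the angular positions of its peaks shift as the radius
    grows gives a simple spirality measure (a sketch of the idea only)."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, img.shape[0] - 1)
    return angles, img[ys, xs]
```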

  16. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  17. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by a student is a complex process that must meet various requirements to ensure its quality. Psychometric analysis of the data collected by the assessment tools used is fundamental to guaranteeing the student's competence level. The aim was to conduct a psychometric analysis of an instrument that assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), comprising 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). The participants were 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis considered face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were both higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance; these three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients
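
    The abstract does not name the internal-consistency statistic; assuming it is Cronbach's alpha (the conventional choice), here is a minimal sketch of that computation for a students-by-items score matrix.

```python
import numpy as np

def cronbach_alpha(scores):
    """Internal-consistency estimate for an observations x items matrix
    (rows = students, columns = items on the 0/1/2 scale)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1.0) * (1.0 - item_var / total_var)

# Synthetic check mimicking 499 students x 27 items; on random data
# alpha should land near zero, on real correlated items much higher.
rng = np.random.default_rng(0)
print(cronbach_alpha(rng.integers(0, 3, size=(499, 27))))
```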

  18. Modal Analysis and Experimental Determination of Optimum Tool Shank Overhang of a Lathe Machine

    Directory of Open Access Journals (Sweden)

    Nabin SARDAR

    2008-12-01

    The vibration of a cutting tool's shank has a large influence on the tolerances and surface finish of products. The frequency and amplitude of the vibrations depend on the overhang of the tool shank. In turning operations, when the tool overhang is about twice the tool height, the amplitude of the vibration is almost zero and the dimensional tolerances and surface finish of the product are at their best. In this paper, the above statement is verified first by a finite element analysis of the cutting tool with the ANSYS software package and second by experimental verification with a piezoelectric sensor.
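
    The overhang dependence has a textbook backbone: for an Euler-Bernoulli cantilever the first natural frequency falls as 1/L², so doubling the overhang quarters the frequency. A sketch with illustrative shank numbers follows (beam theory is stretched for such stubby shanks, so treat the output as indicative only).

```python
import math

def cantilever_f1(E, I, rho_A, L):
    """First natural frequency [Hz] of an Euler-Bernoulli cantilever of
    length L; (beta1 * L) = 1.875 for the first bending mode."""
    beta_L = 1.875104
    return (beta_L ** 2 / (2.0 * math.pi)) * math.sqrt(E * I / (rho_A * L ** 4))

# Steel shank, 20 mm square section, 40 mm overhang (illustrative numbers):
E = 210e9                    # Young's modulus, Pa
I = 0.02 ** 4 / 12.0         # second moment of area, m^4
rho_A = 7850.0 * 0.02 ** 2   # mass per unit length, kg/m
print(cantilever_f1(E, I, rho_A, 0.04))
```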

  19. LCA-IWM: A decision support tool for sustainability assessment of waste management systems

    International Nuclear Information System (INIS)

    Boer, J. den; Boer, E. den; Jager, J.

    2007-01-01

    The paper outlines the most significant result of the project 'The use of life cycle assessment tools for the development of integrated waste management strategies for cities and regions with rapid growing economies', which was the development of two decision-support tools: a municipal waste prognostic tool and a waste management system assessment tool. The article focuses on the assessment tool, which supports the adequate decision making in the planning of urban waste management systems by allowing the creation and comparison of different scenarios, considering three basic subsystems: (i) temporary storage; (ii) collection and transport and (iii) treatment, disposal and recycling. The design and analysis options, as well as the assumptions made for each subsystem, are shortly introduced, providing an overview of the applied methodologies and technologies. The sustainability assessment methodology used in the project to support the selection of the most adequate scenario is presented with a brief explanation of the procedures, criteria and indicators applied on the evaluation of each of the three sustainability pillars

  20. The Selected Method and Tools for Performance Measurement in the Green Supply Chain—Survey Analysis in Poland

    Directory of Open Access Journals (Sweden)

    Blanka Tundys

    2018-02-01

    Methods and tools for measuring and evaluating the performance of green supply chain management are essential to the construction and operation of this type of supply chain. The result is a very general model presenting selected tools, without a breakdown by individual industries. The considerations have scientific added value because, in practice, a very large number of tools are typically used to assess the supply chain, and these are not always correlated or adapted to the specificity of the chain. It is therefore worth identifying which of the existing or completely new tools and methods will be most useful for assessing the green supply chain. The paper has theoretical and empirical parts and includes an introduction, goals and hypotheses, the state of the art, methodology, empirical findings, and discussion. We present the definitional differences between green and sustainable supply chains and focus on selecting and identifying methods for a framework model for evaluating the green supply chain. The selected theoretical methods and tools were then compared against a survey of Poland. The main methodology included a literature review and a survey analysis using a questionnaire and statistical tools; the survey was carried out in 2015 in sample organizations in Poland. The results showed that organizations were aware of the environmental elements of measuring and assessing the supply chain, but their use depended on many factors: the area, the size of the organization, or the industry. If certain boundary conditions are met and organizations are aware of the importance of environmental aspects in the chain, they apply green measures to the supply chain. These findings...

  1. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  2. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    Energy Technology Data Exchange (ETDEWEB)

    Plott, B. [Alion Science and Technology, MA and D Operation, 4949 Pearl E. Circle, 300, Boulder, CO 80301 (United States)

    2006-07-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  3. Big Data is a powerful tool for environmental improvements in the construction business

    Science.gov (United States)

    Konikov, Aleksandr; Konikov, Gregory

    2017-10-01

    The work investigates the possibility of applying the Big Data method as a tool to implement environmental improvements in the construction business. The method is recognized as effective in analyzing large volumes of heterogeneous data. It is noted that all the preconditions exist for this method to be used successfully to resolve environmental issues in the construction business. It is shown that the principal Big Data techniques (cluster analysis, crowdsourcing, data mixing and integration) can be applied in the sphere in question. It is concluded that Big Data is a truly powerful tool for implementing environmental improvements in the construction business.

  4. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out whether they are suited for use by architects and designers or only by specialists and technicians - and, if not, to look at what can be done to make them more available to architects and design...

  5. EVALUATION TOOL OF CLIMATE POTENTIAL FOR VENTILATIVE COOLING

    DEFF Research Database (Denmark)

    Belleri, Annamaria; Psomas, Theofanis Ch.; Heiselberg, Per Kvols

    2015-01-01

    Within the IEA Annex 62 project, national experts are developing a climate evaluation tool that assesses the potential of ventilative cooling while also taking into account building envelope thermal properties, internal gains and ventilation needs. The analysis is based on a single-zone thermal model applied to user-input climatic data on an hourly basis. The tool identifies the percentage of hours when natural ventilation can be exploited to assure the minimum air change rates required by state-of-the-art research, standards and regulations, and the percentage of hours when direct ventilative...
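
    A crude stand-in for the tool's hour-counting criterion, reduced to a single outdoor-temperature window; the thresholds below are invented, and the actual tool also weighs envelope properties, internal gains and required air change rates.

```python
import numpy as np

def ventilative_cooling_share(t_out, t_low=12.0, t_high=22.0):
    """Share of hours in an hourly outdoor-temperature series [deg C]
    where direct ventilative cooling looks usable, judged only by an
    illustrative temperature window."""
    t_out = np.asarray(t_out)
    usable = (t_out >= t_low) & (t_out <= t_high)
    return usable.mean()

# Synthetic year of hourly temperatures as a stand-in for climate data:
hours = np.random.default_rng(2).uniform(-5.0, 35.0, size=8760)
print(f"{ventilative_cooling_share(hours):.0%} of hours usable")
```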

  6. Condition monitoring and signature analysis techniques as applied to Madras Atomic Power Station (MAPS) [Paper No.: VIA-1]

    International Nuclear Information System (INIS)

    Rangarajan, V.; Suryanarayana, L.

    1981-01-01

    The technique of vibration signature analysis for identifying machine troubles in their early stages is explained. The advantage is that timely corrective action can be planned to avoid breakdowns and unplanned shutdowns. At the Madras Atomic Power Station (MAPS), this technique is applied to regularly monitor the vibrations of equipment and thus serves as a tool for corrective maintenance. Case studies of the application of this technique to main boiler feed pumps, moderator pump motors, a centrifugal chiller, ventilation system fans, thermal shield ventilation fans, filtered water pumps, emergency process sea water pumps, and antifriction bearings at MAPS are presented. Condition monitoring during commissioning and subsequent operation could indicate defects. The corrective actions which were taken are described. (M.G.B.)

  7. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the automated image analysis tools available for extracting quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  8. An Assessment Tool applied to Manure Management Systems using Innovative Technologies

    DEFF Research Database (Denmark)

    Sørensen, Claus Aage Grøn; Jacobsen, B.H.; Sommer, Sven Gjedde

    2003-01-01

    In Denmark, stringent new regulations, placing strict time limits on manure application and setting thresholds for nitrogen utilisation, have been imposed in order to increase the efficiency of uptake of plant nutrients from manure. An important factor in meeting these requirements is the use of operational and cost-effective animal manure handling technologies. An assessment tool covering the whole chain of the manure handling system from the animal houses to the field has been developed. The tool enables a system-oriented evaluation of labour demand, machinery capacity and costs related...

  9. Disclosure as a regulatory tool

    DEFF Research Database (Denmark)

    Sørensen, Karsten Engsig

    2006-01-01

    The chapter analyses how disclosure can be used as a regulatory tool, and how it has been applied so far in the areas of financial market law and consumer law.

  10. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
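
    A toy version of the pick-list-driven recommendation described: a library keyed by a component's function and output type returns candidate (failure mode, cause, effect) triples. The keys and entries below are hypothetical, not the spaceflight-heritage library the abstract mentions.

```python
# Hypothetical failure-mode library keyed by (function, output type).
FMEA_LIBRARY = {
    ("provide_flow", "fluid"): [
        ("fails to open", "stuck actuator", "loss of downstream flow"),
        ("external leak", "seal degradation", "loss of fluid inventory"),
    ],
    ("provide_power", "electrical"): [
        ("fails off", "open circuit", "loss of powered function"),
    ],
}

def recommend(function, output_type):
    """Return candidate (mode, cause, effect) triples for the user's picks."""
    return FMEA_LIBRARY.get((function, output_type), [])

for mode, cause, effect in recommend("provide_flow", "fluid"):
    print(f"{mode} | {cause} | {effect}")
```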

  11. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  12. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 to 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports to this joint volume on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples from the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP)

  13. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented as an extension of latent semantic indexing ... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for the visualization of search engine results.
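
    For concreteness, a minimal sketch of ICA-based unsupervised document classification in the spirit of the thesis, using scikit-learn's FastICA on a stand-in term-document matrix (the library postdates the thesis, so this is an analogy rather than its code).

```python
import numpy as np
from sklearn.decomposition import FastICA

# Rows: documents (e.g. chat messages), columns: term counts. In the
# thesis setting the term-document matrix would come from real text;
# a random stand-in keeps the sketch self-contained.
rng = np.random.default_rng(1)
X = rng.poisson(1.0, size=(200, 300)).astype(float)

ica = FastICA(n_components=5, random_state=0)
doc_codes = ica.fit_transform(X)   # independent component codes per document
topic_profiles = ica.components_   # unmixing rows, read as term profiles

# Each document is then softly classified by its dominant component --
# the unsupervised-classification step the abstract describes.
labels = np.abs(doc_codes).argmax(axis=1)
```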

  14. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next, once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design phase of the product development process.

  15. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened the way for new features in existing tools and for the development of new tools for specific applications, such as thermodynamic and economic optimization, prediction of remaining component lifetime, and fault diagnostics, resulting in improvements in plant performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts generated by heat and mass balance programs can be accomplished using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed with tools such as condition monitoring systems and artificial neural networks. The growing number of tools, with their varied construction and application areas, makes the choice of the most adequate tool for a given application difficult. In this thesis, the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. The thesis also covers programming techniques and calculation methods for part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author

  16. Dimensional Analysis with space discrimination applied to Fickian diffusion phenomena

    International Nuclear Information System (INIS)

    Diaz Sanchidrian, C.; Castans, M.

    1989-01-01

    Dimensional Analysis with space discrimination is applied to Fickian diffusion phenomena in order to transform the governing partial differential equations into ordinary ones, and also to obtain Fick's second law in dimensionless form. (Author)
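
    For reference, the standard transformation alluded to: scaling space and time by discriminated reference quantities removes the diffusivity from Fick's second law.

```latex
% Fick's second law and its dimensionless form; x_0 is a discriminated
% reference length and t_0 = x_0^2 / D the induced time scale.
\[
\frac{\partial c}{\partial t} = D \, \frac{\partial^2 c}{\partial x^2},
\qquad
\xi = \frac{x}{x_0}, \quad \tau = \frac{D\,t}{x_0^2}
\;\Longrightarrow\;
\frac{\partial c}{\partial \tau} = \frac{\partial^2 c}{\partial \xi^2}.
\]
```

    Similarity variables of the form η = x/√(Dt) then collapse the PDE to an ordinary differential equation in η, which is the reduction to ordinary equations the abstract mentions.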

  17. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI) technology. The analysis was performed based on a questionnaire enquiring whether the information needs of executives were met during the process. A theoretical framework consisting of information architecture and BI technology was applied, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP) tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.

  18. Environmental management systems tools applied to the nuclear fuel center of IPEN

    International Nuclear Information System (INIS)

    Mattos, Luis A. Terribile de; Meldonian, Nelson Leon; Madi Filho, Tufic

    2013-01-01

    This work aims to identify and classify the major environmental aspects and impacts related to the operation of the Nuclear Fuel Center of IPEN (CCN), through a systematic data survey using interview questions and consultation of licensing documents and operational records. First, the facility's processes and activities, and the interactions between these processes, were identified. Then, an analysis of potential failures and their probable causes was conducted to establish the significance of the environmental aspects, as well as the operational controls necessary to ensure the prevention of impacts on the environment. The results obtained so far demonstrate the validity of this study as a tool for the identification of environmental aspects and impacts of nuclear facilities in general, as a way of achieving compliance with the ISO 14001:2004 standard. Moreover, it can serve as an auxiliary method for resolving issues related to compliance with the applicable regulatory and legal requirements of the National Nuclear Energy Commission (CNEN) and the Brazilian Institute of Environment (IBAMA). (author)

  20. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. However, a crucial missing step in moving from "data-to-knowledge-to-innovation" is our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in postgenomics Big Data domains such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface, and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cell database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and in discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm and its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable tool for postgenomics data analysis and interpretation, translating data into knowledge-based innovation in Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, and vaccinomics.
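
    GeneAnalytics' own scoring algorithms are proprietary, but the statistic underlying most gene set enrichment tools is the hypergeometric over-representation test; a generic sketch with made-up numbers:

```python
from scipy.stats import hypergeom

def enrichment_p(hits, query_size, set_size, universe):
    """P-value for observing >= `hits` genes from a set in the query by chance."""
    # Survival function at hits - 1 gives P(X >= hits).
    return hypergeom.sf(hits - 1, universe, set_size, query_size)

# E.g., 12 of 200 differentially expressed genes fall in a 150-gene pathway,
# against a background of ~20,000 genes.
p = enrichment_p(hits=12, query_size=200, set_size=150, universe=20000)
print(f"Enrichment p-value: {p:.2e}")
```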